Education Research:
Conducting Research and Education Research on the Web


This section on Education Research includes supporting pages of resources associated with State and National Standards.

Research Resources (Page 1 of 2) (you are here) includes:

Research Resources (Page 2 of 2) includes summaries of selected research and resources related to:

Get Adobe Acrobat Reader, free software for PDF files, which appear on this page.


Scientifically Based Research and Beyond

 

Learn more about education and research.

The Glossary of Key Terms in Educational Research by Abdullah Noori (2021, February 23) will "help novice researchers in understanding basic research terminologies in educational research. It provides definitions of many of the terms used in the guidebooks to conducting qualitative, quantitative, and mixed methods of research" (p. 2). There are over 50 pages of terms.

Mathematica "studies public and private efforts to improve the quality of education and competitiveness of our workforce. [Its] work helps advance the science of education research and further discourse about education reform."  Several areas of expertise are highlighted in this focus area, including college and career readiness, effective data use, literacy and numeracy, post secondary education, school choice and charters, school reform, special education, STEM, strengthening and disseminating research, teacher and principal effectiveness, teacher and principal preparation and support.

 

NCLB and the What Works Clearinghouse

During the era of No Child Left Behind (NCLB), "To access much of the federal funding allocated through NCLB, states and districts [were] required to adopt programs and policies that [were] supported by scientifically based research." Teachers were expected "to adapt their practice to reflect the competencies necessary to implement the new programs" (Laitsch, 2003, para. 2). North Central Regional Educational Laboratory (2004) discussed and provided examples of the six components of scientifically based research (SBR). SBR must:

  1. Use empirical methods.
  2. Involve rigorous and adequate data analyses.
  3. Rely on measurements or observational methods that provide reliable and valid data.
  4. Use either an experimental or quasi-experimental design.
  5. Allow for replicability.
  6. Undergo expert scrutiny. (pp. 3-4)

The U.S. Department of Education set up the What Works Clearinghouse (WWC) in 2002 to provide easily searchable databases containing such scientific evidence.  During its first year, the WWC focused on seven topics: interventions for beginning reading; curriculum-based interventions for increasing K-12 math achievement; preventing high school dropout; increasing adult literacy; peer-assisted learning in elementary schools for reading, mathematics, and science gains; interventions to reduce delinquent, disorderly, and violent behavior in middle and high schools; and interventions for elementary school English language learners.

The WWC defined an educational intervention as a product, practice, policy, or program that has been shown to be effective.  "An intervention is effective if it improves outcomes relative to what would have been seen without the intervention" (WWC Glossary).  Note that the WWC Glossary is a beneficial resource for consumers to better understand terms used in education research.

In Determining 'What Works,' Therese Mageau (2004) of THE Journal and Dr. Grover 'Russ' Whitehurst, then director of the Institute of Education Sciences, discussed the question of effectiveness. When the WWC was started, it gave preference to studies involving randomized trials because they are the gold standard for determining effectiveness. Whitehurst commented, however, that "there are two prongs to evidence-based practice" (p. 34). Something that may work in one location might not work in another. Therefore, evidence-based practice includes the scientific studies of effectiveness as found on the WWC, and the integration of professional wisdom based on "locally collected performance data that indicates whether changes are occurring in the desired direction when a particular program or practice is implemented" (p. 34).

In June 2010, the WWC expanded its acceptable research standards to include non-experimental single-case design and regression-discontinuity studies, documentation of which can be found at the WWC website.  The WWC Procedures and Standards Handbook Version 5.0, released in 2022, is now available for reviewing studies.  The document includes the major technical and procedural changes between version 5.0 and the prior version 4.1 of the handbook.

ESSA and Beyond

Fortunately, the Every Student Succeeds Act (ESSA) of 2015 recognizes other forms of research. This law, which replaced NCLB, "requires that states and districts use evidence-based interventions to support school improvement" (Dynarski, 2015, para. 1; Slavin, 2015; 114th Congress, 2015). ESSA defines four categories. Evidence-based per ESSA means "an activity, strategy, or intervention" that "demonstrates a statistically significant effect on improving student outcomes or other relevant outcomes" based on 1. strong, 2. moderate, or 3. promising evidence. Strong evidence comes from "at least 1 well-designed and well-implemented experimental study; moderate evidence from at least 1 well-designed and well-implemented quasi-experimental study; or promising evidence from at least 1 well-designed and well-implemented correlational study with statistical controls for selection bias." Evidence-based can also mean 4. a program that "demonstrates a rationale based on high quality research findings or positive evaluation that such activity, strategy, or intervention is likely to improve student outcomes or other relevant outcomes; and includes ongoing efforts to examine the effects of such activity, strategy, or intervention" (114th Congress, 2015, p. S.1177-290).
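For readers who think in code, the four tiers boil down to a simple classification rule: the tier an intervention can claim depends on the strongest well-designed, well-implemented study behind it. The sketch below is illustrative only; the labels are paraphrased from the statute quoted above and are not part of ESSA or of any official tool.

    # Illustrative sketch: ESSA evidence tiers keyed by the strongest qualifying
    # study design. Labels are paraphrased, not official terminology.
    ESSA_TIERS = {
        "experimental (randomized) study": "Tier 1: strong evidence",
        "quasi-experimental study": "Tier 2: moderate evidence",
        "correlational study with statistical controls": "Tier 3: promising evidence",
        "rationale (logic model) with ongoing evaluation": "Tier 4: demonstrates a rationale",
    }

    def essa_tier(strongest_design: str) -> str:
        """Return the ESSA tier supported by the strongest available study design."""
        return ESSA_TIERS.get(strongest_design, "does not meet an ESSA evidence tier")

    print(essa_tier("quasi-experimental study"))  # Tier 2: moderate evidence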

In order to take advantage of the results of studies and employ their professional wisdom, educators need to read and analyze professional journals and documents from research organizations that report on study results. According to Simpson, LaCava, and Graner (2004), educators should be able to clearly identify the following:

In 2017 the Center for Research and Reform in Education (CRRE) at Johns Hopkins University developed Evidence for ESSA. This free website is designed to help educators make informed decisions when selecting K-12 programs (e.g., in math and reading) that meet the strong, moderate, and promising evidence criteria per ESSA. "It identifies for you the level of evidence under ESSA that is associated with a given program, provides you with a snapshot of what the program looks like and costs, identifies the grades, communities, and children included in the program’s evaluations, and points you in the direction of more information about the program, its evaluations, and implementation" (About section). Per Robert Slavin (2017) of the CRRE, the developers of this site view it as a supplement to the WWC, not a competitor, and note that the WWC was not designed to serve evidence standards per ESSA. For example, the WWC does not provide the "promising" category (Slavin, 2017).

"Unfortunately, "[e]vidence-based practices are not being widely adopted in classrooms," according to David Andrews (2012, p. 5) of Johns Hopkins University School of Education. There's much to consider.  "When deciding whether to adopt an evidence-based approach, educators should weigh the costs and benefits, and be prepared to implement it with fidelity. ... Fidelity means implementing the approach exactly as it was developed and tested" (p. 4), which is not an easy task.  Andrews provided some key questions to consider when conducting a cost-benefit analysis, as costs are both direct and indirect.  Sustainability of the innovation is also an issue, as enthusiasm for the innovation must be maintained.

Mark Dynarski (2015) elaborated on a two-stage method for Using research to improve education under the Every Student Succeeds Act, which takes into consideration how education programs and practices that have been shown to be effective can be implemented in local contexts.

Educators might require professional development on conducting research and implementing new curricula. CT4ME's Education Research section will give you a good start on this topic, as well as on how to conduct your own action research.

 

HOT NEWS

On July 11, 2007, it was announced that the U.S. Department of Education awarded Mathematica Policy Research a five-year contract to take over management of the What Works Clearinghouse (Viadero, 2007). The WWC got off to a rocky start in identifying high-quality research that met its standards. As of Viadero's report, the WWC web site listed just "74 reviews of research on reading instruction, dropout prevention, teaching English-language learners, and other topics" (para. 5). In 2007, it was hoped that this change would make the WWC more responsive and relevant to the needs of educators, as there was a call for "expanding the range of research designs that qualify as sound evidence, and introducing practical guides and other products that educators and decision makers in the trenches might see as useful" (Viadero, 2007, para. 9).

Indeed the WWC has become valuable to educators.  Since 2007, Mathematica (formerly known as Mathematica Policy Research) has played a central role in administering the WWC, "including expanding the resources available to educators, revising standards and authoring new standards for reviewing education research, and exploring an array of topics that are at the forefront of education decision making" (Mathematica, 2020,  Project Overview section).

Even better news--

In August 2013, the Institute of Education Sciences and the National Science Foundation released their joint report: Common Guidelines for Education Research and Development. The document describes "shared understandings of the roles of various types or "genres" of research in generating evidence about strategies and interventions for increasing student learning. ... [and] describes the agencies’ expectations for the purpose of each type of research, the empirical and/or theoretical justifications for different types of studies, types of project outcomes, and quality of evidence" (p. 7).  Six types of research are examined:

Foundational research "provides the fundamental knowledge that may contribute to improved learning and other relevant education outcomes."

Early-stage or exploratory research "examines relationships among important constructs in education and learning to establish logical connections that may form the basis for future interventions or strategies to improve education outcomes. These connections are usually correlational rather than causal."

Design and development research "develops solutions to achieve a goal related to education or learning, such as improving student engagement or mastery of a set of skills." Research projects may include pilot tests.

Efficacy research "allows for testing of a strategy or intervention under "ideal" circumstances, including with a higher level of support or developer involvement than would be the case under normal circumstances."

Effectiveness research "examines effectiveness of a strategy or intervention under circumstances that would typically prevail in the target context."

Scale-up research "examines effectiveness in a wide range of populations, contexts, and circumstances, without substantial developer involvement in implementation or evaluation." (pp. 8-9)

 

Back to top

 

Efforts to Determine Product Effectiveness

 

Evidence for ESSA: Evidence Based Math Programs includes several math programs meeting ESSA evidence requirements.  As of 2024, 41 programs are listed, 21 of which have a strong evidence rating.

Digital Promise (2015) developed Evaluating Studies of Ed-Tech Products, a tool to "help district leaders evaluate studies on ed-tech product effectiveness in order to decide whether it is necessary to run a pilot" (Purpose section).  There are 12 questions within four sections: "product information, study relevancy, study source, and study design" (Instructions section).  

What Works Clearinghouse (WWC) includes intervention reports on multiple math programs. Evidence ratings include Positive or Potentially Positive, Mixed Effects/No Discernible Effects, Not Applicable, Negative or Potentially Negative, and No Evidence. The following are illustrations of those:

 

HOT NEWS


On February 13, 2004, the U.S. Department of Education announced a study on technology's role in raising achievement, which would be of interest to mathematics educators. The study used a random-assignment design to evaluate 16 computer-based reading and math products over three years to determine the effectiveness of technology in bolstering student achievement. Mathematica Policy Research, Inc. and SRI International were involved in the study to assess the effectiveness of learning technology in teaching reading in grade 1, reading comprehension in grade 4, pre-algebra in grade 6, and algebra in grade 9.

On April 4, 2007, the U.S. Department of Education released its report for Congress, Effectiveness of Reading and Mathematics Software Products: Findings from the First Student Cohort (Dynarski, et al., 2007).

A key finding indicated, "Test scores were not significantly higher in classrooms using the reading and mathematics software products than those in control classrooms. In each of the four groups of products - reading in first grade and in fourth grade, mathematics in sixth grade, and high school algebra - the evaluation found no significant differences in student achievement between the classrooms that used the technology products and classrooms that did not."

The sixth grade math products included "Larson Pre-Algebra (published by Houghton-Mifflin), Achieve Now (published by Plato), and iLearn Math (published by iLearn)."  The algebra products included "Cognitive Tutor Algebra (published by Carnegie Learning), Plato Algebra (published by Plato), and Larson Algebra (published by Houghton-Mifflin)."

Read this full report: https://ies.ed.gov/ncee/pdf/20074005.pdf

The study received immediate reaction from leaders around the country concerned about the effectiveness of technology in education and the results of this study. However, readers are cautioned about making quick decisions about technology effectiveness based on those results. Dynarski and colleagues (2007) pointed out that the study "was not designed to assess the effectiveness of educational technology across its entire spectrum of uses, and the study’s findings do not support conclusions about technology’s effectiveness beyond the study’s context, such as in other subject areas" (p. xiv).

Read one such reaction: Guest Editorial: More Questions than Answers: Responding to the Reading and Mathematics Software Effectiveness Study (Fitzer, Freidhoff, Fritzen, et al., 2007).

Also see Effectiveness of Reading and Mathematics Software Products: Findings from Two Student Cohorts (February 2009): https://ies.ed.gov/ncee/pubs/20094041/.  According to this follow-up to the 2007 study, "The first-year report presented average effects of four groups of products on student test scores, which supported assessing whether products were effective in general. School districts and educators purchase individual products, however, and knowing whether individual products are effective is important for making decisions supported by evidence. This report presents findings on the effects of 10 products on student test scores." (Executive Summary section)

 

 

Back to top

 

Reading and Conducting Research

 

Those who are not familiar with reading and conducting research studies will appreciate a few tips and documents on this topic. Commentary on and additional resources for conducting Action Research are also presented.

Reading Research Studies

A Policymaker's Primer on Education Research: How to Understand It, Evaluate It, and Use It (February 2004) is by Patricia Lauer at Mid-continent Research for Education and Learning.  The Primer is intended to help readers understand what education research says, whether it's trustworthy and what it means for policy.  Readers will also learn some of the technical statistical and scientific concepts touched upon in research reports and gain a deeper understanding of education research methodology.  Practical tools, including a flowchart for analyzing research and an understanding statistics tutorial, are included.

How Can You Tell When the Findings of a Meta-Analysis are Likely to be Valid (Blog Post, 2020, October 15) by Robert Slavin contains tips on how to quickly evaluate a meta-analysis in education.  The first of those is to look at the overall effect size.  If it is more than about +.40, you probably don't need to read further.  "A very large effect size is almost a guarantee that a meta-analysis is full of studies with design features that greatly inflate effect sizes, not studies with outstandingly effective treatments" (para. 4).  The Methods section should list the criteria for the selection of the studies used in the analysis.  Slavin noted five of those, which are the main ones used at Evidence for ESSA (see the full 2023 list in its Standards and Procedures: Inclusion Criteria for Studies).
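The effect sizes Slavin refers to are standardized mean differences averaged across studies. As a point of reference, here is a minimal sketch, with hypothetical numbers, of how a single study's effect size can be computed as Cohen's d using a pooled standard deviation; Slavin's +0.40 rule of thumb is then applied to the overall average of such values in a meta-analysis.

    import math

    def cohens_d(mean_treat, mean_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
        """Standardized mean difference (Cohen's d) with a pooled standard deviation."""
        pooled_sd = math.sqrt(
            ((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2)
            / (n_treat + n_ctrl - 2)
        )
        return (mean_treat - mean_ctrl) / pooled_sd

    # Hypothetical study: treatment group averaged 512 vs. 500 for controls on a
    # test with standard deviations near 50 in both groups.
    d = cohens_d(512, 500, 50, 49, n_treat=120, n_ctrl=118)
    print(round(d, 2))   # about 0.24
    print(d > 0.40)      # False: not in the "too good to be true" range Slavin flags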

How to Read a Research Study: The R3I Method by Carrie Conaway (2020, December 11).  In her blog post at the Center for Research Use in Education, Conaway describes her R3I method: "relevance, inference, impact, and importance."

How to Read a Research Study Article, posted at the College of DuPage Library, provides advice on how to read a research study article. A research article generally is structured with an Abstract, Introduction, Methods, Results, Discussion, and References. Each section has specific content, which the guide summarizes. It will take more than one reading to fully understand the bulk of the research, but the general approach is to read the abstract, the first paragraph or so of the introduction and the hypothesis, then skip to the discussion to find how the study turned out. Go back to read the methods section, focusing on how the hypotheses were tested, then the results, and re-read the discussion section. Finally, read the entire report again from first page to last for greater understanding.

Identifying and Implementing Educational Practices Supported by Rigorous Evidence: A User Friendly Guide (December 2003) from The U.S. Department of Education will help educators determine whether an educational intervention is supported by rigorous evidence.  It contains a three-step evaluation process, a checklist to use in the process, definitions of research terms, and what to look for in research studies.

What Works Clearinghouse Glossary is intended to help users to understand the terms used in education research so that they can be more effective consumers of such research.

Statistics in Research

The following online texts are for anyone who really wants to understand the statistics involved in research:

HyperStat Online Statistics Textbook by D. Lane also contains links to free statistical analysis tools and instructional demos.

Introductory Statistics: Concepts, Models, and Applications by D. W. Stockburger.  This Web Edition also includes many examples showing how to use SPSS to do statistical procedures.

Introductory Statistics is a free text from OpenStax, operating out of Rice University.  This one-semester course is mainly for students who are not majoring in math or engineering.  "It focuses on the interpretation of statistical results, especially in real world settings."

Statistics Every Writer Should Know by Robert Niles.  Learn about mean, median, percent, per capita, standard deviation, margin of error, data analysis, and more.  Link to sites for data sources and interactive help to select the right statistical test.

SticiGui by P. B. Stark is an online introductory statistics text.  According to Stark, materials "include interactive data analysis and demonstrations, machine-graded online assignments and exams (a different version for every student), and a text with dynamic examples and exercises, applets illustrating key concepts, and an extensive glossary."

Visual ANOVA, posted at Wolfram Demonstrations, is a tool meant to direct your attention to relationships among the components of ANOVA (Analysis of Variance) by representing them visually.  It is designed to help you understand the F test.
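To complement the visual demonstration, here is a minimal one-way ANOVA sketch using SciPy with made-up scores for three instructional groups; the F test asks whether variability between the group means is large relative to variability within the groups.

    from scipy import stats

    # Hypothetical test scores for three instructional groups.
    group_a = [78, 82, 75, 90, 85, 79]
    group_b = [70, 74, 68, 72, 77, 71]
    group_c = [88, 84, 91, 86, 89, 83]

    # One-way ANOVA: is at least one group mean different from the others?
    f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    # A small p-value (e.g., < .05) suggests the group means are not all equal.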

Conducting Research Studies

In general, research studies involve qualitative, quantitative, or mixed methodologies. Research is conducted to solve problems or contribute to their solutions. It involves identifying and justifying an issue you wish to investigate, which is framed as a problem statement. To better understand the problem, you gather and align resources (e.g., people, policies, data, and what others have done in regard to the problem) and information to study the issue (a review of published literature on the problem). A plan (methodology) is formed to conduct the research, and then it is carried out. The resulting data are analyzed to develop a solution. Recommendations are made for creating an action plan to implement ideas and results of the research. Finally, results and recommendations are released, perhaps via a presentation or published document.

Citations are generally included in research. If web resources are cited, over time those resources and links to them might no longer be available or might have changed. To address this issue, those who create scholarly works might consider using Perma.cc. This free tool, created at the Harvard Library Innovation Lab of Harvard Law School Library, is used to archive pages from original links. When you give Perma.cc the URL of the page you wish to preserve, the software visits the site and creates a unique URL that you add to your citation, thus ensuring readers can access content from the original source cited.

Guidelines for Conducting and Reporting EdTech Impact Research in U.S. K-12 Schools (Newman, Jaciw, & Lazarev, 2017) is freely available. The report elaborates on 16 guidelines of research best practices for publishers, developers, and service providers of K-12 edtech products and reflects changes brought about by passage of the Every Student Succeeds Act. The guidelines are organized into four clusters (i.e., Getting Started, Designing, Implementing, and Reporting) and provide a perspective to consider if you are planning a research study. They can also benefit K-12 educators and the research community involved with selecting and implementing technology-based products and services. The glossary of research terms is particularly beneficial to all.

 

Do you need to evaluate the effectiveness of educational technology products in your school?

Although a product might appear to work in one district, it might not be suitable for use in another. Be wary of ed tech company claims of the effectiveness of a product based on research, as the research might have been "shoddy" (Mathewson & Butrymowicz, 2020). Districts are encouraged to conduct their own research. Mathewson and Butrymowicz listed the following organizations that have begun to assist districts in this endeavor:

The U.S. Department of Education, Office of Educational Technology, in partnership with Mathematica, developed the Evidence to Insights Coach (e2i Coach, formerly ED TECH RCE Coach). This web-based interactive tool is free. It guides administrators through the process of conducting rapid-cycle evaluation research with a goal of providing evidence of effectiveness of a particular technology for continued use in the school or district, or for its best implementation.

Per Mathematica (2015-2018), e2i Coach guides users to follow five steps: get started, plan the research, prepare the data, analyze the data, and summarize the findings. The tool recommends the approach to evaluate technology, helps in determining the research question, includes steps for identifying data sources and creating a clean data file, analyzes the data and delivers results, and compiles results into a document that can be shared.

Are you planning to conduct research involving students?

Whether you are a classroom teacher or outside researcher, you'll get some guidance in PROOF POINTS: Why parental consent often isn't required in education research, which is a Hechinger Report by Jill Barshay (2021).  Per research exemptions noted by the Legal Information Institute at Cornell Law School, Barshay wrote:

"The main federal rule on protecting humans during experiments generally requires their consent or a parent’s consent when the human is under 18 but there is a major exception, written expressly for education.  Parental consent isn’t needed when the researcher is studying “normal” educational practices in a classroom that are “not likely to adversely impact students’ opportunity to learn.”  That includes “most research on regular and special education instructional strategies,” from evaluating the effectiveness of one approach to comparing different approaches with each other." (para. 2)

Further, institutional review boards generally follow the federal rules, even if the research is not federally funded. "What frequently triggers parental consent is not the classroom intervention itself, but the tools used to evaluate its effectiveness" (Barshay, 2021, para. 9). Parental consent would be needed for tools such as an extra test the researcher wants to introduce, surveys completed by students, student interviews, and video and audio recordings or photographs of classroom activities during a study.

 

American Evaluation Association identifies online resources of interest to anyone doing evaluations.  Their collection includes online multi-chapter handbooks and texts on various methodologies, software links for qualitative data analysis and developing and administering surveys, and more.

Brief Guide to Questionnaire Development by Robert Frary (2000, Office of Measurement and Research Service, Virginia Polytechnic Institute) is useful for collecting factual information and opinions in survey research.

Educational Research Basics by Del Siegle at University of Connecticut introduces several topics to beginning researchers: types of research, ethics and informed consent, single subject research, qualitative research, content analysis, historical research, action research, experimental research, statistical tests (e.g., correlations, t-tests, ANOVA, regression, Chi-Square) and much more.

International Journal of Qualitative Methods (ISSN: 1609-4069) is a peer-reviewed open-access journal of the International Institute for Qualitative Methodology at the University of Alberta, Canada, and its international affiliates. The journal "publishes papers that report methodological advances, innovations, and insights in qualitative or mixed methods studies; it also publishes funded full studies using qualitative or mixed-methods" (Home page description).

Multidisciplinary methods in educational technology research and development (2008), by Justus Randolph, is an open-source book published by HAMK Press in Hämeenlinna, Finland.

Research Methods Knowledge Base is a fully hyperlinked online text by Dr. William M.K. Trochim of Cornell University.  "It covers the entire research process including: formulating research questions; sampling (probability and nonprobability); measurement (surveys, scaling, qualitative, unobtrusive); research design (experimental and quasi-experimental); data analysis; and, writing the research paper.  It also addresses the major theoretical and philosophical underpinnings of research including: the idea of validity in research; reliability of measures; and ethics."

Research Randomizer, part of the Social Psychology Network, is a free "quick way to generate random numbers or assign participants to experimental conditions. Research Randomizer can be used in a wide variety of situations, including psychology experiments, medical trials, and survey research." This is a great tool for random sampling and random assignment, and tutorials are included.
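For those who prefer a script to the web tool, the sketch below (using only Python's standard library, not the Research Randomizer service itself) randomly assigns a hypothetical roster to treatment and control conditions.

    import random

    # Hypothetical roster; in practice, load your own student IDs.
    participants = ["S01", "S02", "S03", "S04", "S05", "S06", "S07", "S08"]

    random.seed(42)             # fixed seed so the assignment can be reproduced
    shuffled = participants[:]  # copy so the original roster is untouched
    random.shuffle(shuffled)

    half = len(shuffled) // 2
    print("Treatment:", sorted(shuffled[:half]))
    print("Control:  ", sorted(shuffled[half:]))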

The Survey System from Creative Research Systems, developers of survey software, includes a chapter "intended primarily for those who are new to survey research. It discusses options and provides suggestions on how to design and conduct a successful survey project."  It includes an overview of seven steps in designing a survey project.  Of particular value are the Research Aids: a sample size calculator, sample size formula, and discussions of significance and correlation.
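As an illustration of what such a sample size calculator does, here is a minimal sketch of the standard formula for estimating a proportion, assuming a 95% confidence level (z of about 1.96), maximum variability (p = 0.5), and a chosen margin of error; it is the textbook formula, not Creative Research Systems' own code.

    import math

    def sample_size(margin_of_error, z=1.96, p=0.5, population=None):
        """Sample size for estimating a proportion; optional finite-population correction."""
        n = (z**2 * p * (1 - p)) / margin_of_error**2
        if population:  # adjust downward when sampling from a small, known population
            n = n / (1 + (n - 1) / population)
        return math.ceil(n)

    print(sample_size(0.05))                   # 385 for a large population, +/- 5%
    print(sample_size(0.05, population=2000))  # 323 after the finite-population correction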

Web Pages that Perform Statistical Calculations is a compilation of resources.

Survey Tools

The following are among the many tools for conducting surveys:

 

Back to top

 

Action Research

Action research "refers to a disciplined inquiry done by a teacher with the intent that the research will inform and change his or her practices in the future. This research is carried out within the context of the teacher’s environment—that is, with the students and at the school in which the teacher works—on questions that deal with educational matters at hand" (Ferrance, 2000, p. 7).  While it is often done by individual teachers, the research might also involve collaborative groups of teachers, or be school-wide or district-wide.  Any number of concerns might be addressed.  However, the overriding goal is to make things better and to improve the quality of educational outcomes.  For example, "[t]he teacher may be seeking solutions to problems of classroom management, instructional strategies, use of materials, or student learning" (p. 9).  Collectively, "a school may have a concern about the lack of parental involvement in activities, and is looking for a way to reach more parents to involve them in meaningful ways" or staff might wish "to examine their state test scores to identify areas that need improvement, and then determine a plan of action to improve student performance" (p. 10).

According to John Tillotson (2000), a reason why teachers lack interest in educational research is that many of the topics chosen for study seldom have direct implications for what happens in the classroom.  "One doesn't have to look far in most public schools today to find outdated teaching practices and assessment strategies, in spite of ample research findings that suggest more effective alternatives" (p. 31, par. 3).  He suggested that an expansion of action research at the K-12 level is a promising solution to the dilemma of research failing to inform practice.

Perhaps one of the reasons why action research might not be conducted is that it "has often been stereo-typed as the poor second cousin to "real" research, which is distinguished by quantitative methods and experimental designs" (Reeves, 2010, p. 5).  Yet, Douglas Reeves (2010) stated, "Experimental research and action research are not in opposition to one another; the latter applies the former to the real world of the classroom" (p. 5).

Action research, in this author's view, would satisfy a mandate for educators to employ research-based instructional materials and methodologies in their instruction that get results. It fits the exercise of using a district's professional wisdom based on "locally collected performance data that indicates whether changes are occurring in the desired direction when a particular program or practice is implemented" (Mageau, 2004, p. 34). However, action research serves to solve a local problem (e.g., How can we make an effective change that would be beneficial here?) or provides a framework for exploration, and would not necessarily be appropriate if a researcher wants to ultimately generalize the results from a question under investigation to other settings. One also needs to keep in mind that research questions, goals, or purposes can evolve during action research.

Although action research is “usually qualitative and participatory” (Dick, 1997, para. 5), Bob Dick (1998a) acknowledges that “action research is not synonymous with ‘qualitative’ either” (para. 7). Essentially, the action research cycle involves problem formulation, data collection, data analysis, reporting results, and action planning (Tillotson, 2000). An action research study involves many cycles, and the theories developed are intended to guide actions, such as: "In situation S, to produce outcomes O1, O2, ..., try actions A1, A2, ..." Further, "[i]n specifying the important features of the situation it also allows its generalizability to be tested in other similar settings" (Dick, 1998b, para. 7).

Sometimes those who conduct action research have not employed its methodology correctly, so the researcher's plan cannot be replicated by others. Poor planning might lead to disappointing results. While action research "is not a substitute for quantitative research," it is a "contextual lens for other research." As the teacher is the researcher, limitations of action research include teacher bias and a "good deal of opinion and rhetoric" (Reeves, 2010, p. 75). If the method is employed, Reeves' model of action research, which was tested with 81 teams of teachers and administrators, might be considered. There are four elements (pp. 80-81):

  1. There should be a research question that links professional practice to student results.  (e.g., Reeves (2010, p. 80) suggested, "How will using interactive journals influence the writing performance of second-language students in a 7th grade math class?")
  2. The student population should be described (e.g., grade level, special characteristics, demographics and educational factors of participating students).
  3. Student achievement data should be collected (e.g., year-end tests, formative assessments, classroom observations, etc. to allow for systematic observations of changes in achievement).  Effectiveness is enhanced if there are several measurements throughout the project.
  4. The research plan should also contain specific professional practices to be observed (e.g., feedback intervals compared to number of questions posed on tests for a particular instructional strategy).  Reeves (2010) stated, "Ideally, the action researchers will create a scoring rubric with a range of performance over three or four different levels so that an objective observation can be made about the extent to which a particular practice was applied in the classroom" (p. 81).

The following resources provide more information on the process and examples from completed studies:

 

You don't need perfect research to support a proposed change!

Douglas Reeves (2006) reminded us, "There are hardly any true randomly assigned groups in educational research, largely due to ethical constraints" (p. 97). "The quality model that prevails throughout successful organizations is not waiting for perfection but rather 'Try it, test it, improve it'" (p. 98).

Don't forget to teach your students how to conduct research!

In Researching in a Digital World: How do I teach my students to conduct quality online research?, Erik Palmer (2015) presented a step-by-step guide to teach learners at all grade levels how to conduct more responsible research in an internet environment.

Research Process Tutorials from the New York City School Library System is one part of a collection of resources on information literacy.  Tutorials include such topics as getting started with research, choosing a topic, primary and secondary sources, identifying keywords and phrases, finding background information, evaluating information, scholarly articles, search engines vs. databases, and more.

Introduction to Research in the Classroom focuses on mathematics research and the Making Mathematics project from the Education Development Center, which was for grades 7-12.  Related to this is an abridged version of the longer FAQ list for Making Mathematics.

Read Patricia Deubel's article, Conducting Research-based Projects in Elementary Grades with Safety in Mind, published July 26, 2017 in THE Journal.

For additional resources, see our Technology Integration Resources: Building Internet Search and Citation Skills.

 

Back to top

 

Finding Education Research

 

Searching the Web with key phrases will yield some education research, and there are scholarly search engines and academic databases that you can use, which CT4ME provides on our technology integration page for Building Internet Search and Citation Skills. In addition to the resources below, journals are a good source for education research. See the list of Journals at CT4ME.

Michael Gaskell (2024) identified the following 3 Useful AI Research Tools for Educators and discussed features of each:

 

Best Evidence Encyclopedia (BEE) "is a free website created by the Johns Hopkins University School of Education’s Center for Research and Reform in Education (CRRE). It is intended to give educators and researchers fair and useful information about the strength of the evidence supporting a variety of programs available for students in grades K-12.  The BEE mostly consists of systematic meta-analyses of research on effective programs in reading, mathematics, writing, science, early childhood education, and other topics. It also contains articles on review methods and on issues such as special education policy and evidence-based reform" (About section).  Mathematics educators will be particularly interested in mathematics programs that have shown evidence of helping elementary, middle and high school learners succeed, and the effectiveness of technology applications for enhancing mathematics achievement.

Center for Improving Learning of Fractions, administered at University of Delaware, focuses on improving math instruction for elementary and middle school children who have problems with math concepts, specifically fractions.  Research has been made possible via a grant from the Institute of Education Sciences.

Center for the Study of Teaching and Policy at the University of Washington "investigates efforts to improve the quality of teaching and learning, the teacher workforce, and the systems of support for teachers’ work, in various contexts and at multiple levels of the K-12 educational system."

HOT!: Center for Research Use in Education is devoted to "Rethinking Research for Schools" (R4S).  It focuses on narrowing the gap between research and practice.  You'll find numerous "fact sheets, infographics, user guides, and more on research utilization for practitioners, researchers, and everyone in between," and a library of publications with peer-reviewed research.  There's a section on Evidence and ESSA and a blog.

Center on Instruction includes STEM resources.  It offers materials and resources on mathematics to build educators’ knowledge of instruction for students with low achievement in mathematics, improve professional development models for math teachers, and build teachers’ skills in monitoring student growth toward important math outcomes.

Community for Advancing Discovery Research in Education (CADRE) "is a network for STEM education researchers funded by the National Science Foundation's Discovery Research PreK-12 (DRK-12) program. Through in-person meetings, a web site, common interest groups, newsletters, and more, CADRE connects these researchers who are endeavoring to improve education in science, technology, engineering and mathematics in, and outside of, our schools.  CADRE helps DRK-12 researchers share their methods, findings, results, and products inside the research and development community and with the greater public" (About CADRE section).

Digital Commons Network includes "free, full-text scholarly articles from hundreds of universities and colleges worldwide. Curated by university librarians and their supporting institutions, the Network includes a growing collection of peer-reviewed journal articles, book chapters, dissertations, working papers, conference proceedings, and other original scholarly work."

Digital Promise Research Map is a web tool to find education research dating back to 2005 from nearly 100,000 journal articles in education and the learning sciences.  There are four sections: Map, Topics, Ask a Researcher, and Videos.  The Map section provides two different viewing formats (chord, network), each visually showing interconnections among topics. From Ask a Researcher, educators can "get trusted, research-based answers to questions about real education challenges," including questions on math teaching and learning.  Answers are provided by experts at Harvard University's Usable Knowledge Project.  The Video collection includes videos showing key concepts in action (e.g., Improving Math Learning with Technology).  Topics include the following:

Directory of Open Access Journals aims to be comprehensive (i.e., all subjects and languages) and covers free, full-text scientific and scholarly journals that use quality control to guarantee content. Quality control means that the journals "must exercise peer-review or editorial quality control to be included." This is a huge plus for researchers.

DREME: Development and Research in Early Math Education is an initiative of Stanford University.  Per its description, "The DREME Network was created in 2014 to advance the field of early mathematics research and improve young children’s opportunities to develop math skills. The Network focuses on math from birth through age eight years, with an emphasis on the preschool years. Network members and affiliates collaborate to conduct basic and applied research and develop innovative tools that address high-priority early math topics and inform and motivate other researchers, educators, policymakers and the public" (About section).

Education Policy Analysis Archives is a peer-reviewed online journal of education research.

Education Resources Information Center (ERIC) is sponsored by the U.S. Department of Education, Institute of Education Sciences.  ERIC's search can be restricted to peer-reviewed only or full text articles.  Search by descriptors such as mathematics instruction, mathematics achievement, mathematics education, academic achievement, teaching methods, program effectiveness, and more.  You can search by source, author, publication date, publication type, education level, and audience. There is a Thesaurus that has multiple subcategories and related mathematical terms.

Evidence for ESSA is a free website from the Center for Research and Reform in Education at Johns Hopkins University.  Its purpose is to provide educators with the most up-to-date and reliable information regarding K-12 programs (e.g., in math and reading) that meet the strong, moderate, and promising evidence criteria per the ESSA.

Institute of Education Sciences is the primary research arm of the United States Department of Education. IES brings rigorous and relevant research, evaluation and statistics to our nation's education system.

International Association for the Evaluation of Educational Achievement (IEA) is an independent, international cooperative of national research institutions and governmental research agencies. IEA has conducted more than 23 research studies of cross-national achievement since 1958. Examples include the Trends in Mathematics and Science Study (1995, 1999, 2003, 2007), the Progress in International Reading Literacy Study (2001, 2006), the TIMSS-R Video Study of Classroom Practices, information technology in education (SITES-M1, SITES-M2, SITES 2006), and the Teacher Education and Development Study in Mathematics (TEDS-M), initiated in 2005. Publications are available, as well as current studies (e.g., TIMSS Advanced 2008 and PIRLS 2011).

International TIMSS and PIRLS Study Center, located at Boston College, is the principal site for reporting on the Trends in Mathematics and Science studies and the Progress in International Reading Literacy studies.

Learning and Technology Library (formerly EdITLib, Education & Information Technology Library): LearnTechLib, sponsored by the Association for the Advancement of Computing in Education, is a digital library of "peer-reviewed research on the latest developments and applications in Learning and Technology." There are "100,000+ documents of published international journal articles, conference papers, e-books, and multimedia content from 200,000+ leading authors" (About, Content section).

Learning Policy Institute conducts research to improve education policy and practice.  You'll find reports on multiple topics, such as assessment, deeper learning, educator quality, social and emotional learning, teacher preparation and professional learning, whole child education and more.

MERLOT provides peer-reviewed online teaching and learning materials in numerous categories.  Education, and mathematics/statistics are among those.

National Assessment of Educational Progress (NAEP), "the Nation's Report Card," is "the only nationally representative and continuing assessment of what America's students know and can do in various subject areas. Assessments are conducted periodically in mathematics, reading, science, writing, the arts, civics, economics, geography, U.S. history and beginning in 2014, in Technology and Engineering Literacy" (About section). Representative samples of students from grades 4, 8, and 12 are selected for main assessments, but not all grades are assessed each time.

National Center for Education Research (NCER) is one of four centers within the Institute of Education Sciences from the U.S. Department of Education.  "NCER research programs address education programs, practices, and policies in reading and writing, mathematics and science education, teacher quality, education leadership, education policy and finance, cognition and student learning, high school reform, and postsecondary education" (About Us section).  For a list of the Research and Development Centers, see: https://ies.ed.gov/ncer/research/randdCenters.asp

National Center for Education Statistics is the primary federal agency for collecting and analyzing data related to education. Publications and annual reports are available.

National Council of Teachers of Mathematics: News (also Research and Advocacy) has summaries, clips, and briefs connecting math education research to the classroom.  You'll also find a list of books on research.

Rand Education and Labor, a division of the Rand Corporation, includes numerous research areas with in-depth content available, which you can access via a drop-down menu at the top of the home page. For example, core research areas include Education Technology and Personalized Learning, Social and Emotional Learning, K-12 Accountability and Assessments, and more.

Review of Educational Research is a journal that "publishes critical, integrative reviews of research literature bearing on education, including conceptualizations, interpretations, and syntheses of literature and scholarly work in a field broadly relevant to education and educational research" (About section).

Society for Research on Educational Effectiveness (SREE) advances and disseminates "research on the causal effects of education interventions, practices, programs, and policies."  SREE sponsors the Journal of Research on Educational Effectiveness.

What Works Clearinghouse: Easily find what works among the many topics addressed and within publications and reviews.

 

Are you interested in the "No Significant Difference Phenomenon"?

Readers might be interested in The No Significant Difference website, which is a companion to Thomas L. Russell's book, The No Significant Difference Phenomenon: A Comparative Research Annotated Bibliography on Technology for Distance Education (2001, IDECC, fifth edition). The book is a research bibliography documenting no significant difference in student outcomes based on the mode of education delivery (face to face or at a distance). Note: by distance, we mean instruction delivered via media such as radio, television, video, online and other technologies, which have been used historically and in current studies. In addition to studies that document no significant difference, the website includes studies which do document significant differences in student outcomes based on the mode of education delivery. Studies are listed from 1928 to the present with a search engine to find those of particular interest.

Are you interested in the current and future federal role in education research?

The Aspen Institute. (2013, December 3). Leveraging learning: The evolving role of federal policy in education research. Washington, DC: Author. https://www.aspeninstitute.org/publications/leveraginglearning/ 

This document contains a "series of essays, infographics, and briefs that outline the current federal landscape of education R&D and envision its future." It begins with an essay titled A Brief History of Federal Efforts to Improve Education Research. The section on the Current Federal Role in Education Research identifies research and development centers throughout the U.S. and also provides an overview of investing in innovation. Highlights within the section on the Future of the Federal Role in Education Research include Why We Need a DARPA for Education (ARPA-ED) and New Directions in Education Research: The Federal Role.

 

Back to top | Education Research: Page 1  |  2

 

References

114th Congress of the United States. (2015). Every Student Succeeds Act. https://www.ed.gov/laws-and-policy/laws-preschool-grade-12-education/every-student-succeeds-act-essa

Andrews, D. (2012, Winter). In search of feasible fidelity. Better: Evidence-based Education. https://web.archive.org/web/20210417063127/http://www.betterevidence.org/us-edition/issue-9/in-search-of-feasible-fidelity/

Barshay, J. (2021, June 21). PROOF POINTS: Why parental consent often isn't required in education research. The Hechinger Report. https://hechingerreport.org/proof-points-why-parental-consent-often-isnt-required-in-education-research/   

Dick, B. (1997). What is "action research"? Occasional pieces in action research methodology, # 2. http://www.aral.com.au/arm/op000.html

Dick, B. (1998a). Rigour (1). Occasional pieces in action research methodology, # 13. http://www.aral.com.au/arm/op000.html

Dick, B. (1998b). Grounded theory (2).  Occasional pieces in action research methodology, # 17. http://www.aral.com.au/arm/op000.html

Digital Promise. (2015). Evaluating studies of ed-tech products. https://digitalpromise.org/2015/12/04/how-strong-is-the-evidence-a-tool-to-evaluate-studies-of-ed-tech-products/

Dynarski, M. (2015, December 10). Using research to improve education under the Every Student Succeeds Act. https://www.brookings.edu/research/using-research-to-improve-education-under-the-every-student-succeeds-act/

Dynarski, M., Agodini, R., Heaviside, S., Novak, T., Carey, N., Campuzano, L., et al. (2007). Effectiveness of reading and mathematics software products: Findings from the first student cohort. (Publication No. 2007-4005). U.S. Department of Education Institute of Education Sciences. https://ies.ed.gov/ncee/pdf/20074005.pdf

Edwards, L. (2024, March 20). Perplexity AI: How to use it to teach. Tech & Learning. https://www.techlearning.com/how-to/perplexity-ai-how-to-use-it-to-teach

Edwards, L. (2024, March 26). Elicit: How to use it to teach. Tech & Learning. https://www.techlearning.com/how-to/elicit-how-to-use-it-to-teach

Ferrance, E. (2000). Action research.  Northeast and Islands Regional Educational Laboratory at Brown University: Themes in Education series. https://www.brown.edu/academics/education-alliance/sites/brown.edu.academics.education-alliance/files/publications/act_research.pdf

Fitzer, K. M., Freidhoff, J. R., Fritzen, A., Heintz, A., Koehler, J., Mishra, P., Ratcliffe, J., Zhang, T., Zheng, J., & Zhou, W. (2007). Guest editorial: More questions than answers: Responding to the reading and mathematics software effectiveness study. Contemporary Issues in Technology and Teacher Education, 7(2), 1-6. https://citejournal.org/volume-7/issue-2-07/editorial/article1-html-7/

Gaskell, M. (2024, March 19). 3 useful AI research tools for educators. Tech & Learning. https://www.techlearning.com/news/3-useful-ai-research-tools-for-educators

Institute of Education Sciences & National Science Foundation. (2013, August). Common guidelines for education research and development. Washington, DC: Authors. https://ies.ed.gov/pdf/CommonGuidelines.pdf

Laitsch, D. (2003, August). Into the mix: Policy, practice, and research. ASCD InfoBrief, Issue 34. Available in Archived Issues: http://web.archive.org/web/20210417082527/http://www.ascd.org/publications/newsletters/policy-priorities/aug03/num34/toc.aspx

Mageau, T. (2004, January). Determining 'What Works' - An interview with Dr. Grover 'Russ' Whitehurst. THE Journal, 31(6), 32-27. https://thejournal.com/Articles/2004/01/01/Determining-What-Works--An-Interview-With-Dr-Grover-Russ-Whitehurst.aspx

Mathematica. (2015-2018). Rapid-cycle tech evaluations accelerate decision making. https://mathematica.org/projects/rapid-cycle-tech-evaluation

Mathematica. (2020). What Works Clearinghouse: 2007-2020. https://mathematica.org/projects/what-works-clearinghouse

Mathewson, T., & Butrymowicz, S. (2020, May 20). Ed tech companies promise results, but their claims are often based on shoddy research. The Hechinger Report. https://hechingerreport.org/ed-tech-companies-promise-results-but-their-claims-are-often-based-on-shoddy-research/

Newman, D., Jaciw, A., & Lazarev, V. (2017, June 30). Guidelines for Conducting and Reporting EdTech Impact Research in U.S. K-12 Schools. Palo Alto, CA: Empirical Education Inc. https://www.empiricaleducation.com/research-guidelines/

North Central Regional Educational Laboratory. (2004). NCREL quick key 7: Understanding the No Child Left Behind Act of 2001: Scientifically based research. Naperville, IL: Learning Point Associates. https://files.eric.ed.gov/fulltext/ED518709.pdf

Reeves, D. (2006). The learning leader: How to focus school improvement for better results. Alexandria, VA: ASCD. Available: https://amzn.to/3Tno2FY

Reeves, D. (2010). Transforming professional development into student results. Alexandria, VA: ASCD. Available: https://amzn.to/3PswnqS

Simpson, R. L., LaCava, P. G., & Graner, P. S. (2004). The No Child Left Behind Act: Challenges and implications for educators. Intervention in School and Clinic, 40(2), 67-75. https://www.researchgate.net/publication/249832851_The_No_Child_Left_Behind_ActChallenges_and_Implications_for_Educators

Slavin, R. (2015, December 8). Evidence and the ESSA. HuffPost. https://www.huffpost.com/entry/evidence-and-the-essa_b_8750480

Slavin, R. (2017, February 9). Evidence for ESSA and the What Works Clearinghouse. HuffPost. https://www.huffpost.com/entry/evidence-for-essa-and-the-what-works-clearinghouse_b_589c7643e4b02bbb1816c369

Tillotson, J. W. (2000). Studying the game: Action research in science education. The Clearing House, 74(1), 31-34. https://www.researchgate.net/publication/234675303_Studying_the_Game_Action_Research_in_Science_Education

Viadero, D. (2007, July 12). U.S. switches contractors for "What Works" research site. Education Week, 26(43). https://www.edweek.org/ew/articles/2007/07/18/43whatworks.h26.html?tmp=541546481

 


Back to top | Education Research: Page 1  |  2

Go to Related Topic: State and National Education Standards and The Best Rated Standards Resources