This section on Education Research includes supporting pages of resources associated with State and National Standards.
Get the latest reports on what's happening at the classroom, state, and national levels at the Center for Education Policy. You'll find reports of value in the categories for Common Core State Standards, Expanded Learning Time, Federal Education Programs, High School Exit Exams; Public School Facts, History, and Education Issues; State Testing Data and Student Achievement.
Readers might be interested in How and why standards can improve student achievement by Scherer (2001).
During the era of NCLB, "To access much of the federal funding allocated through NCLB, states and districts [were] required to adopt programs and policies that [were] supported by scientifically based research." Teachers were expected "to adapt their practice to reflect the competencies necessary to implement the new programs" (Laitsch, 2003, para. 2). North Central Regional Educational Laboratory (2004) discussed and provided examples of the six components of scientifically based research (SBR). SBR must:
The U.S. Department of Education set up the What Works Clearinghouse (WWC) in 2002 to provide easily searchable databases containing such scientific evidence. During its first year, the WWC focused on seven topics: interventions for beginning reading; curriculum-based interventions for increasing K-12 math achievement; preventing high school dropout; increasing adult literacy; peer-assisted learning in elementary schools for reading, mathematics, and science gains; interventions to reduce delinquent, disorderly, and violent behavior in middle and high schools; and interventions for elementary school English language learners.
The WWC defined an educational intervention as a product, practice, policy, or program that has been shown to be effective. "An intervention is effective if it improves outcomes relative to what would have been seen without the intervention" (WWC Glossary).
In Determining 'What Works,' Therese Mageau (2004) of T.H.E. Journal and Dr. Grover 'Russ' Whitehurst, who was director of the Institute of Education Sciences at that time, discussed the question of effectiveness. When the WWC was started, it gave preference to studies involving randomized trials because randomized trials are the gold standard for determining effectiveness. Whitehurst commented, however, that "there are two prongs to evidence-based practice" (p. 34). Something that may work in one location might not work in another. Therefore, evidence-based practice includes the scientific studies of effectiveness as found on the WWC, and the integration of professional wisdom based on "locally collected performance data that indicates whether changes are occurring in the desired direction when a particular program or practice is implemented" (p. 34). In June 2010, the WWC expanded its acceptable research standards to include non-experimental single-case design and regression-discontinuity studies, documentation of which can be found at the WWC website. The WWC Procedures and Standards Handbook Version 3.0, released in March 2014, is now in use for reviewing studies.
Fortunately, the Every Student Succeeds Act of 2015 recognizes other forms of research. This law, which replaces NCLB, "requires that states and districts use evidence-based interventions to support school improvement" (Dynarski, 2015, para. 1; Slavin, 2015; 114th Congress, 2015). ESSA defines four categories. Evidence-based per ESSA means "an activity, strategy, or intervention" that "demonstrates a statistically significant effect on improving student outcomes or other relevant outcomes" based on one of the following:
1. Strong evidence, from "at least 1 well-designed and well-implemented experimental study";
2. Moderate evidence, from "at least 1 well-designed and well-implemented quasi-experimental study"; or
3. Promising evidence, from "at least 1 well-designed and well-implemented correlational study with statistical controls for selection bias."
4. Evidence-based can also mean a program that "demonstrates a rationale based on high quality research findings or positive evaluation that such activity, strategy, or intervention is likely to improve student outcomes or other relevant outcomes; and includes ongoing efforts to examine the effects of such activity, strategy, or intervention." (114th Congress, 2015, p. S.1177-290)
To take advantage of the results of studies while exercising their professional wisdom, educators need to read and analyze professional journals and documents from research organizations that report on those results. According to Simpson, LaCava, and Graner (2004), educators should be able to clearly identify the following:
Unfortunately, "[e]vidence-based practices are not being widely adopted in classrooms," according to David Andrews (2012, p. 5) of Johns Hopkins University School of Education. There's much to consider. "When deciding whether to adopt an evidence-based approach, educators should weigh the costs and benefits, and be prepared to implement it with fidelity. ... Fidelity means implementing the approach exactly as it was developed and tested" (p. 4), which is not an easy task. Andrews provided some key questions to consider when conducting a cost-benefit analysis, as costs are both direct and indirect. Sustainability of the innovation is also an issue, as enthusiasm for the innovation must be maintained.
Mark Dynarski (2015) elaborated on a two-stage method for Using research to improve education under the Every Student Succeeds Act, which takes into consideration how education programs and practices that have been shown to be effective can be implemented in local contexts.
In 2017 the Center for Research and Reform in Education (CRRE) at Johns Hopkins University developed Evidence for ESSA. This free website is designed to help educators make informed decisions when selecting K-12 programs (e.g., in math and reading) that meet the strong, moderate, and promising evidence criteria per the ESSA. "It identifies for you the level of evidence under ESSA that is associated with a given program, provides you with a snapshot of what the program looks like and costs, identifies the grades, communities, and children included in the program’s evaluations, and points you in the direction of more information about the program, its evaluations, and implementation" (About section). Per Robert Slavin (2017) of the CRRE, the developers of this site view it as a supplement to the WWC, not a competitor, and note that the WWC was not designed to serve evidence standards per the ESSA. For example, the WWC does not provide the "promising" category (Slavin, 2017).
Educators might require professional development on conducting research and implementation of new curricula. CT4ME's Education Research section will give you a good start on this topic, as well as how to conduct your own action research.
HOT News: The Institute of Education Sciences announced on July 11, 2007 that the U.S. Department of Education awarded Mathematica Policy Research a five-year contract to take over management of the What Works Clearinghouse (Viadero, 2007). The WWC got off to a rocky start in identifying high-quality research that met its standards. As of Viadero's report, the web site listed just "74 reviews of research on reading instruction, dropout prevention, teaching English-language learners, and other topics" (para. 5). In 2007, it was hoped that this change would make the WWC more responsive and relevant to the needs of educators, as there was a call for "expanding the range of research designs that qualify as sound evidence, and introducing practical guides and other products that educators and decision makers in the trenches might see as useful" (Mission Shift section).
Indeed the WWC has become more valuable to educators. Since 2007, Mathematica Policy Research (MPR) has continued to play an active role in administering the WWC, "including expanding the resources available to educators, revising standards and authoring new standards for reviewing education research, and exploring an array of topics that are at the forefront of education decision making" (MPR on the WWC, 2016, para. 2).
Even better news--
In August 2013, the Institute of Education Sciences and the National Science Foundation released their joint report: Common Guidelines for Education Research and Development. The document describes "shared understandings of the roles of various types or "genres" of research in generating evidence about strategies and interventions for increasing student learning. ... [and] describes the agencies’ expectations for the purpose of each type of research, the empirical and/or theoretical justifications for different types of studies, types of project outcomes, and quality of evidence" (p. 7). Six types of research are examined:
Foundational research "provides the fundamental knowledge that may contribute to improved learning and other relevant education outcomes."
Early-stage or exploratory research "examines relationships among important constructs in education and learning to establish logical connections that may form the basis for future interventions or strategies to improve education outcomes. These connections are usually correlational rather than causal."
Design and development research "develops solutions to achieve a goal related to education or learning, such as improving student engagement or mastery of a set of skills." Research projects may include pilot tests.
Efficacy research "allows for testing of a strategy or intervention under "ideal" circumstances, including with a higher level of support or developer involvement than would be the case under normal circumstances."
Effectiveness research "examines effectiveness of a strategy or intervention under circumstances that would typically prevail in the target context."
Scale-up research "examines effectiveness in a wide range of populations, contexts, and circumstances, without substantial developer involvement in implementation or evaluation." (pp. 8-9)
Digital Promise (2015) developed Evaluating Studies of Ed-Tech Products, a tool to "help district leaders evaluate studies on ed-tech product effectiveness in order to decide whether it is necessary to run a pilot" (Purpose section). There are 12 questions within four sections: "product information, study relevancy, study source, and study design" (Instructions section).
What Works Clearinghouse (WWC) includes intervention reports on multiple math programs. The following are illustrations of those:
Cognitive Tutor Algebra I (June 2016) was found to have a "medium to large" effect on math achievement.
Connected Mathematics Project (January 2010) for grades 6-8 was found to have no discernible effects on math achievement.
Everyday Mathematics (November 2015) for preK-6 was found to have potentially positive effects on math achievement for elementary students.
The Expert Mathematician (October 2006) for middle school students was found to have a potentially positive effect on math achievement.
I CAN Learn (February 2012) was found to have no discernible effects on math achievement for high school students.
I CAN Learn Pre-Algebra and Algebra (March 2009) was found to have positive effects on math achievement.
Saxon Math (May 2013) for grades K-5 was found to have potentially positive effects on mathematics achievement for elementary school students.
Saxon Algebra I (May 2016) was found to have no discernible effects for secondary students.
Scott Foresman-Addison Wesley Elementary Mathematics (May 2013) for grades preK-6 was found to have mixed effects on mathematics achievement for elementary school students.
University of Chicago School Mathematics Project Algebra (May 2016) was found to have potentially positive effects on general mathematics achievement and algebra for secondary students.
The U.S. Department of Education announced a study on technology's role in raising achievement on February 13, 2004, which would be of interest to mathematics educators. The study would use a random-assignment design to evaluate 16 computer-based reading and math products over three years to determine the effectiveness of technology in bolstering student achievement. Mathematica Policy Research, Inc. and SRI International would be involved in the study to assess the effectiveness of learning technology in teaching reading in grade 1, reading comprehension in grade 4, pre-algebra in grade 6, and algebra in grade 9. Pre-algebra products included in the study would be Successmaker from Pearson Digital Learning, SmartMath from CompuTaught, Inc., Achieve Now from PLATO Learning, Inc., and Larson Pre-Algebra from Meridian Creative Group. Algebra products included Cognitive Tutor from Carnegie Learning, Inc., Algebra from PLATO Learning, Inc., and Larson Algebra from Meridian Creative Group. The study actually was conducted on 15 products.
HOT: On April 4, 2007, the U.S. Department of Education released its report for Congress, Effectiveness of Reading and Mathematics Software Products: Findings from the First Student Cohort. It received immediate reaction from leaders around the country concerned about the effectiveness of technology in education and results of this study. A key finding noted by Mathematica Policy Research, Inc. indicated, "Test scores were not significantly higher in classrooms using the reading and mathematics software products than those in control classrooms. In each of the four groups of products (reading in first grade and in fourth grade, mathematics in sixth grade, and high school algebra), the evaluation found no significant differences in student achievement between the classrooms that used the technology products and classrooms that did not." Read this full report: http://ies.ed.gov/ncee/pubs/20074005/
Readers are cautioned about making quick decisions about technology effectiveness based on the results of this study, as products were not implemented as intended, nor used to an extent to make a difference in achievement.
Those who are not familiar with reading and conducting research studies will appreciate a few tips and documents on this topic. Commentary on and additional resources for conducting Action Research are also presented.
A Policymaker's Primer on Education Research: How to Understand It, Evaluate It, and Use It (February, 2004) is by Patricia Lauer at Mid-continent Research for Education and Learning. The Primer is intended to help readers understand what education research says, whether it's trustworthy and what it means for policy. Readers will also learn some of the technical statistical and scientific concepts touched upon in research reports and gain a deeper understanding of education research methodology. Practical tools, including a flowchart for analyzing research and an understanding statistics tutorial, are included.
How to Read a Research Study Article by D. Blewett of the College of DuPage in Illinois offers advice on reading research articles. A research article generally is structured with an Abstract, Introduction, Methods, Results, Discussion, and References, and he summarizes the specific content of each section. It will take more than one reading to fully understand a study: start by reading the abstract, the first paragraph or so of the introduction, and the hypothesis, then skip to the discussion to find how the study turned out. Go back to the methods section, focusing on how the hypotheses were tested, then the results, and re-read the discussion. Finally, read the entire report from first page to last for greater understanding.
Identifying and Implementing Educational Practices Supported by Rigorous Evidence: A User Friendly Guide, an 18-page guide from The U.S. Department of Education, will help educators determine whether an educational intervention is supported by rigorous evidence. It contains a three-step evaluation process, a checklist to use in the process, definitions of research terms, and what to look for in research studies.
Making Sense of Research for Improving Education, an issue brief (April 2003) from the Charles A. Dana Center at the University of Texas at Austin, will help practitioners clarify their understanding of what scientifically based research is and learn more about research designs used in education research: experimental, quasi-experimental, correlational, and case study. When using research to make decisions about teaching and learning, practitioners should consider the research's relevance, generalizability to their particular circumstances, statistical soundness, and preponderance of evidence.
What Works Clearinghouse Glossary is intended to help users to understand the terms used in education research so that they can be more effective consumers of such research.
The following online texts are for anyone who really wants to understand the statistics involved in research:
HyperStat Online Statistics Textbook by D. Lane also contains links to free statistical analysis tools and instructional demos.
Introductory Statistics: Concepts, Models, and Applications by D. W. Stockburger. This Web Edition also includes many examples showing how to use SPSS/WIN 7.0 to do statistical procedures.
Introductory Statistics is a free text from OpenStax CNX, operating out of Rice University. This one-semester course is mainly for students who are not majoring in math or engineering. "It focuses on the interpretation of statistical results, especially in real world settings." It's available at http://cnx.org
Statistics Every Writer Should Know by Robert Niles. Learn about mean, median, percent, per capita, standard deviation, margin of error, data analysis, and more. Link to sites for data sources and interactive help to select the right statistical test.
StatSoft, Inc. (2004). Electronic Statistics Textbook. Tulsa, OK: StatSoft. According to StatSoft developers, "The Electronic Textbook begins with an overview of relevant elementary concepts and continues with a more in depth exploration of specific areas of statistics, organized by "modules," accessible by buttons, representing classes of analytic techniques. A glossary of statistical terms and a list of references for further study are included."
SticiGui by P. B. Stark is an online introductory statistics text. According to Stark, materials "include interactive data analysis and demonstrations, machine-graded online assignments and exams (a different version for every student), and a text with dynamic examples and exercises, applets illustrating key concepts, and an extensive glossary."
Visual ANOVA by T. Malloy is a tool "meant to direct your attention to relationships among the components of ANOVA (Analysis of Variance) by representing them visually." It is designed to help you understand the F test.
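As a taste of what these texts cover, here is a minimal sketch (Python standard library only, with made-up numbers) of a few statistics that recur in education research reports: mean, median, standard deviation, the margin of error for a poll proportion, and the one-way ANOVA F statistic that the Visual ANOVA tool visualizes.

```python
import math
from statistics import mean, median, stdev

# Hypothetical test scores for one class
scores = [72, 85, 90, 64, 88, 79, 95, 70]
print(mean(scores))              # 80.375 (arithmetic average)
print(median(scores))            # 82.0   (middle value when sorted)
print(round(stdev(scores), 2))   # sample standard deviation

# Margin of error (95% confidence) for a poll in which 54% of
# n = 1000 respondents favor a proposal: z * sqrt(p * (1 - p) / n)
p, n, z = 0.54, 1000, 1.96
margin_of_error = z * math.sqrt(p * (1 - p) / n)
print(f"margin of error = ±{margin_of_error:.1%}")   # about ±3.1%

# One-way ANOVA F statistic for three (invented) classes:
# F = between-groups mean square / within-groups mean square
groups = [
    [83, 79, 90, 86],   # class A
    [71, 75, 68, 74],   # class B
    [88, 92, 85, 91],   # class C
]
grand_mean = mean(x for g in groups for x in g)
k = len(groups)                        # number of groups
total_n = sum(len(g) for g in groups)  # total observations

ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)

F = (ss_between / (k - 1)) / (ss_within / (total_n - k))
print(f"F({k - 1}, {total_n - k}) = {F:.2f}")  # a large F suggests group means differ
```

A large F relative to the critical value for the given degrees of freedom indicates that at least one group mean differs from the others; the texts above explain how to interpret it formally.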
In general, research studies involve qualitative, quantitative, or mixed methodologies. Research is conducted to solve problems or contribute to their solutions. It involves identifying and justifying an issue you wish to investigate, which is framed as a problem statement. To better understand the problem, you gather and align resources (e.g., people, policies, data, and what others have done in regard to the problem) and information to study the issue (a review of published literature on the problem). A plan (methodology) is formed to conduct the research, which is then carried out. The research is analyzed to develop a solution. Recommendations are made for creating an action plan to implement ideas and results of the research. Finally, results and recommendations are released, perhaps via a presentation or published document.
Do you need to evaluate the effectiveness of educational technology products in your school?
The U.S. Department of Education, Office of Educational Technology, in partnership with Mathematica Policy Research, developed the Ed Tech RCE Coach. This web-based interactive tool is free. It guides administrators through the process of conducting rapid-cycle evaluation research with a goal of providing evidence of effectiveness of a particular technology for continued use in the school or district, or for its best implementation.
Per Mathematica (2015-2017), the tool guides users to follow five steps: get started, plan the research, prepare the data, analyze the data, summarize the findings. The tool recommends the approach to evaluate technology, helps in determining the research question, includes steps for identifying data sources and creating a clean data file, analyzes the data and delivers results, and compiles results into a document that can be shared.
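The Coach itself is a web tool, but the core comparison behind a rapid-cycle evaluation can be sketched in a few lines. This hypothetical example (all names and numbers invented) compares mean test-score gains in classrooms piloting a product against comparison classrooms:

```python
from statistics import mean

# (pre-test, post-test) score pairs per student; numbers are invented
pilot = [(62, 74), (58, 70), (71, 80), (65, 72), (60, 75)]
comparison = [(63, 68), (59, 64), (70, 74), (66, 69), (61, 67)]

pilot_gain = mean(post - pre for pre, post in pilot)            # 11
comparison_gain = mean(post - pre for pre, post in comparison)  # 4.6
effect = pilot_gain - comparison_gain   # naive difference in mean gains

print(f"estimated effect = {effect:.1f} points")
```

A real rapid-cycle evaluation would also check that the two groups were comparable at baseline and test whether the difference is statistically significant, which is precisely the work the Ed Tech RCE Coach automates.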
American Evaluation Association identifies online resources of interest to anyone doing evaluations. Their collection includes online multi-chapter handbooks and texts on various methodologies, software links for qualitative data analysis and developing and administering surveys, and more.
Brief Guide to Questionnaire Development by Robert Frary (2000, Office of Measurement and Research Service, Virginia Polytechnic Institute) is useful for collecting factual information and opinions in survey research.
Educational Research Basics by Del Siegle at University of Connecticut introduces several topics to beginning researchers: types of research, ethics and informed consent, single subject research, qualitative research, content analysis, historical research, action research, experimental research, statistical tests (e.g., correlations, t-tests, ANOVA, regression, Chi-Square) and much more.
Electronic Resources for Research Methods by T. D. Wilson, Ph.D., Professor Emeritus, University of Sheffield, UK, is comprehensive.
International Journal of Qualitative Methods (ISSN 1609-4069) is a peer-reviewed open-access journal of the International Institute for Qualitative Methodology at the University of Alberta, Canada, and its international affiliates. The journal "publishes papers that report methodological advances, innovations, and insights in qualitative or mixed methods studies; it also publishes funded full studies using qualitative or mixed-methods" (Home page description).
Multidisciplinary methods in educational technology research and development (2007), by Justus Randolph, is an open-access book published by HAMK Press in Hämeenlinna, Finland.
Research 101 from the University of Washington addresses how to conduct research, including how to distinguish between scholarly/popular communications and primary/secondary sources, which are often problems for learners.
Research Methods Knowledge Base is a fully hyperlinked online text by Dr. William M.K. Trochim of Cornell University. "It covers the entire research process including: formulating research questions; sampling (probability and nonprobability); measurement (surveys, scaling, qualitative, unobtrusive); research design (experimental and quasi-experimental); data analysis; and, writing the research paper. It also addresses the major theoretical and philosophical underpinnings of research including: the idea of validity in research; reliability of measures; and ethics."
Research Randomizer, part of the Social Psychology Network, is a free "quick way to generate random numbers or assign participants to experimental conditions. Research Randomizer can be used in a wide variety of situations, including psychology experiments, medical trials, and survey research." This is a great tool for random sampling and random assignment, which includes tutorials.
ResearchReady is a cloud-based resource that teaches students in K-12 or college the entire research process. You can also evaluate web pages with it. Assessments are available to help monitor understanding of research skills. A free trial is available, as are pricing options for schools.
The Software & Information Industry Association (SIIA), the principal association representing the software and digital content industries, is mindful of the need to contribute research on product effectiveness. The SIIA released a report on May 13, 2010, "Conducting and Reporting Product Evaluation Research: Guidelines and Considerations for Educational Technology Publishers and Developers," which provides 22 standards of research best practices for publishers and developers of educational software and other instructional technologies. The guidelines do not endorse use of a particular research methodology, but do provide a perspective to consider if you are planning a research study. They can also benefit school leaders who select and implement technology-based products and services, according to SIIA's media release at that time.
Survey and Questionnaire Design is a free tutorial of over 20 pages from StatPac Inc. on the complete process to conduct a survey study and design a questionnaire. Highly recommended.
Tools for Preparing Literature Reviews from George Washington University is intended primarily for masters and doctoral learners. There are tutorials for conducting literature searches efficiently, assessing whether or not findings in reports can be relied upon, and integrating various studies on a topic.
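The random assignment that a tool like Research Randomizer performs can also be sketched with Python's standard library. Participant names and condition labels here are hypothetical:

```python
import random

participants = [f"student_{i:02d}" for i in range(1, 25)]  # 24 participants
conditions = ["treatment", "control"]

rng = random.Random(42)   # a fixed seed makes the assignment reproducible
rng.shuffle(participants)

# Split the shuffled list into equal-sized conditions
size = len(participants) // len(conditions)
assignment = {
    cond: participants[i * size:(i + 1) * size]
    for i, cond in enumerate(conditions)
}

for cond, members in assignment.items():
    print(cond, "->", len(members), "participants")
```

Shuffling before splitting guarantees every participant has an equal chance of landing in either condition, which is the defining property of random assignment.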
The following are free or low-cost tools for conducting online surveys:
NVivo from QSR International is "software that supports qualitative and mixed methods research. It lets you collect, organize, and analyze content from interviews, focus group discussions, surveys, and ... social media data, YouTube videos, and web pages" (Product information section). A variety of pricing options is available, and there is a free trial.
Action research "refers to a disciplined inquiry done by a teacher with the intent that the research will inform and change his or her practices in the future. This research is carried out within the context of the teacher’s environment—that is, with the students and at the school in which the teacher works—on questions that deal with educational matters at hand" (Ferrance, 2000, p. 7). While it is often done by individual teachers, the research might also involve collaborative groups of teachers, or be school-wide or district-wide. Any number of concerns might be addressed. However, the overriding goal is to make things better and to improve the quality of educational outcomes. For example, "[t]he teacher may be seeking solutions to problems of classroom management, instructional strategies, use of materials, or student learning" (p. 9). Collectively, "a school may have a concern about the lack of parental involvement in activities, and is looking for a way to reach more parents to involve them in meaningful ways" or staff might wish "to examine their state test scores to identify areas that need improvement, and then determine a plan of action to improve student performance" (p. 10).
According to John Tillotson (2000), a reason why teachers lack interest in educational research is that many of the topics chosen for study seldom have direct implications for what happens in the classroom. "One doesn't have to look far in most public schools today to find outdated teaching practices and assessment strategies, in spite of ample research findings that suggest more effective alternatives" (p. 31, par. 3). He suggested that an expansion of action research at the K-12 level is a promising solution to the dilemma of research failing to inform practice.
Perhaps one of the reasons why action research might not be conducted is that it "has often been stereo-typed as the poor second cousin to "real" research, which is distinguished by quantitative methods and experimental designs" (Reeves, 2010, p. 5). Yet, Douglas Reeves (2010) stated, "Experimental research and action research are not in opposition to one another; the latter applies the former to the real world of the classroom" (p. 5).
Action research, in this author's view, would satisfy a mandate for educators to employ research-based instructional materials and methodologies in their instruction that get results. It fits the exercise of using a district's professional wisdom based on "locally collected performance data that indicates whether changes are occurring in the desired direction when a particular program or practice is implemented" (Mageau, 2004, p. 34). However, action research serves for the solution of a local problem (e.g., How can we make an effective change that would be beneficial here?) or provides a framework for exploration, and would not necessarily be appropriate if a researcher wants to ultimately generalize the results from a question under investigation to other settings. One also needs to keep in mind that research questions, goals, or purposes can evolve during action research.
Although action research is "usually qualitative and participatory" (Dick, 1997, par. 5), Bob Dick (1998a) acknowledges that "action research is not synonymous with 'qualitative' either" (para. 7). Essentially, the action research cycle involves problem formulation, data collection, data analysis, reporting results, and action planning (Tillotson, 2000). A study typically runs through many such cycles, and the theories that emerge are intended to guide actions, such as: "In situation S, to produce outcomes O1, O2, ..., try actions A1, A2, ..." Further, "[i]n specifying the important features of the situation it also allows its generalizability to be tested in other similar settings" (Dick, 1998b, para. 7).
Sometimes those who conduct action research have not employed its methodology carefully enough that the researcher's plan could be replicated by others. Poor planning might lead to disappointing results. While action research "is not a substitute for quantitative research," it is a "contextual lens for other research." As the teacher is the researcher, limitations of action research include teacher bias and a "good deal of opinion and rhetoric" (Reeves, 2010, p. 75). If the method is employed, Reeves' model of action research, which was tested with 81 teams of teachers and administrators, might be considered. There are four elements (pp. 80-81):
The following resources provide more information on the process and examples from completed studies:
You don't need perfect research to support a proposed change!
Douglas Reeves (2006) reminded us, "There are hardly any true randomly assigned groups in educational research, largely due to ethical constraints" (p. 97). "The quality model that prevails throughout successful organizations is not waiting for perfection but rather 'Try it, test it, improve it.' " (p. 98).
Don't forget to teach your students how to conduct research!
The Kentucky Virtual Library How to Do Research is an interactive and engaging infographic designed for young learners. It's high on the list of resources on this topic and provides a step-by-step tutorial: how to plan, where to find information, how to take notes, how to use the information, and then create the report to share with others. Finally, it includes questions to evaluate your efforts.
In Researching in a Digital World: How do I teach my students to conduct quality online research?, Erik Palmer (2015) presented a step-by-step guide to teach learners at all grade levels how to conduct more responsible research in an internet environment.
ReadWriteThink.org from the National Council of Teachers of English has a series of Classroom Resources. Among those you will find six lessons for grades 3-5 that illustrate what research looks like. Enter the search phrase: Research Building Blocks. The six lessons take students through the process of finding sources, exploring information in those sources, gathering details, and citing the sources that they use.
Introduction to Research in the Classroom focuses on mathematics research and the Making Mathematics project from the Education Development Center, which was for grades 7-12. Related to this is an abridged version of the longer FAQ list for Making Mathematics.
Searching the web with key phrases will yield some education research, and there are scholarly search engines and academic databases you can use, which CT4ME lists on our technology integration page for Building Internet, Search and Citation Skills. In addition to the resources below, journals are a good source of education research. See the list of Journals at CT4ME.
Best Evidence Encyclopedia: http://www.bestevidence.org/ "The Best Evidence Encyclopedia is a free web site created by the Johns Hopkins University School of Education's Center for Data-Driven Reform in Education (CDDRE) under funding from the Institute of Education Sciences, U.S. Department of Education. It is intended to give educators and researchers fair and useful information about the strength of the evidence supporting a variety of programs available for students in grades K-12. The Best Evidence Encyclopedia provides summaries of scientific reviews produced by many authors and organizations, as well as links to the full texts of each review" (About the Best Evidence Encyclopedia section). You will find reviews on K-12 mathematics, reading (including programs for struggling readers and English language learners), and comprehensive school reform. You can also subscribe to their magazine Better, which is dedicated to providing its readers with evidence-based education so they can "use what works." Mathematics educators will be particularly interested in mathematics programs that have shown evidence of helping elementary, middle, and high school learners succeed, and in the effectiveness of technology applications for enhancing mathematics achievement.
Center for Improving Learning of Fractions: https://sites.google.com/a/udel.edu/fractions/ administered at the University of Delaware, focuses on improving math instruction for elementary and middle school children who have difficulty with math concepts, specifically fractions. Research has been made possible via a grant from the Institute of Education Sciences.
Center for the Study of Teaching and Policy: http://depts.washington.edu/ctpmail/ at the University of Washington "investigates efforts to improve the quality of teaching and learning, the teacher workforce, and the systems of support for teachers’ work, in various contexts and at multiple levels of the K-12 educational system."
Center on Instruction: http://www.centeroninstruction.org/resources.cfm?category=math&subcategory=&grade_start=&grade_end "The Center on Instruction offers materials and resources on mathematics to build educators’ knowledge of instruction for students with low achievement in mathematics, improve professional development models for math teachers, and build teachers’ skills in monitoring student growth toward important math outcomes."
Digital Promise Research Map: http://researchmap.digitalpromise.org/ is a web tool to find education research dating back to 2005 from nearly 100,000 journal articles in education and the learning sciences. Users can select from two different viewing formats (chord, network), each showing interconnections among topics. A list view is also available. Topics include:
Directory of Open Access Journals: http://www.doaj.org/ DOAJ aims to be comprehensive (i.e., all subjects and languages) and covers free, full-text, quality-controlled scientific and scholarly journals. Quality control means that the journals "must exercise peer-review or editorial quality control to be included." This is a huge plus for researchers.
Education Policy Analysis Archives: http://epaa.asu.edu/ojs/ a peer-reviewed online journal of education research.
Education Resources Information Center (ERIC): http://eric.ed.gov/ sponsored by the U.S. Department of Education, Institute of Education Sciences. ERIC's search can be restricted to peer-reviewed only or full text articles. Search by descriptors such as mathematics instruction, mathematics achievement, mathematics education, academic achievement, teaching methods, program effectiveness, and more. You can search by source, author, publication date, publication type, education level, and audience. There is a Thesaurus that has multiple subcategories and related mathematical terms.
Educational Research Newsletter & Webinars: http://www.ernweb.com/ "Since 1988, Educational Research Newsletter has informed educators of recent research on reading, math, behavior management and raising student achievement with brief reports on the most useful and relevant findings from leading journals and organizations."
Evidence for ESSA: http://www.evidenceforessa.org/ is a free website from the Center for Research and Reform in Education at Johns Hopkins University. Its purpose is to provide educators with the most up-to-date and reliable information regarding K-12 programs (e.g., in math and reading) that meet the strong, moderate, and promising evidence criteria per the ESSA.
Institute of Education Sciences: http://ies.ed.gov/ is the primary research arm of the United States Department of Education. IES brings rigorous and relevant research, evaluation and statistics to our nation's education system.
International Association for the Evaluation of Educational Achievement (IEA): http://www.iea.nl/ is an independent, international cooperative of national research institutions and governmental research agencies. IEA has conducted more than 23 cross-national achievement studies since 1958. Examples include the Trends in International Mathematics and Science Study (1995, 1999, 2003, 2007), the Progress in International Reading Literacy Study (2001, 2006), the TIMSS-R Video Study of Classroom Practices, information technology in education (SITES-M1, SITES-M2, SITES 2006), and the Teacher Education and Development Study in Mathematics (TEDS-M), initiated in 2005. Publications are available, as well as current studies (e.g., TIMSS Advanced 2008 and PIRLS 2011).
International TIMSS and PIRLS Study Center: http://timss.bc.edu/, located at Boston College, is IEA's principal site for reporting on the Trends in International Mathematics and Science studies and the Progress in International Reading Literacy studies.
K-12 Computing Blueprint Research Watch: http://www.k12blueprint.com/research-watch contains summaries that examine the research. This is a good starting point for locating reports on the full studies in K-12 on topics such as one-to-one initiatives, e-learning, mobile devices, and technology effectiveness.
Learning and Technology Library (formerly EdITLib, Education & Information Technology Library): http://www.learntechlib.org/ LearnTechLib, sponsored by the Association for the Advancement of Computing in Education, is a digital library of "peer-reviewed research on the latest developments and applications in Learning and Technology." There are "100,000+ documents of published international journal articles, conference papers, e-books, and multimedia content from 200,000+ leading authors" (About, Content section).
MERLOT: http://www.merlot.org/merlot/index.htm Find peer-reviewed online teaching and learning materials in numerous categories. Education and mathematics/statistics are among them.
National Assessment of Educational Progress (NAEP), the Nation's Report Card: http://nces.ed.gov/nationsreportcard/ "is the only nationally representative and continuing assessment of what America's students know and can do in various subject areas. Assessments are conducted periodically in mathematics, reading, science, writing, the arts, civics, economics, geography, U.S. history and beginning in 2014, in Technology and Engineering Literacy" (About section). Representative samples of students from grades 4, 8, and 12 are selected for main assessments, but not all grades are assessed each time.
National Center for Education Statistics (NCES): http://nces.ed.gov/ is the primary federal agency for collecting and analyzing data related to education. Publications and annual reports are available.
National Council of Teachers of Mathematics: News, Research & Advocacy: http://www.nctm.org/news/ Read summaries, clips, and briefs connecting math education research to the classroom. You'll also find a list of books on research.
Promising Practices Network: http://www.promisingpractices.net/default.asp For example, see Team Accelerated Instruction [TAI]: Math, a program rated as "promising." TAI is a component of the MathWings program from the Success for All Foundation. Probable implementers are elementary schools. Other math programs (e.g., Everyday Mathematics, I CAN Learn Pre-Algebra and Algebra, Saxon Middle School Math, The Expert Mathematician, University of Chicago School Mathematics Project Algebra) that have not yet been fully evaluated by PPN, but are highly valued by other credible organizations, are listed among "Screened Programs": http://www.promisingpractices.net/programs_topic_list.asp?topicid=21
Rand Education: http://www.rand.org/education.html, a division of the Rand Corporation, includes numerous research areas with in-depth content available, which you can access via a drop-down menu at the top of the home page. For example, the section on Education and the Arts includes narrower topics such as academic achievement; education curriculum, policy, reform, and legislation; and educational technology. If you expand the list to "Explore All Topics," you'll find mathematics (under M), teachers and teaching (under T), and much more. You'll find abstracts, many full journal articles, policy reports, research briefs, news releases, and more.
Society for Research on Educational Effectiveness: http://www.sree.org/ advances and disseminates "research on the causal effects of education interventions, practices, programs, and policies." Three publications are sponsored:
What Works Clearinghouse: http://ies.ed.gov/ncee/wwc/ Easily find what works among the many topics addressed and within publications and reviews.
Are you interested in the "No Significant Difference Phenomenon"?
Readers might be interested in The No Significant Difference website, which is a companion to Thomas L. Russell's book, The No Significant Difference Phenomenon: A Comparative Research Annotated Bibliography on Technology for Distance Education (2001, IDECC, fifth edition). The book is a research bibliography documenting no significant difference in student outcomes based on the mode of education delivery (face to face or at a distance). Note: by distance, we mean instruction delivered via media such as radio, television, video, online and other technologies, which have been used historically and in current studies. In addition to studies that document no significant difference, the website includes studies which do document significant differences in student outcomes based on the mode of education delivery. Studies are listed from 1928 to the present with a search engine to find those of particular interest.
Are you interested in the current and future federal role in education research? If so, the following document is just what you need.
The Aspen Institute. (2013, December 3). Leveraging learning: The evolving role of federal policy in education research. Washington, DC: Author. Retrieved from https://www.aspeninstitute.org/ This document contains a "series of essays, infographics, and briefs that outline the current federal landscape of education R&D and envision its future." It begins with an essay titled A Brief History of Federal Efforts to Improve Education Research. The section on the Current Federal Role in Education Research identifies research and development centers throughout the U.S. and also provides an overview of investing in innovation. Highlights within the section on the Future of the Federal Role in Education Research include Why We Need a DARPA for Education (ARPA-ED) and New Directions in Education Research: The Federal Role.
114th Congress of the United States. (2015). Every Student Succeeds Act. Retrieved from http://www.ed.gov/esea
Andrews, D. (2012, Winter). In search of feasible fidelity. Better: Evidence-based Education. Retrieved from http://www.bestevidence.org/word/better_pages_4_5.pdf
Dick, B. (1998a). Rigour (1). Occasional pieces in action research methodology, # 13. Retrieved from http://www.scu.edu.au/schools/gcm/ar/arm/op000.html
Dick, B. (1998b). Grounded theory (2). Occasional pieces in action research methodology, # 17. Retrieved from http://www.scu.edu.au/schools/gcm/ar/arm/op017.html
Dick, B. (1997). What is "action research"? Occasional pieces in action research methodology, # 2. Retrieved from http://www.scu.edu.au/schools/gcm/ar/arm/op002.html
Digital Promise. (2015). Evaluating studies of ed-tech products. Retrieved from http://www.digitalpromise.org/blog/entry/how-strong-is-the-evidence-a-tool-to-evaluate-studies-of-ed-tech-products
Dynarski, M. (2015, December 10). Using research to improve education under the Every Student Succeeds Act. Retrieved from http://www.brookings.edu/research/reports/2015/12/10-improve-education-under-every-student-succeeds-act-dynarski
Ferrance, E. (2000). Action research. Northeast and Islands Regional Educational Laboratory at Brown University: Themes in Education series. Retrieved from http://www.brown.edu/academics/education-alliance/sites/brown.edu.academics.education-alliance/files/publications/act_research.pdf
Institute of Education Sciences & National Science Foundation. (2013, August). Common guidelines for education research and development. Washington, DC: Authors. Retrieved from http://ies.ed.gov/pdf/CommonGuidelines.pdf
Laitsch, D. (2003, August). Into the mix: Policy, practice, and research. ASCD InfoBrief, Issue 34. Available in Archived Issues: http://www.ascd.org/publications/newsletters/policy-priorities/aug03/num34/toc.aspx
Mageau, T. (2004, January). Determining 'What Works'. T.H.E. Journal, 31(6), 32-27.
Mathematica Policy Research. (2015-2017). Rapid-cycle tech evaluations accelerate decision making. Retrieved from https://www.mathematica-mpr.com/our-publications-and-findings/projects/rapid-cycle-tech-evaluation
Mathematica Policy Research. (2016). What Works Clearinghouse: 2007-2018. Retrieved from https://cire.mathematica-mpr.com/our-publications-and-findings/projects/what-works-clearinghouse
North Central Regional Educational Laboratory. (2004). NCREL quick key 7: Understanding the No Child Left Behind Act of 2001: Scientifically based research. Naperville, IL: Learning Point Associates. Available from http://files.eric.ed.gov/fulltext/ED518709.pdf
Reeves, D. (2010). Transforming professional development into student results. Alexandria, VA: ASCD.
Reeves, D. (2006). The learning leader: How to focus school improvement for better results. Alexandria, VA: ASCD.
Simpson, R. L., LaCava, P. G., & Graner, P. S. (2004). The No Child Left Behind Act: Challenges and implications for educators. Intervention in School and Clinic, 40(2), 67-75.
Slavin, R. (2017, February 9). Evidence for ESSA and the What Works Clearinghouse [Web log post]. Retrieved from http://www.huffingtonpost.com/entry/evidence-for-essa-and-the-what-works-clearinghouse_us_589c7643e4b02bbb1816c369
Slavin, R. (2015, December 8). Evidence and the ESSA [Web log post]. Retrieved from http://www.huffingtonpost.com/robert-e-slavin/evidence-and-the-essa_b_8750480.html
Tillotson, J. W. (2000). Studying the game: Action research in science education. The Clearing House, 74(1), 31-34.
Viadero, D. (2007, July 12). U.S. gives What Works Clearinghouse to new contractor. Education Week, 26(43). Retrieved from http://www.edweek.org/ew/articles/2007/07/18/43whatworks.h26.html?tmp=541546481