
 Research in Education Corner

Conducting Research and Finding Education Research on the Web

 


The Research in Education Corner includes supporting pages of resources associated with State and National Standards.

Research Resources (Page 1 of 2), this page, includes:

  • NCLB and Scientifically Based Research
  • Efforts to Determine Product Effectiveness
  • Reading and Conducting Research
  • Action Research
  • Finding Education Research

Research Resources (Page 2 of 2) includes summaries of selected research and resources related to:

  • Standards, Raising Achievement, Assessment, How People Learn
  • Technology Integration

Get Adobe Acrobat Reader, free software for the PDF files that appear on this page.


NCLB and Scientifically Based Research

 

Get the latest reports on what's happening at the classroom, state, and national levels at the Center on Education Policy.  You'll find reports in the categories for Testing, No Child Left Behind, High School Exit Exams, Student Achievement, and Standards-based Education Reform particularly valuable.

Readers might be interested in How and why standards can improve student achievement by Scherer (2001).

 

According to Dan Laitsch (2003), "To access much of the federal funding allocated through NCLB, states and districts will be required to adopt programs and policies that are supported by scientifically based research, and teachers will need to adapt their practice to reflect the competencies necessary to implement the new programs" (para. 2).  Educators might require professional development on conducting research and on implementing new curricula.  CT4ME's Research in Education Corner will give you a good start on this topic, as well as on how to conduct your own action research.

North Central Regional Educational Laboratory (2004) discussed and provided examples of the six components of scientifically based research (SBR).  SBR must:

  1. Use empirical methods.
  2. Involve rigorous and adequate data analyses.
  3. Rely on measurements or observational methods that provide reliable and valid data.
  4. Use either an experimental or quasi-experimental design.
  5. Allow for replicability.
  6. Undergo expert scrutiny. (pp. 3-4)

The U.S. Department of Education set up the What Works Clearinghouse (WWC) in 2002 to provide easily searchable databases containing such scientific evidence.  During its first year, the WWC focused on seven topics: interventions for beginning reading; curriculum-based interventions for increasing K-12 math achievement; preventing high school dropout; increasing adult literacy; peer-assisted learning in elementary schools for reading, mathematics, and science gains; interventions to reduce delinquent, disorderly, and violent behavior in middle and high schools; and interventions for elementary school English language learners.

The WWC defined an educational intervention as a product, practice, policy, or program that has been shown to be effective.  In Determining 'What Works,' Therese Mageau (2004) of T.H.E. Journal and Dr. Grover 'Russ' Whitehurst, then director of the Institute of Education Sciences, discussed the question of effectiveness.  When the WWC was started, it gave preference to studies involving randomized trials, the gold standard for determining effectiveness.  Whitehurst commented, however, that "there are two prongs to evidence-based practice" (p. 34).  Something that works in one location might not work in another.  Therefore, evidence-based practice includes both the scientific studies of effectiveness found on the WWC and the integration of professional wisdom based on "locally collected performance data that indicates whether changes are occurring in the desired direction when a particular program or practice is implemented" (p. 34).  In June 2010, the WWC expanded its acceptable research standards to include nonexperimental single-case design and regression-discontinuity studies, documentation of which can be found at the WWC website.

Thus, to take advantage of study results while exercising their professional wisdom, educators need to read and analyze professional journals and documents from research organizations that report on those studies.  According to Simpson, LaCava, and Graner (2004), educators should be able to clearly identify the essential elements of such studies.

"Unfortunately, "[e]vidence-based practices are not being widely adopted in classrooms," according to David Andrews (2012, p. 5) of Johns Hopkins University School of Education. There's much to consider.  "When deciding whether to adopt an evidence-based approach, educators should weigh the costs and benefits, and be prepared to implement it with fidelity. ... Fidelity means implementing the approach exactly as it was developed and tested" (p. 4), which is not an easy task.  Andrews provided some key questions to consider when conducting a cost-benefit analysis, as costs are both direct and indirect.  Sustainability of the innovation is also an issue, as enthusiasm for the innovation must be maintained.

 

HOT NEWS: The Institute of Education Sciences announced on July 11, 2007 that the U.S. Department of Education awarded Mathematica Policy Research, Inc. a five-year contract to take over management of the What Works Clearinghouse (Viadero, 2007).  The WWC got off to a rocky start in identifying high-quality research that met its standards, but "the Web site now lists 74 reviews of research on reading instruction, dropout prevention, teaching English-language learners, and other topics" (para. 5).  Mark Dynarski, a senior fellow and associate director of research at Mathematica, will direct the WWC.  It is hoped that this change will make the WWC more responsive and relevant to the needs of educators, as there is a call for "expanding the range of research designs that qualify as sound evidence, and introducing practical guides and other products that educators and decision makers in the trenches might see as useful" (Mission Shift section).

Even better news--

In August 2013, the Institute of Education Sciences and the National Science Foundation released their joint report: Common Guidelines for Education Research and Development. The document describes "shared understandings of the roles of various types or 'genres' of research in generating evidence about strategies and interventions for increasing student learning. ... [and] describes the agencies' expectations for the purpose of each type of research, the empirical and/or theoretical justifications for different types of studies, types of project outcomes, and quality of evidence" (p. 7).  Six types of research are examined:

Foundational research "provides the fundamental knowledge that may contribute to improved learning and other relevant education outcomes."

Early-stage or exploratory research "examines relationships among important constructs in education and learning to establish logical connections that may form the basis for future interventions or strategies to improve education outcomes. These connections are usually correlational rather than causal."

Design and development research "develops solutions to achieve a goal related to education or learning, such as improving student engagement or mastery of a set of skills." Research projects may include pilot tests.

Efficacy research "allows for testing of a strategy or intervention under "ideal" circumstances, including with a higher level of support or developer involvement than would be the case under normal circumstances."

Effectiveness research "examines effectiveness of a strategy or intervention under circumstances that would typically prevail in the target context."

Scale-up research "examines effectiveness in a wide range of populations, contexts, and circumstances, without substantial developer involvement in implementation or evaluation." (pp. 8-9)

 


 

Efforts to Determine Product Effectiveness

 

Interested readers will find WWC reports on math programs, along with other materials under review, at http://ies.ed.gov/ncee/wwc/reports/default.aspx.

On February 13, 2004, the U.S. Department of Education announced a study on technology's role in raising achievement that would be of interest to mathematics educators.  The study would use a random-assignment design to evaluate 16 computer-based reading and math products over three years to determine the effectiveness of technology in bolstering student achievement.  Mathematica Policy Research, Inc. and SRI International would assess the effectiveness of learning technology in teaching reading in grade 1, reading comprehension in grade 4, pre-algebra in grade 6, and algebra in grade 9.  Pre-algebra products included in the study would be SuccessMaker from Pearson Digital Learning, SmartMath from CompuTaught, Inc., Achieve Now from PLATO Learning, Inc., and Larson Pre-Algebra from Meridian Creative Group.  Algebra products included Cognitive Tutor from Carnegie Learning, Inc., Algebra from PLATO Learning, Inc., and Larson Algebra from Meridian Creative Group.  The study actually was conducted on 15 products.

HOT:  On April 4, 2007, the U.S. Department of Education released its report for Congress, Effectiveness of Reading and Mathematics Software Products: Findings from the First Student Cohort. It received immediate reaction from leaders around the country concerned about the effectiveness of technology in education and the results of this study. A key finding noted by Mathematica Policy Research, Inc. indicated, "Test scores were not significantly higher in classrooms using the reading and mathematics software products than those in control classrooms. In each of the four groups of products - reading in first grade and in fourth grade, mathematics in sixth grade, and high school algebra - the evaluation found no significant differences in student achievement between the classrooms that used the technology products and classrooms that did not." Read the full report: http://ies.ed.gov/ncee/pubs/20074005/

Readers are cautioned against making quick decisions about technology effectiveness based on the results of this study, as products were not implemented as intended, nor used extensively enough to make a difference in achievement.

 

Readers might also be interested in the K-12 Evidence Based Math Approved Listing in the Curriculum section of the Idaho Department of Education website.  This is the 2008 Math Adoption Guide resulting from Idaho's efforts to identify the scientifically based research behind publishers' materials in its textbook adoption efforts.  The list is effective through December 2014.

The Idaho committee of teachers rated curricular materials using an Evidence-Based Research Rubric: Continuum of Evidence of Effectiveness with three levels: most rigorous, somewhat rigorous, and marginal.  The rubric addresses theory/research foundation, evaluation-based evidence of effectiveness, implementation, and replicability.  Waterford Early Math and Science (K-1, version 4.2) and Connected Mathematics 2 (6-8, copyright 2009) fall into the "Most Rigorous" category.  See the listing for other results.

 


 

Reading and Conducting Research

 

Those who are not familiar with reading and conducting research studies will appreciate a few tips and documents on this topic. Commentary on and additional resources for conducting Action Research are also presented.

Reading Research Studies

A Policymaker's Primer on Education Research: How to Understand It, Evaluate It, and Use It (February 2004) is by Patricia Lauer at Mid-continent Research for Education and Learning.  The Primer is intended to help readers understand what education research says, whether it's trustworthy, and what it means for policy.  Readers will also learn some of the technical statistical and scientific concepts touched upon in research reports and gain a deeper understanding of education research methodology.  Practical tools are included, such as a flowchart for analyzing research and a tutorial on understanding statistics.

How to Read a Research Study Article, by D. Blewett of the College of DuPage in Illinois, provides advice on how to read a research study article.  A research article generally is structured with an Abstract, Introduction, Methods, Results, Discussion, and References.  Each section has specific content, which he summarizes.  It will take more than one reading to fully understand the bulk of the research, but the general approach is to read the abstract, the first paragraph or so of the introduction, and the hypothesis, then skip to the discussion to find how the study turned out.  Go back to the methods section, focusing on how the hypotheses were tested, then the results, and re-read the discussion.  Finally, read the entire report again from first page to last for greater understanding.

Identifying and Implementing Educational Practices Supported by Rigorous Evidence: A User Friendly Guide, an 18-page guide from The U.S. Department of Education, will help educators determine whether an educational intervention is supported by rigorous evidence.  It contains a three-step evaluation process, a checklist to use in the process, definitions of research terms, and what to look for in research studies.

Making Sense of Research for Improving Education, an issue brief (April 2003) from the Charles A. Dana Center at the University of Texas at Austin, will help practitioners clarify their understanding of what scientifically based research is and learn more about research designs used in education research: experimental, quasi-experimental, correlational, and case study.  When using research to make decisions about teaching and learning, practitioners should consider the research's relevance, generalizability to their particular circumstances, statistical soundness, and preponderance of evidence.

What Works Clearinghouse Glossary is intended to help users to understand the terms used in education research so that they can be more effective consumers of such research.

Statistics in Research

The following online texts are for anyone who really wants to understand the statistics involved in research:

ANOVA Visually by T. Malloy is a tool "meant to direct your attention to relationships among the components of ANOVA (Analysis of Variance) by representing them visually."  It is designed to help you understand the F test.

HyperStat Online Statistics Textbook by D. Lane also contains links to free statistical analysis tools and instructional demos.

Introductory Statistics: Concepts, Models, and Applications by D. W. Stockburger.  This Web Edition also includes many examples showing how to use SPSS/WIN 7.0 to do statistical procedures.

Introductory Statistics is a free text by OpenStax College, operating out of Rice University.  This one-semester course is mainly for students who are not majoring in math or engineering.  "It focuses on the interpretation of statistical results, especially in real world settings."  It's available at http://cnx.org

Statistics Every Writer Should Know by Robert Niles.  Learn about mean, median, percent, per capita, standard deviation, margin of error, data analysis, and more.  Link to sites for data sources and interactive help to select the right statistical test.

StatSoft, Inc. (2004). Electronic Statistics Textbook. Tulsa, OK: StatSoft.  According to StatSoft developers, "The Electronic Textbook begins with an overview of relevant elementary concepts and continues with a more in depth exploration of specific areas of statistics, organized by "modules," accessible by buttons, representing classes of analytic techniques. A glossary of statistical terms and a list of references for further study are included."

SticiGui by P. B. Stark is an online introductory statistics text.  According to Stark, materials "include interactive data analysis and demonstrations, machine-graded online assignments and exams (a different version for every student), and a text with dynamic examples and exercises, applets illustrating key concepts, and an extensive glossary."

Statnotes: Topics in Multivariate Analysis by G. David Garson is an online text.
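
For readers who want to try a few of these ideas hands-on, here is a minimal sketch in Python (a language choice of convenience; none of the texts above supply this code, and the group scores are made-up data).  It computes the mean, median, standard deviation, and an approximate 95% margin of error for a set of scores, then runs the kind of one-way ANOVA F test that ANOVA Visually illustrates, using the widely available SciPy library.

  # Minimal sketch: descriptive statistics and a one-way ANOVA F test.
  # The group scores are hypothetical sample data, not from any study.
  import math
  import statistics
  from scipy import stats  # assumes SciPy is installed

  group_a = [72, 85, 78, 90, 66, 81]
  group_b = [68, 74, 70, 79, 73, 77]
  group_c = [88, 92, 85, 94, 90, 87]

  scores = group_a + group_b + group_c
  mean = statistics.mean(scores)
  median = statistics.median(scores)
  sd = statistics.stdev(scores)                 # sample standard deviation
  margin = 1.96 * sd / math.sqrt(len(scores))   # approximate 95% margin of error for the mean

  print(f"mean={mean:.1f}, median={median}, sd={sd:.1f}, margin={margin:.1f}")

  # One-way ANOVA: does at least one group mean differ from the others?
  f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
  print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # a small p-value suggests the group means differ

A small p-value here would suggest that at least one group's mean differs from the others; consult the texts above for the assumptions (normality, equal variances) behind the test.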

Conducting Research Studies

In general, research studies involve qualitative, quantitative, or mixed methodologies.  Research is conducted to solve problems or contribute to their solutions.  It involves identifying and justifying an issue you wish to investigate, which is framed as a problem statement.  To better understand the problem, you gather and align resources (e.g., people, policies, data, what others have done in regard to the problem) and information to study the issue (a review of published literature on the problem).  A plan (methodology) is formed to conduct the research, and then it is carried out.  The data are analyzed to develop a solution.  Recommendations are made for creating an action plan to implement ideas and results of the research.  Finally, results and recommendations are released, perhaps via a presentation or published document.

The American Evaluation Association identifies online resources of interest to anyone doing evaluations.  Its collection includes online multi-chapter handbooks and texts on various methodologies, software links for qualitative data analysis and for developing and administering surveys, and more.

Brief Guide to Questionnaire Development by Robert Frary (2000, Office of Measurement and Research Service, Virginia Polytechnic Institute) is useful for collecting factual information and opinions in survey research.

Electronic Resources for Research Methods by T. D. Wilson, Ph.D., Professor Emeritus, University of Sheffield, UK, is comprehensive.

International Journal of Qualitative Methods "(ISSN 1609-4069) is a peer reviewed journal published quarterly as a web-based journal by the International Institute for Qualitative Methodology at the University of Alberta, Canada, and its international affiliates. [Its] goals are to advance the development of qualitative methods, and to disseminate methodological knowledge to the broadest possible community of academics, students, and professionals who undertake qualitative research" (Home page description). The journal is free of charge.

Multidisciplinary methods in educational technology research and development (2007), by Justus Randolph, is an open-access book published by HAMK Press in Hämeenlinna, Finland.

Qualitative Research Web Ring is a database of resources on all aspects of conducting qualitative research.

Research 101 from the University of Washington addresses how to conduct research, including how to distinguish between scholarly/popular communications and primary/secondary sources, which are often problems for learners.

Research Methods Knowledge Base is a fully hyperlinked online text by Dr. William M.K. Trochim of Cornell University.  "It covers the entire research process including: formulating research questions; sampling (probability and nonprobability); measurement (surveys, scaling, qualitative, unobtrusive); research design (experimental and quasi-experimental); data analysis; and, writing the research paper.  It also addresses the major theoretical and philosophical underpinnings of research including: the idea of validity in research; reliability of measures; and ethics."

Research Randomizer, part of the Social Psychology Network, is a free "quick way to generate random numbers or assign participants to experimental conditions. Research Randomizer can be used in a wide variety of situations, including psychology experiments, medical trials, and survey research."  This is a great tool for random sampling and random assignment, and the site includes tutorials.
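
To make the idea concrete, here is a minimal sketch of random assignment in Python (illustrative only; this is not Research Randomizer's own code, and the roster and condition names are hypothetical).  Shuffling the roster and then dealing participants into conditions gives every participant an equal chance of landing in each group.

  # Minimal sketch of random assignment to experimental conditions.
  import random

  participants = [f"Student {i}" for i in range(1, 31)]  # hypothetical roster of 30 students
  conditions = ["Treatment", "Control"]

  random.seed(42)               # fixed seed so the assignment can be reproduced
  random.shuffle(participants)  # put the roster in random order

  # Deal the shuffled roster alternately into the conditions
  groups = {name: participants[i::len(conditions)] for i, name in enumerate(conditions)}

  for name, members in groups.items():
      print(name, len(members), members)

For a simple random sample rather than assignment, random.sample(participants, k) draws k participants without replacement.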

The Software & Information Industry Association (SIIA), the principal association representing the software and digital content industries, is conscious of contributing research on product effectiveness. The SIIA released a report on May 13, 2010, "Conducting and Reporting Product Evaluation Research: Guidelines and Considerations for Educational Technology Publishers and Developers," which provides 22 standards of research best practices for publishers and developers of educational software and other instructional technologies.  The guidelines do not endorse use of a particular research methodology, but do provide a perspective to consider if you are planning a research study.  They can also benefit school leaders who select and implement technology-based products and services, according to SIIA's media release.  Read these guidelines at http://siia.net/presentations/education/SIIA_EvaluationGuidelines_EdTechProduct.pdf

Survey and Questionnaire Design is a free tutorial of over 20 pages from StatPac Inc. on the complete process of conducting a survey study and designing a questionnaire. Highly recommended.

Tools for Preparing Literature Reviews from George Washington University is intended primarily for master's and doctoral learners.  There are tutorials for conducting literature searches efficiently, assessing whether or not findings in reports can be relied upon, and integrating various studies on a topic.

Survey Tools

The following are free or low-cost tools for conducting online surveys:

NVivo from QSR International is "software that supports qualitative and mixed methods research.  It lets you collect, organize, and analyze content from interviews, focus group discussions, surveys, and ... social media data, YouTube videos, and web pages" (Product information section).  A variety of pricing options is available, and there is a free trial.

 


 

Action Research

Action research "refers to a disciplined inquiry done by a teacher with the intent that the research will inform and change his or her practices in the future. This research is carried out within the context of the teacher’s environment—that is, with the students and at the school in which the teacher works—on questions that deal with educational matters at hand" (Ferrance, 2000, p. 7).  While it is often done by individual teachers, the research might also involve collaborative groups of teachers, or be school-wide or district-wide.  Any number of concerns might be addressed.  However, the overriding goal is to make things better and to improve the quality of educational outcomes.  For example, "[t]he teacher may be seeking solutions to problems of classroom management, instructional strategies, use of materials, or student learning" (p. 9).  Collectively, "a school may have a concern about the lack of parental involvement in activities, and is looking for a way to reach more parents to involve them in meaningful ways" or staff might wish "to examine their state test scores to identify areas that need improvement, and then determine a plan of action to improve student performance" (p. 10).

According to John Tillotson (2000), a reason why teachers lack interest in educational research is that many of the topics chosen for study seldom have direct implications for what happens in the classroom.  "One doesn't have to look far in most public schools today to find outdated teaching practices and assessment strategies, in spite of ample research findings that suggest more effective alternatives" (p. 31, par. 3).  He suggested that an expansion of action research at the K-12 level is a promising solution to the dilemma of research failing to inform practice.

Perhaps one of the reasons why action research might not be conducted is that it "has often been stereotyped as the poor second cousin to 'real' research, which is distinguished by quantitative methods and experimental designs" (Reeves, 2010, p. 5).  Yet, Douglas Reeves (2010) stated, "Experimental research and action research are not in opposition to one another; the latter applies the former to the real world of the classroom" (p. 5).

Action research, in this author's view, would satisfy a mandate for educators to employ research-based instructional materials and methodologies that get results.  It fits the exercise of using a district's professional wisdom based on "locally collected performance data that indicates whether changes are occurring in the desired direction when a particular program or practice is implemented" (Mageau, 2004, p. 34).  However, action research serves to solve a local problem (e.g., How can we make an effective change that would be beneficial here?) or provides a framework for exploration, and would not necessarily be appropriate if a researcher ultimately wants to generalize the results from a question under investigation to other settings.  One also needs to keep in mind that research questions, goals, or purposes can evolve during action research.

Although action research is "usually qualitative and participatory" (Dick, 1997, para. 5), Bob Dick (1998a) acknowledges that "action research is not synonymous with 'qualitative' either" (para. 7).  Essentially, the action research cycle involves problem formulation, data collection, data analysis, reporting results, and action planning (Tillotson, 2000).  A study has many cycles, and the theories that emerge are intended to guide actions, such as: "In situation S, to produce outcomes O1, O2, ..., try actions A1, A2, ..."  Further, "[i]n specifying the important features of the situation it also allows its generalizability to be tested in other similar settings" (Dick, 1998b, para. 7).

Sometimes those who conduct action research have not employed its methodology carefully enough for the researcher's plan to be replicated by others.  Poor planning might lead to disappointing results.  While action research "is not a substitute for quantitative research," it is a "contextual lens for other research."  As the teacher is the researcher, limitations of action research include teacher bias and a "good deal of opinion and rhetoric" (Reeves, 2010, p. 75).  Educators who employ the method might consider Reeves' model of action research, which was tested with 81 teams of teachers and administrators.  There are four elements (pp. 80-81):

  1. There should be a research question that links professional practice to student results.  (e.g., Reeves (2010, p. 80) suggested, "How will using interactive journals influence the writing performance of second-language students in a 7th grade math class?")
  2. The student population should be described (e.g., grade level, special characteristics, demographics and educational factors of participating students).
  3. Student achievement data should be collected (e.g., year-end tests, formative assessments, classroom observations) to allow for systematic observations of changes in achievement.  Effectiveness is enhanced if there are several measurements throughout the project; a simple sketch of tracking such changes appears after this list.
  4. The research plan should also contain specific professional practices to be observed (e.g., feedback intervals compared to number of questions posed on tests for a particular instructional strategy).  Reeves (2010) stated, "Ideally, the action researchers will create a scoring rubric with a range of performance over three or four different levels so that an objective observation can be made about the extent to which a particular practice was applied in the classroom" (p. 81).
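
As a concrete illustration of working with locally collected performance data (item 3 above), here is a minimal Python sketch (the scores are hypothetical and this is not part of Reeves' model) that checks whether student scores moved in the desired direction and runs a paired t-test on pre/post results using SciPy.

  # Minimal sketch: are locally collected scores moving in the desired direction?
  from statistics import mean
  from scipy import stats  # assumes SciPy is installed

  # Hypothetical scores for the same ten students before and after an intervention
  pre  = [61, 55, 70, 64, 58, 73, 66, 60, 68, 62]
  post = [68, 59, 75, 70, 63, 78, 71, 66, 72, 65]

  gains = [b - a for a, b in zip(pre, post)]
  print(f"mean gain = {mean(gains):.1f} points")

  # Paired t-test: is the average gain unlikely to be due to chance alone?
  t_stat, p_value = stats.ttest_rel(post, pre)
  print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

Repeating such checks at several points during the project, as item 3 suggests, gives a clearer picture than a single pre/post comparison.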

Many resources on the web provide more information on the action research process, along with examples from completed studies.

 


You don't need perfect research to support a proposed change!

Douglas Reeves (2006) reminded us, "There are hardly any true randomly assigned groups in educational research, largely due to ethical constraints" (p. 97).  "The quality model that prevails throughout successful organizations is not waiting for perfection but rather 'Try it, test it, improve it'" (p. 98).

Don't forget to teach your students how to conduct research!

ReadWriteThink.org from the National Council of Teachers of English has a series of six lessons that illustrate what research looks like in the elementary classroom. Look at grades 3-5 lessons on Research Building Blocks, which take students through the process of finding sources, exploring information in those sources, gathering details, and citing the sources that they use.

 


 

Finding Education Research

 

Searching the Web with key phrases will yield some education research, and there are scholarly search engines and academic databases you can use; CT4ME lists these on its technology integration page for Building Internet, Search and Citation Skills.  In addition to the resources below, journals are a good source of education research.  See the list of Journals at CT4ME.

Best Evidence Encyclopedia: http://www.bestevidence.org/ "The Best Evidence Encyclopedia is a free web site created by the Johns Hopkins University School of Education's Center for Data-Driven Reform in Education (CDDRE) under funding from the Institute of Education Sciences, U.S. Department of Education. It is intended to give educators and researchers fair and useful information about the strength of the evidence supporting a variety of programs available for students in grades K-12. The Best Evidence Encyclopedia provides summaries of scientific reviews produced by many authors and organizations, as well as links to the full texts of each review" (About the Best Evidence Encyclopedia section).  You will find reviews on K-12 mathematics, reading (including programs for struggling readers and English language learners), and comprehensive school reform.  You can also subscribe to their magazine Better, which is dedicated to providing its readers with evidence-based education so they can "use what works."  Mathematics educators will be particularly interested in mathematics programs that have shown evidence of helping elementary, middle, and high school learners succeed, and in the effectiveness of technology applications for enhancing mathematics achievement.

Center for Improving Learning of Fractions: https://sites.google.com/a/udel.edu/fractions/ administered at University of Delaware, focuses on improving math instruction for elementary and middle school children who have problems with math concepts, specifically fractions.  Research has been made possible via a grant from the Institute of Education Sciences.

Center for the Study of Teaching and Policy: http://depts.washington.edu/ctpmail/ at the University of Washington "investigates efforts to improve the quality of teaching and learning, the teacher workforce, and the systems of support for teachers’ work, in various contexts and at multiple levels of the K-12 educational system."

Center on Instruction: http://www.centeroninstruction.org/resources.cfm?category=math&subcategory=&grade_start=&grade_end  "The Center on Instruction offers materials and resources on mathematics to build educators’ knowledge of instruction for students with low achievement in mathematics, improve professional development models for math teachers, and build teachers’ skills in monitoring student growth toward important math outcomes."

Directory of Open Access Journals: http://www.doaj.org/ DOAJ aims to be comprehensive (i.e., all subjects and languages) and covers free, full-text, quality-controlled scientific and scholarly journals.  Quality control means that the journals "must exercise peer-review or editorial quality control to be included." This is a huge plus for researchers.

Education and Information Technology Library: http://www.editlib.org/ EdITLib, sponsored by the Association for the Advancement of Computing in Education, is a digital library of "peer-reviewed and published international journal articles and conference papers on the latest research, developments, and applications related to all aspects of Educational Technology and E-Learning."

Education Policy Analysis Archives: http://epaa.asu.edu/ojs/ is a peer-reviewed online journal of education research.

Education Resources Information Center (ERIC): http://eric.ed.gov/ sponsored by the U.S. Department of Education, Institute of Education Sciences.  ERIC's search can be restricted to peer-reviewed only or full text articles.  Search by descriptors such as mathematics instruction, mathematics achievement, mathematics education, academic achievement, teaching methods, program effectiveness, and more.  You can search by source, author, publication date, publication type, education level, and audience. There is a Thesaurus that has multiple subcategories and related mathematical terms.

Educational Research Newsletter: http://www.ernweb.com/public/main.cfm "Since 1988, Educational Research Newsletter has informed educators of recent research on reading, math, behavior management and raising student achievement with brief reports on the most useful and relevant findings from leading journals and organizations."

Institute of Education Sciences: http://ies.ed.gov/ is the primary research arm of the United States Department of Education. IES brings rigorous and relevant research, evaluation and statistics to our nation's education system.

International Association for the Evaluation of Educational Achievement (IEA): http://www.iea.nl/ is an independent, international cooperative of national research institutions and governmental research agencies. IEA has conducted more than 23 research studies of cross-national achievement since 1958. Examples include the Trends in International Mathematics and Science Study (TIMSS: 1995, 1999, 2003, 2007), the Progress in International Reading Literacy Study (2001, 2006), the TIMSS-R Video Study of Classroom Practices, studies of information technology in education (SITES-M1, SITES-M2, SITES 2006), and the Teacher Education and Development Study in Mathematics (TEDS-M), initiated in 2005. Publications are available, as well as current studies (e.g., TIMSS Advanced 2008 and PIRLS 2011).

International TIMSS and PIRLS Study Center: http://timss.bc.edu/, located at Boston College, is IEA's principal site for reporting on the Trends in International Mathematics and Science studies and the Progress in International Reading Literacy studies.

K-12 Computing Blueprint Research Watch: http://www.k12blueprint.com/research-watch contains summaries that examine the research.  This is a good starting point for locating reports on the full studies in K-12 on topics such as one-to-one initiatives, e-learning, mobile devices, and technology effectiveness.

MERLOT: http://www.merlot.org/merlot/index.htm Find peer-reviewed online teaching and learning materials in numerous categories.  Education and mathematics/statistics are among them.

National Assessment of Educational Progress (NAEP), the Nation's Report Card: http://nces.ed.gov/nationsreportcard/ "is the only nationally representative and continuing assessment of what America's students know and can do in various subject areas. Assessments are conducted periodically in mathematics, reading, science, writing, the arts, civics, economics, geography, U.S. history and beginning in 2014, in Technology and Engineering Literacy" (About section).  Representative samples of students from grades 4, 8, and 12 are selected for main assessments, but not all grades are assessed each time.

National Center for Education Statistics (NCES): http://nces.ed.gov/ is the primary federal agency for collecting and analyzing data related to education. Publications and annual reports are available.

National Council of Teachers of Mathematics: Research, News & Advocacy: http://www.nctm.org/news/default.aspx?id=160 Read summaries, clips, and briefs connecting math education research to the classroom.  You'll also find a list of books on research.

Promising Practices Network http://www.promisingpractices.net/default.asp For example, see Team Accelerated Instruction [TAI]: Math, a program rated as "promising."  TAI is a component of the MathWings program from the Success for All Foundation.  Probable implementers are elementary schools.  Other math programs (e.g., Everyday Mathematics, I CAN Learn Pre-Algebra and Algebra, Saxon Middle School Math, The Expert Mathematician, University of Chicago School Mathematics Project Algebra)  not yet fully evaluated by PPN, but highly valued by other credible organizations, are listed among "Screened Programs": http://www.promisingpractices.net/programs_topic_list.asp?topicid=21

Queuenews.com Education News and Research Reports: http://www.queuenews.com/ Select your state to read the latest news in education research for your state and the nation.

Rand Education: http://www.rand.org/education.html, a division of the Rand Corporation, includes numerous research areas with in-depth content available, which you can access via a drop-down menu at the top of the home page.  For example, the section on Education and the Arts includes narrower topics such as academic achievement; education curriculum, policy, reform, and legislation; and educational technology.  If you expand the list to "Explore All Topics," you'll find mathematics (under M), teachers and teaching (under T), and much more.  You'll find abstracts, many full journal articles, policy reports, research briefs, news releases, and more.

Society for Research on Educational Effectiveness: http://www.sree.org/ advances and disseminates "research on the causal effects of education interventions, practices, programs, and policies."  Three publications are supported.

U.S. Department of Education: http://www.ed.gov/  There is a section for research.  For a list of National Education Research and Development Centers, see: http://ies.ed.gov/ncer/RandD/ 

What Works Clearinghouse: http://ies.ed.gov/ncee/wwc/ Easily find what works among the many topics addressed and within publications and reviews.

 

Are you interested in the "No Significant Difference Phenomenon"?

Readers might be interested in The No Significant Difference website, which is a companion to Thomas L. Russell's book, The No Significant Difference Phenomenon: A Comparative Research Annotated Bibliography on Technology for Distance Education (2001, IDECC, fifth edition).  The book is a research bibliography documenting no significant difference in student outcomes based on the mode of education delivery (face to face or at a distance).  Note: by distance, we mean instruction delivered via media such as radio, television, video, online and other technologies, which have been used historically and in current studies.  In addition to studies that document no significant difference, the website includes studies which do document significant differences in student outcomes based on the mode of education delivery. Studies are listed from 1928 to the present with a search engine to find those of particular interest.

Are you interested in the current and future federal role in education research?  If so, the following document is just what you need.

The Aspen Institute. (2013, December 3). Leveraging learning: The evolving role of federal policy in education research. Washington, DC: Author. Retrieved from
http://www.aspeninstitute.org/leveraginglearning This document contains a "series of essays, infographics, and briefs that outline the current federal landscape of education R&D and envision its future." It begins with an essay titled A Brief History of Federal Efforts to Improve Education Research.  The section on the Current Federal Role in Education Research identifies research and development centers throughout the U.S. and also provides an overview of investing in innovation.  Highlights within the section on the Future of the Federal Role in Education Research include Why We Need a DARPA for Education (ARPA-ED) and New Directions in Education Research: The Federal Role.

 


 

References

Andrews, D. (2012, Winter). In search of feasible fidelity. Better: Evidence-based Education.  Retrieved from http://www.bestevidence.org/word/better_pages_4_5.pdf

Dick, B. (1998a). Rigour (1).  Occasional pieces in action research methodology, # 13.  Retrieved from http://www.scu.edu.au/schools/gcm/ar/arm/op000.html

Dick, B. (1998b). Grounded theory (2).  Occasional pieces in action research methodology, # 17.  Retrieved from http://www.scu.edu.au/schools/gcm/ar/arm/op017.html

Dick, B. (1997).  What is "action research"? Occasional pieces in action research methodology, # 2. Retrieved from  http://www.scu.edu.au/schools/gcm/ar/arm/op002.html

Ferrance, E. (2000). Action research.  Northeast and Islands Regional Educational Laboratory at Brown University: Themes in Education series.  Retrieved from http://www.brown.edu/academics/education-alliance/sites/brown.edu.academics.education-alliance/files/publications/act_research.pdf

Institute of Education Sciences & National Science Foundation. (2013, August). Common guidelines for education research and development. Washington, DC: Authors.  Retrieved from http://ies.ed.gov/pdf/CommonGuidelines.pdf  

Laitsch, D. (2003, August). Into the mix: Policy, practice, and research. ASCD InfoBrief, Issue 34. Available in Archived Issues: http://www.ascd.org/publications/newsletters/policy-priorities/aug03/num34/toc.aspx  

Mageau, T. (2004, January). Determining 'What Works'. T.H.E. Journal, 31(6), 32-27.

Mathematica Policy Research, Inc. (2007, April 5). New report released on congressionally mandated evaluation of 15 educational technology products. [Press release].  Available in press release archive: http://www.mathematica-mpr.com/newsroom/releases/archive_pr.asp

North Central Regional Educational Laboratory. (2004). NCREL quick key 7: Understanding the No Child Left Behind Act of 2001: Scientifically based research. Naperville, IL: Learning Point Associates. Retrieved from http://www2.learningpt.org/catalog/category.asp?SessionID=825996520&ID=13

Reeves, D. (2010). Transforming professional development into student results. Alexandria, VA: ASCD.

Reeves, D. (2006). The learning leader: How to focus school improvement for better results. Alexandria, VA: ASCD.

Simpson, R. L., LaCava, P. G., & Graner, P. S. (2004). The No Child Left Behind Act: Challenges and implications for educators. Intervention in School and Clinic, 40(2), 67-75.

Tillotson, J. W. (2000). Studying the game: Action research in science education. The Clearing House, 74(1), 31-34.

Viadero, D. (2007, July 12). U.S. gives What Works Clearinghouse to new contractor. Education Week, 26(43). Retrieved from http://www.edweek.org/ew/articles/2007/07/18/43whatworks.h26.html?tmp=541546481

 


Research in Education Corner: Page 1 | 2

Go to Related Topic: State and National Education Standards and The Best Rated Standards Resources