Cite any reference to the article below as:

Deubel, P. (2000). Selecting valuable software for the standards movement. Computing Technology for Math Excellence. https://www.ct4me.net/valuable_software.htm 


Selecting Valuable Software for the Standards Movement

Patricia Deubel, Ph.D.

November 30, 2000

Abstract:

According to Dr. Laurence Peters of the U.S. Department of Education, we should be assisting the entire nation's teachers to identify and use high-quality materials on behalf of standards-based reform. Research indicates, however, that the burden of software selection often falls on teachers, many of whom have neither the expertise nor the time for it. This article presents twenty guidelines for judging the instructional and technical merit of software for the K-12 standards movement. The guidelines should help technology coordinators, curriculum directors, administrators, and teachers select valuable software that can lead to achievement gains for students.

Contents:

I. Why are software selection guidelines needed?

II. Guidelines

A. Instructional Merit
B. Technical Merit

III. Concluding Remarks
IV. References
V. Contributor


Why are software selection guidelines needed?

In a 1999 national survey by Education Week, only 12% of teachers reported that their state or district provided lists of software titles matching curriculum standards. The pressure to satisfy curriculum requirements, particularly in states with specific academic standards and high-stakes tests, adds to the difficulty of finding appropriate digital content. Unfortunately, many teachers do not know where to turn to find out which digital content is aligned with their curricula, nor do they have the time or expertise to make that determination themselves (Fatemi, 1999). The situation seems unacceptable in an age when technology use and national standards testing are at the forefront of education.

According to Dr. Laurence Peters (2000) of the U.S. Department of Education, we should be assisting the entire nation's teachers to identify and use high quality materials on behalf of standards-based reform. A systemic approach to technology selection helps save time and effort in ensuring quality control, and frees teachers to spend more time teaching.

Recently, I conducted a survey of teachers from 13 Ohio urban school districts that examined the use and effectiveness of software in helping students pass a standardized test required for high school graduation. Software quality was a significant factor affecting teachers' decisions to use technology in their instruction (Deubel, 2000). Weaknesses in the software those teachers used led me to develop twenty guidelines for judging the instructional and technical merit of software. The guidelines presented in this article should help K-12 technology coordinators, curriculum directors, administrators, and teachers to, in Dr. Peters' words, harness the power of educational technology to advance the standards movement.


Guidelines

Instructional Merit

  1. Has the software been correlated with national and state standards, and with learning objectives for the proficiency test that students will take?

A product that has been correlated to learning objectives reduces the time teachers must spend identifying and selecting valuable software. Software companies (e.g., The Learning Company; Lindy Enterprises, Inc.; Riverdeep Interactive Learning) often place information about correlation to state and national standards in product descriptions and make it available to customers upon request.

  2. Is the software available above Version 1.0?

New versions often have not been fully debugged and should be avoided (Abramson, 1998).

  3. Does the software have stated learning objectives that are adhered to?

One way to determine this is to check that reward systems are tied to learning events. For example, Soloway and Norris (1998) criticized Math Blaster (1997) because, as a reward for success, students get to play a shoot-'em-up game that has nothing to do with what they just learned. Ideally, valuable software addresses objectives that help students master basic skills and foster higher-level thinking skills.

  4. Is the software motivating to students?

Both teachers and students should preview software. Some teachers in my study found software too difficult and unsuitable for the ability levels of special-needs students. Students quickly became bored and said that they did not understand the software.

Middle and high school students want software that is educational, entertaining, and fast (Tammen & Brock, 1997). One middle school administrator I interviewed was concerned about his school's educational software failing to satisfy students' Game Boy expectations. Consequently, he believed the school was not having much success with software as a tool for proficiency intervention.

  5. Does the software allow for individualized instruction?

Individualized instruction supports the two most important conditions for active mental engagement: the intensity of motivation to learn and the quality of the instructional support for learning. Unlike standardized approaches to learning that hold time constant and allow achievement to vary, customized instructional processes that are possible with technology tools permit students to work on standards until they are met (Reigeluth, 1997). Look for extensive help features so that students can work independently with software.

  6. Does the software suggest paths to improve and have the ability to adjust automatically to student needs?

Students can develop independent and reflective thinking and learning skills if software incorporates scaffolding features. Look for guiding, coaching, and modeling messages such as "Stop reminding me" or "Show me an example." Software might elicit information from learners that requires them to think about tasks and complete form fill-ins to enter responses to questions about subtasks. It might contain scaffolding that works like training wheels on a bicycle: defaults enable novice learners to use only the simplest of the available tools, more advanced features are revealed as learners gain expertise, and learners control turning previously hidden advanced features on or off, with computer assistance in decision-making (Jackson, Krajcik, & Soloway, 1998). A minimal sketch of this idea follows.
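
To picture this "training wheels" scaffolding, here is a minimal sketch in Python. The ScaffoldedToolbar class and the tool names are invented for illustration and are not from any product discussed here; the point is only that defaults expose the simplest tools and the learner, not the program alone, reveals or hides the advanced ones.

    # Hypothetical "training wheels" scaffolding: novices start with only
    # basic tools visible; learners choose when to reveal advanced features.
    class ScaffoldedToolbar:
        BASIC_TOOLS = ["pencil", "eraser"]
        ADVANCED_TOOLS = ["graphing", "equation_editor", "data_table"]

        def __init__(self):
            # Default configuration: the simplest tools only.
            self.enabled = set(self.BASIC_TOOLS)

        def visible_tools(self):
            return sorted(self.enabled)

        def reveal(self, tool):
            """Learner turns on a previously hidden advanced feature."""
            if tool not in self.ADVANCED_TOOLS:
                raise ValueError(f"{tool} is not an advanced feature")
            self.enabled.add(tool)

        def hide(self, tool):
            """Learner may also turn the feature back off."""
            self.enabled.discard(tool)

    toolbar = ScaffoldedToolbar()
    print(toolbar.visible_tools())   # ['eraser', 'pencil']
    toolbar.reveal("graphing")       # learner gains expertise
    print(toolbar.visible_tools())   # ['eraser', 'graphing', 'pencil']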

  7. Does the software have more than one entry level? More than one level of difficulty?

This feature permits students to work on only those content modules they need, at their own skill level, whether they are practicing basic skills or developing critical thinking at an applications level.

  8. Does the software provide clear examples of the skills it is designed to develop?

Check how the software helps learners build conceptual understanding of problem-solving processes. Is there a balance between drill and practice, computation, and factual recall on the one hand and open-ended problem solving that explores higher-level concepts on the other? Applications also might include screens that summarize the major aspects of a topic before moving on to a new one. This is important because proficiency tests often contain a balance of multiple-choice and constructed-response questions that test not only facts and basic knowledge, but also the application of knowledge to problem solving.

  9. Does the software provide some repetition to assist in retention?

Multimedia can provide two ways for learners to rehearse information. For simple rote repetition, text is accompanied by a voice-over repeating the text to be learned. Information can be rehearsed more elaborately if learners can enter alphanumeric responses to exercises that require them to apply knowledge in an appropriate context (Vilamil-Casanova & Molina, 1996).

Practice exercises should be placed after the presentation of a subject to reinforce learning by transferring information from working memory to long-term memory. Sometimes a gap between a question and its related content will force learners to mentally search for and review information, a process that enhances retention (Thibodeau, 1997).

  10. Can students change answers before the software grades assessments?

Consider students who normally skip questions on a test, answer those they know first, and then return to those they left out or wish to rethink. Technology testing practices should mirror paper-and-pencil practices. Look for software that not only grades assessments but also allows students to review the questions they missed, provides explanations for those questions, and offers additional practice on the concepts that were missed, as sketched below.
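
As a minimal sketch of this testing behavior, the Python fragment below lets a student record, skip, and change answers freely, grades only on submission, and then reports missed questions with explanations. The Quiz class and the sample questions are invented for illustration.

    # Hypothetical quiz shell: answers may be changed any time before
    # grading; missed questions come back with explanations for review.
    class Quiz:
        def __init__(self, questions):
            # questions: list of (prompt, correct_answer, explanation)
            self.questions = questions
            self.answers = {}                  # question index -> answer

        def answer(self, index, response):
            """Record or change an answer any time before submission."""
            self.answers[index] = response

        def submit(self):
            """Grade once, then report missed items with explanations."""
            missed = []
            for i, (prompt, correct, why) in enumerate(self.questions):
                if self.answers.get(i) != correct:
                    missed.append((prompt, correct, why))
            score = len(self.questions) - len(missed)
            return score, missed

    quiz = Quiz([("7 x 8 = ?", "56", "Recall the times table for 7."),
                 ("Area of a 3 by 4 rectangle?", "12", "Area = length x width.")])
    quiz.answer(0, "54")
    quiz.answer(0, "56")             # the student rethinks and changes it
    score, missed = quiz.submit()    # question 1 returns for review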


Technical Merit

  1. Is there a teacher management system that permits teachers to individualize instruction and manage student data?

One teacher abandoned software use because the software could not individualize instruction. She stated that all of her students were on different levels and needed individual assignments. Ideally, a management system would enable content to be selected in three different ways. In student mode, learners select their own lessons. In teacher mode, teachers specify an exact sequence of lessons. Computer mode uses diagnostic tests to determine strengths and weaknesses, and then creates customized, prescriptive lesson assignments for students. SkillsBank4 (1998), for example, offers these features. (A sketch of these modes appears after the next paragraph.)

Examine the teacher guides or tutorials that accompany the software. Look for explanations of management tasks that teachers typically would perform. Tasks might include using passwords, setting up or removing classes, adding or deleting individuals from classes, transferring data between student disks and teacher workstations, and preparing data reports for one student or an entire class. Does a manual, or the software itself, help first-time users install the software easily?
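
The three selection modes lend themselves to a simple dispatch. The sketch below is a hypothetical illustration, not SkillsBank4's actual design; the diagnostic rule is invented (assign every lesson whose pretest score falls below a mastery threshold).

    # Hypothetical content selection in student, teacher, or computer mode.
    MASTERY = 0.8   # invented mastery threshold for the diagnostic rule

    def assign_lessons(mode, lessons, teacher_sequence=None,
                       student_choice=None, diagnostic_scores=None):
        if mode == "student":
            return [student_choice]            # learner picks a lesson
        if mode == "teacher":
            return list(teacher_sequence)      # exact teacher-set order
        if mode == "computer":
            # Prescriptive: diagnosed weaknesses become the lesson plan.
            return [l for l in lessons
                    if diagnostic_scores.get(l, 0.0) < MASTERY]
        raise ValueError(f"unknown mode: {mode}")

    lessons = ["fractions", "decimals", "percents"]
    plan = assign_lessons("computer", lessons,
                          diagnostic_scores={"fractions": 0.9,
                                             "decimals": 0.5,
                                             "percents": 0.7})
    # plan == ["decimals", "percents"]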

  2. Can data be saved, so that students who do not finish a lesson do not have to begin again?

Many classes in public schools meet for only 40 minutes per day. Avoid software with single assessments that cannot be completed during class time and that does not allow the data to be saved. It is frustrating for students and teachers, and a waste of class time, if students must repeat what they have already done.

Some programs automatically save data to student disks (e.g., SkillsBank4, 1998). Students should see a clear message when data has been saved successfully; this gives them a sense of relief and signals that they may move on to other actions. For long delays in saving data, look for a message showing the percent saved so far (Shneiderman, 1998). The sketch below illustrates the idea.
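
A minimal sketch of resumable lesson data with a save-progress message follows. The file format, field names, and ten-chunk progress reporting are all invented for illustration.

    # Hypothetical resumable session: save in chunks with a percent
    # message, confirm clearly when done, and reload on the next visit.
    import json

    def save_session(state, path):
        data = json.dumps(state).encode("utf-8")
        chunk = max(1, len(data) // 10)
        with open(path, "wb") as f:
            for start in range(0, len(data), chunk):
                f.write(data[start:start + chunk])
                done = min(start + chunk, len(data))
                print(f"Saving... {100 * done // len(data)}% complete")
        print("Your work has been saved. You may continue.")  # clear signal

    def resume_session(path):
        """Reload state so the student does not start the lesson over."""
        with open(path, "rb") as f:
            return json.loads(f.read().decode("utf-8"))

    state = {"student": "A. Learner", "lesson": "fractions",
             "last_completed_item": 14}
    save_session(state, "student_disk.json")
    assert resume_session("student_disk.json")["last_completed_item"] == 14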

  3. Is there an extensive database of problems, so that upon repeated use of the software, students encounter a different set of problems?

Avoid software that comes with a set of only 40 questions, for example; it is of little value once everyone has used it. Determine whether the software offers different sets of problems for both practice examples and assessment questions, as sketched below.
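
As a sketch of the idea, the fragment below draws each session's problems at random from a large bank and prefers items the student has not yet seen. The bank contents and field names are placeholders.

    # Hypothetical problem bank: repeat users get a fresh problem set.
    import random

    def draw_problem_set(bank, n, seen_ids):
        """Prefer problems the student has not seen in earlier sessions."""
        unseen = [p for p in bank if p["id"] not in seen_ids]
        pool = unseen if len(unseen) >= n else bank   # reuse if exhausted
        chosen = random.sample(pool, n)
        seen_ids.update(p["id"] for p in chosen)
        return chosen

    bank = [{"id": i, "prompt": f"Problem {i}"} for i in range(500)]
    seen = set()
    first_session = draw_problem_set(bank, 20, seen)
    second_session = draw_problem_set(bank, 20, seen)   # a different set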

  4. Do problems make reference to real-life applications?

To increase transfer, knowledge should be anchored in realistic contexts and settings. For example, The Adventures of Jasper Woodbury, developed at Vanderbilt University, has been used by mathematics students to solve problems that the character Jasper encounters as he ventures up Cedar Creek to buy a new boat. Students using Jasper improved in math concepts, word problems of various levels of complexity, and planning problems, and showed less math anxiety (Litchfield, 1993).

  5. Does the software accommodate more than one solution method?

Only 5% of surveyed teachers who used software in their instruction indicated that their software accommodated multiple approaches to solutions (Deubel, 2000). Teachers should judge the merit of this option based on whether their students are novice or advanced learners, however. According to Tergan (1997), with multiple representations there is a high probability that at least one of them will be misunderstood, which could hamper an overall understanding of the material, particularly for novice students. Only advanced learners with a high level of domain knowledge and metacognitive competence may benefit from multiple representations.

  6. Is feedback tutorial in nature, or does it merely indicate whether responses are right or wrong?

Feedback should provide occasional motivational messages as well as information about the correctness and appropriateness of a response. It should appear on the same screen as the question and the student's response to reduce the memory load on students, should provide hints and ask students to try again when answers are incorrect, and should be tailored to the response. Feedback should not encourage students to answer incorrectly just to see the feedback (Orr, Golas, & Yao, 1994). Users should not be trapped in a failure cycle, however: after two attempts, the program should provide the correct response and indicate why the answer was wrong. Rewards for a correct response, such as words of praise, award ribbons, or animation, should be appropriate for the activity (Abramson, 1998). The sketch below illustrates such a feedback policy.
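
A minimal sketch of such a policy follows: a tailored hint and a retry after the first wrong answer, then the correct response with an explanation after the second, so no one is trapped in a failure cycle. All strings and the helper's name are invented.

    # Hypothetical feedback policy: hint and retry once, then explain.
    def tutorial_feedback(correct, explanation, hint, get_response):
        for attempt in (1, 2):
            response = get_response()
            if response == correct:
                return "Correct - well done!"   # praise fits the activity
            if attempt == 1:
                print(f"Not quite. Hint: {hint} Please try again.")
        # Break the failure cycle: give the answer and the reason.
        return f"The correct answer is {correct}. {explanation}"

    responses = iter(["32", "48"])              # two incorrect attempts
    message = tutorial_feedback(
        correct="56",
        explanation="7 groups of 8 make 56.",
        hint="Think of 7 x 8 as (7 x 4) doubled.",
        get_response=lambda: next(responses))
    # message explains the answer instead of just marking it wrong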

  7. Are navigation icons well designed?

Visuals and icons should be culturally sensitive, particularly if the product is to be used in divergent cultural contexts (McFarland, 1995). Each icon should be clearly distinguishable from the next and chosen to represent its accompanying text. Icons should stand out from their background. Consider the size of icons: young students, or those with hand-eye coordination problems, may not be good mousers and may require icons about one inch in size.

Note the placement of navigation elements. Consistently placed navigation elements make a program easier to use, add structure, and give learners control over events. Typical icons include elements to quit the program, access the next or previous screen, obtain help, use a glossary, or go to the main menu. Kenworthy (1993) suggested that terms like return, exit, load, enter, or cancel might confuse some students. For example, if designers use the term return, poor readers may not know where they have come from in the program.

Graphic icons or still photos can be used to illustrate menu choices. Text at menu choices might be highlighted to indicate that a learning module has been selected or completed. Beware of the color choices used for highlighted sections of a program: blue often signifies a hyperlink, and selected text often changes color. Students and teachers may become confused by color used inconsistently.

  8. Does the software contain multimedia features?

While the debate about the effectiveness of multimedia continues, interactive multimedia cannot guarantee learning any more than a school library can. Multimedia appeals to different learning styles and becomes the vehicle through which students control learning events and monitor their progress.

As the novelty of using multimedia wears off, it becomes more important for software to contain the motivation elements of the ARCS (attention, relevance, confidence, and satisfaction) model to maintain student interest. For example, the relevance of instruction may need to appear in the software as specific statements of the use of a skill or knowledge. Informing students of goals and objectives and giving students frequent and early opportunities for success can build confidence within the multimedia program. Embedded questions, scoring, self-checks, and practice questions are good methods for increasing confidence (Litchfield, 1993).

Graphics should be age appropriate. For example, middle school students do not want to deal with elementary-school images or icons. Students notice the use of real people as opposed to cartoon characters and are critical of font size, use of color, buttons that do not work, annoying sound, and users' guides that do not answer their questions (Tammen & Brock, 1997).

The amount of information presented on a screen depends on the age and grade level of learners. Illustrations should match the intended audience's cognitive perspective because some illustrations might mean different things to different audiences. Text and visuals should complement each other, offering different yet related information to promote learning (McFarland, 1995).

  9. Are help and audio features under user control?

One software developer I interviewed omitted sound from the software because of the distractions it might cause some students, and wondered whether sound serves as reinforcement for all students. The point is that help and audio should be under learner control, with the ability to toggle them on and off. Audio should be linked to the learning activities, not just provide an unrelated musical background for the sake of having sound. In a classroom setting, it may be necessary to purchase headphones.

Many students who do poorly on proficiency tests are the same ones who read poorly. Kenworthy (1993) noted that poor readers benefit from multiple media because they often get their information from television, so a mix of moving video, audio, and high-quality graphics may grab their attention in ways that traditional approaches to instruction would not. Audio can explain menu choices, which can be highlighted as they are explained, and can be interrupted when learners are ready to make a selection. Audio that supports text should match the text exactly so that learners may identify unfamiliar words. Learners should be able to pause or repeat audio, as well as repeat text passages. The sketch below illustrates these controls.
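
The sketch below illustrates learner-controlled help and audio with toggling and replay. The NarrationPlayer class is invented, and the playback calls are stubs standing in for a real audio API.

    # Hypothetical learner-controlled narration: audio and help toggle
    # on/off, audio matches on-screen text, and passages can be replayed.
    class NarrationPlayer:
        def __init__(self):
            self.audio_on = True
            self.help_on = True
            self.last_clip = None

        def play(self, clip, matching_text):
            """Narrate only if enabled; the text is always shown."""
            self.last_clip = (clip, matching_text)
            if self.audio_on:
                print(f"[audio: {clip}] {matching_text}")   # stub playback
            print(matching_text)

        def repeat(self):
            """Replay a passage so learners can catch unfamiliar words."""
            if self.last_clip:
                self.play(*self.last_clip)

        def show_help(self, message):
            """Context help appears only when the learner wants it."""
            if self.help_on:
                print(f"[help] {message}")

    player = NarrationPlayer()
    player.play("menu_intro.wav", "Choose a lesson from the menu.")
    player.audio_on = False      # learner toggles the sound off
    player.repeat()              # the text repeats silently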

  10. Does the software have a security system, so that student errors or intentional attempts to disrupt its operation cannot do so?

Teachers want software security: only 28% of the software users in my study indicated that student errors or intentional attempts at disruption could not disrupt their software. Check that security is available to prevent student access to teacher-only information, including student data. A sketch of a simple role check follows.
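
As a final sketch, a simple role check can keep student data behind teacher-only access. The GradeBook class, roles, and fields are invented for illustration; real products would pair this with passwords and a management system.

    # Hypothetical role-based access: teachers see all records,
    # students see only their own, and anything else is refused.
    class GradeBook:
        def __init__(self):
            self._records = {}                 # student name -> scores

        def record_score(self, student, score):
            self._records.setdefault(student, []).append(score)

        def view(self, role, requesting_student=None):
            if role == "teacher":
                return dict(self._records)     # full class data
            if role == "student":
                return {requesting_student:
                        self._records.get(requesting_student, [])}
            raise PermissionError("unknown role")

    book = GradeBook()
    book.record_score("pat", 88)
    book.view("teacher")                              # all records
    book.view("student", requesting_student="pat")    # own scores only
    try:
        book.view("guest")
    except PermissionError:
        pass                                          # access denied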


Concluding Remarks

Software matched to the purpose of basic-skills achievement, availability of computers, teacher training and involvement in implementation decisions, positive student and teacher attitudes toward computers, and time spent using software all contribute to achievement gains. Drill-and-practice software can make a difference in achievement and may lead to even greater gains when combined with newer technologies that focus on constructivist, higher-order thinking applications (Mann, Shakeshaft, Becker, & Kottkamp, 1999).

In reference to proficiency improvement, Riel (NECC, 1999) said that the lower the initial scores, the more effective technology is in raising test scores. Rarely does the introduction of information and communication technology into the classroom have the effect of decreasing test scores. Advanced technologies like multimedia lessons must be an integral part of a course, however, to achieve maximum impact on students. The one day per week compromise between not requiring computers and requiring them all the time does not work (Usiskin, 1993), a conclusion supported by my study.

Teachers clearly pointed out the need for drill-and-practice software for students who were failing the proficiency test because they lacked such skills. Unfortunately, as Hirsch (1999) noted, drill and practice has a negative connotation as a tool for teaching skills and runs contrary to the discovery-learning and project-based movements. The method should not be slighted as low level, however, because drill and practice is just as essential to complex intellectual performance as it is to the virtuoso violinist or the athlete on the playing field.

Valuable software that can lead to achievement gains when used regularly to individualize instruction has a price tag. Too often, schools use only the software that comes bundled with computer purchases. That situation is not acceptable, because software is a key component if technology is really to impact the education of all children. The right software could genuinely afford them the opportunity to engage deeply and substantively in ideas and collaborations (Soloway, 1998). These guidelines should help teachers identify the right software for the standards movement.


References

Abramson, G. W. (1998). How to evaluate educational software. Principal, 78(1), 60-61.

Deubel, P. (2000). Mathematics software and achievement on the Ohio Ninth Grade Proficiency Test (Doctoral dissertation, Nova Southeastern University, 2000). Dissertation Abstracts International, Publication Number 9981161.

Fatemi, E. (1999, September 23). Building the digital curriculum: Summary. Available in Education Week Technology Counts Archive: https://www.edweek.org/ew/articles/1999/09/23/building-the-digital-curriculum.html  Last accessed March 18, 2020. [Note URL update of this article.]

Hirsch, E. D., Jr. (1999). The schools we need and why we don't have them. New York: Doubleday. ISBN: 0-385-49524-2.

Jackson, S., Krajcik, J., & Soloway, E. (1998). The design of guided learner-adaptable scaffolding in interactive learning environments. http://www.umich.edu/~hiceweb/papers/misc/design_of_guided/index_site.htm   [Note: URL updated July 28, 2011 since publication of original article].

Kenworthy, N. (1993). When Johnny can't read: Multimedia design strategies to accommodate poor readers. Journal of Instruction Delivery Systems, 7(1), 27-30.

Litchfield, B. (1993). Design factors in multimedia environments: Research findings and implications for instructional design. Paper presented at the Annual Meeting of the American Educational Research Association, Atlanta, GA, April 12-16. (ERIC Document Reproduction Service No. ED363268).

Mann, D., Shakeshaft, C., Becker, J., & Kottkamp, R. (1999). West Virginia Story: Achievement gains from a statewide comprehensive instructional technology program. Santa Monica, CA: Milken Family Foundation and Charleston, WV: West Virginia State Department of Education. ERIC Document Reproduction Service No. ED429575. https://eric.ed.gov/?id=ED429575    [URL updated August 30, 2019.]

Math Blaster [Computer software]. (1997). Knowledge Adventure, 4100 West 190th Street, Torrance, CA 90504. http://www.knowledgeadventure.com  Last accessed January 26, 2009.

McFarland, R. D. (1995). Ten design points for the human interface to instructional multimedia. THE Journal, 22(7), 67-69.

NECC '99 spotlight on the future: Keynote interviews. (1999). THE Journal, 26(11), 58-64.

Orr, K. L., Golas, K. D., & Yao, K. (1994, Winter). Storyboard development for interactive multimedia training. Journal of Interactive Instruction Development, 18-29.

Peters, L. (2000). A third millennial challenge: Harness the power of educational technology to advance the standards movement. THE Journal, 28(2), 95-102.

Reigeluth, C. (1997, November). Educational standards: To standardize or to customize learning? Phi Delta Kappan, 79, 202-206.

Shneiderman, B. (1998). Designing the user interface: Strategies for effective human-computer interaction (3rd ed.). Addison-Wesley. ISBN: 0-201-69497-2.

SkillsBank (Version 4) [Computer software]. (1998). Achievement Technologies, Inc. School Division: 10400 Little Patuxent Parkway, Suite 310, Columbia, MD 21044. http://www.achievementtech.com/  [Address and URL updated January 26, 2009.]

Soloway, E. (1998). No one is making money in educational software. Association for Computing Machinery, Communications of the ACM, 41(2), 11-15.

Soloway, E., & Norris, C. (1998). Using technology to address old problems in new ways. Association for Computing Machinery, Communications of the ACM, 41(8), 11-18.

Tammen, J., & Brock, L. (1997). CD-ROM multimedia: What do kids really like? Multimedia Schools, 4(3), 54-59.

Tergan, S. (1997). Multiple views, contexts, and symbol systems in learning with hypertext/hypermedia: A critical review of research. Educational Technology, 37(4), 5-18.

Thibodeau, P. (1997). Design standards for visual elements and interactivity for courseware. THE Journal, 24(7), 84-86.

Usiskin, Z. (1993). Lessons from the Chicago mathematics project. Educational Leadership, 50(8), 14-18.

Vilamil-Casanova, J., & Molina, L. (1996). An interactive guide to multimedia (pp. 124-129). Que Education and Training. ISBN: 1-57576-066-5.


Contributor:

Patricia Deubel (pdeubel@ct4me.net) earned a Ph.D. in Computing Technology in Education from Nova Southeastern University in Fort Lauderdale, Florida. She has over 25 years of experience in mathematics and computer education teaching, teacher training, staff development, and curriculum development, and has presented computer workshops at the state and local levels. She has been an adjunct professor in the School of Computer and Information Sciences at Nova Southeastern University. She also has taught mathematics at The Ohio State University at Mansfield, Ohio. Other recent articles appear in the Ohio Journal of School Mathematics, the Journal of Instruction Delivery Systems, and HyperNexus: Journal of Hypermedia and Multimedia Studies.

 
