

CONTACT:
Ben Weintraub (212-675-8567)

FOR IMMEDIATE RELEASE

Large Increase In Student Achievement With Merit Software, Study Says
New research shows unusually large effect on reading and language arts test scores.

A newly released study concludes that Merit's reading and writing intervention increases achievement among students randomly assigned to use the software. The Merit intervention was found to have an unusually large effect on students, of a magnitude rarely found in other educational research studies.

The study, which was conducted with 151 sixth and seventh graders at the Calhoun Middle School in Mount Zion, West Virginia, showed that students performed better on the state's standardized test when Merit was used in conjunction with regular classroom instruction for 24 weeks. Year-end test scores for Reading/Language Arts averaged 30 points higher than test scores for students in the control group.

According to the researchers, 37% more Merit sixth graders and 19% more Merit seventh graders passed the state's reading/language arts test when Merit was used as a supplement to in-class instruction. An effect size (Cohen's d) of .94 was calculated for sixth graders' test scores.

Effect size is a statistical measure of the strength of the relationship between two variables. In this case, the two variables are intervention use and student gains. Effect sizes are a common denominator enabling the comparison of different studies in a single area of research.

There are different ways to calculate an effect size, but Cohen's d is the standard measure for educational researchers. Cohen defined effect sizes as "small, d = .2," "medium, d = .5," and "large, d = .8." The Cohen's d effect size found for Merit sixth graders is of a magnitude that has rarely been reported.
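
As an illustration of how such a figure is computed, the sketch below implements the standard Cohen's d formula: the difference between the two group means divided by their pooled standard deviation. The score arrays are hypothetical placeholders for demonstration, not data from the study, and the example assumes Python with NumPy installed.

    import numpy as np

    def cohens_d(treatment, control):
        # Cohen's d: difference in group means divided by the pooled standard deviation.
        t = np.asarray(treatment, dtype=float)
        c = np.asarray(control, dtype=float)
        nt, nc = len(t), len(c)
        pooled_var = ((nt - 1) * t.var(ddof=1) + (nc - 1) * c.var(ddof=1)) / (nt + nc - 2)
        return (t.mean() - c.mean()) / pooled_var ** 0.5

    # Hypothetical test scores, for illustration only (not the study's data).
    merit_scores = [690, 702, 715, 688, 699, 707, 694, 710]
    control_scores = [668, 675, 660, 672, 665, 670, 662, 669]
    print(f"Cohen's d = {cohens_d(merit_scores, control_scores):.2f}")

A value of .94, like the one reported for the sixth graders, falls above Cohen's threshold for a "large" effect.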

Despite the No Child Left Behind legislation's call for scientifically based evidence, few products have been demonstrated to work through rigorous research -- that is, research conducted by credible third parties employing best practices such as randomized control groups, reasonable sample sizes and adequate time of use.

Merit began commissioning independent evaluations of its products in 2003. The company asked faculty at the Marshall University Graduate College in South Charleston, West Virginia, to conduct several scientifically based studies on the impact of its educational software in West Virginia schools.

Prior to this study, four evaluations had been conducted. The earlier studies examined the impact of Merit reading, writing and math software on students in grades 3 through 8. These studies had some shortcomings, including the lack of random assignment of pupils and a short implementation time frame; the longest evaluation lasted 9 weeks, less than a typical school-year semester. The researchers, however, were able to make valuable observations. The studies showed that using Merit improved student achievement and raised standardized test scores. They also indicated that lengthier use of the software might show statistically significant gains for lower-quartile students.

The purpose of the present study was to document results obtained with a more rigorous design. This study offered the opportunity for an analysis with larger sample sizes, random assignment of students and pairings based on previous levels of achievement. It also gave researchers the opportunity to evaluate the use of Merit over an extended period of 24 weeks.

Based on the analysis of the researchers:

- Year-end scores for Merit Reading/Language Arts averaged 30 points higher on West Virginia's state standardized test (Westest) than scores for students in the control group.

- Thirty-seven percent more Merit sixth graders, and nineteen percent more seventh graders, achieved Mastery level or higher on the state's performance rubric than did control group students.

- Test scores were also significantly higher for Merit students among Title I and female participants, with an 18 to 22 point advantage.

- An effect size of .94 was calculated for sixth graders' test scores, and .70 for seventh graders.

Merit provides individualized, context-sensitive help throughout its software. Help is available to students in many forms while they use the software, whenever they want it and as often as they need it.

Using Merit, teachers are interrupted less frequently and have more class time to teach prepared lessons. In addition, the built-in tracking features help teachers discover just when they need to provide additional assistance to individual students.

To learn more about the research conducted on Merit Software visit: http://research.meritsoftware.com


ABOUT MERIT SOFTWARE:

Since 1983, Merit Software (www.meritsoftware.com) has been producing educational software that addresses the core competencies that students require to succeed. The emphasis is on strengthening students' ability to read, analyze data and communicate their ideas.

Merit Software's programs are known throughout the industry for providing context-sensitive tutorials for students and convenient record management features to aid teachers, tutors and parents.

Visit www.meritsoftware.com to examine the company's library of educational applications currently being used in thousands of educational facilities worldwide.

Merit's educational software has been the subject of rigorous, independent, scientifically based research. These studies have concluded that Merit's software provides an effective supplement to everyday instruction. Research findings are available at www.meritsoftware.com/research.

###



