DO CLICKERS OPEN MINDS? USE OF A QUESTIONING STRATEGY IN DEVELOPMENTAL MATHEMATICS

by

Nancy A. Moreau

BARBARA LEWIS, Ph.D., Faculty Mentor and Chair

BRUCE FRANCIS, Ph.D., Committee Member

SUZANNE DUNN, Ph.D., Committee Member

Barbara Butts Williams, Ph.D., Dean, School of Education

A Dissertation Presented in Partial Fulfillment of the Requirements for the Degree Doctor of Philosophy

Capella University

December 2009

© Nancy Moreau, 2009

Abstract

The purpose of this research was to determine whether there were differences in academic performance between students who participated in a traditional mathematics class and students in a class that incorporated an explicit questioning strategy with audience polling devices, or “clickers.” The study used a quantitative quasi-experimental design to test the significance of differences in pre- and posttest performance between the two groups over a college semester in 2009. The participants were 113 students (n = 113) enrolled in a Pre-Algebra course at the research site who agreed to take part in the study. A total of 58 participants were assigned to the experimental group, which received instruction incorporating an explicit questioning strategy with clickers along with traditional lecture methods; the other 55 students were assigned to the control group, which received traditional lecture instruction. Both courses were taught by experienced professors with master’s-level qualifications, and the two groups were academically equivalent at the start of the study. After the study, the data indicated statistically significant differences (p < .05) in academic performance between students taught with an explicit questioning strategy using clickers and those who participated in lectures without clickers. Overall, the experimental group scored higher on the posttest than the control group, and weak students in the experimental group improved more, as measured by the posttest, than students of similar ability in the control group. The statistical analyses indicated no significant differences in average academic performance within either group when delineated by ethnicity, gender, or part-time/full-time status.

Dedication

This dissertation is dedicated to a number of important people in my life. First, I dedicate it to my children, David and Deborah, and my grandchildren, Logan, Lauren, Maria and Patrick. I hope this work will inspire them to value the education, commitment, and hard work necessary to pursue their own quest for lifelong learning. Second, to my husband, who has always remained at my side, supporting my journey in every way possible. And last, to the loving memory of my parents, Rose and Cyril Harvilchuck, who always believed in me.


Acknowledgments

The journey has been exciting and rewarding. I want to thank my mentor, Dr. Barbara Lewis, for guiding me in the right direction and allowing me to take control of my own destiny. I could not have completed the process without the valuable input of both committee members, Dr. Suzanne Dunn and Dr. Bruce Francis of Capella University. I especially want to thank Dr. Nan Thornton, who was always there to provide extra support and encouragement as my committee evolved during the early days of the dissertation. On a more personal note, I would first like to thank my husband, Wayne, for being there for me and picking up my slack with the household when I was trying to meet deadlines. I also want to thank my children, Deborah and David, for their continued support and kind words throughout this entire process. I also want to thank the participants in my study, both the instructors and the students. Lastly, I want to thank my department, the dean, and my associates at Northampton Community College, who offered support and encouragement throughout the process. I especially want to thank Dr. Jill Hirt, Director of Institutional Research at Northampton Community College, for not only supplying some of the demographic data I needed but also volunteering to read and offer suggestions on my statistical analysis. Thank you, each and every one.


Table of Contents

Acknowledgments  iv

List of Tables  viii

List of Figures  ix

CHAPTER 1. INTRODUCTION  1
Introduction to the Problem  1
Background of the Study  2
Statement of the Problem  6
Purpose of the Study  7
Rationale  7
Research Question  9
Significance of the Study  9
Definitions of Terms  11
Assumptions and Limitations  12
Nature of the Study  13
Organization of the Remainder of the Study  14

CHAPTER 2. LITERATURE REVIEW  15
Introduction  15
Current Clicker Research  16
Active Engagement Research  20
Research on Formative Assessment  22
Research on Explicit Teaching  27
Background Information on Clickers  32
Integrating the Research into Instructional Design  35
Summary  42

CHAPTER 3. METHODOLOGY  44
Purpose of the Study  44
Research Question  45
Research Design  45
General Format of the Class  49
Tests  52
Quizzes  53
Data Collection  55
Data Analysis  55
Threats to Validity  56
Ethical Considerations  59
Summary  60

CHAPTER 4. DATA COLLECTION AND ANALYSIS  62
Introduction  62
Demographics  63
Descriptive Statistics  64
Inferential Statistics and Hypothesis Testing  69
Impact of Weak Students  80
Summary  82

CHAPTER 5. RESULTS, CONCLUSIONS, AND RECOMMENDATIONS  84
Restatement of Problem  84
Review of Literature  85
Review of Methodology  87
Research Question  88
Relating the Findings to the Research  92
Limitations  95
Recommendations for Further and Future Research  96
Conclusion  98

REFERENCES  100

APPENDIX A. COURSE OUTLINE SYLLABUS  108

APPENDIX B. LESSON PLANS  112

APPENDIX C. PRE/POST TEST  135

List of Tables

Table 1. Verification of Content Validity  58
Table 2. Demographics of Study Participants and Enrolled Students  63
Table 3. Pretest/Posttest Summary of Data  64
Table 4. Descriptive Statistics for Quizzes  66
Table 5. Descriptive Statistics for Posttest (PT) Quizzes  68
Table 6. Repeated One-Way ANOVA Table for Individual Posttest Groups  71
Table 7. Independent Samples t Test Between Groups for Each of the Quizzes  72
Table 8. Independent Samples t Test for Posttest Unit Quizzes  73
Table 9. Tests of Normality  75
Table 10. Mann-Whitney U Test Results  77
Table 11. Kruskal-Wallis Test Results  77
Table 12. Comparing Group, Gender, Ethnic and Full-time/Part-time Status  79
Table 13. Tests of Within-Subjects Effects Using the Measure: Groups  80
Table 14. Percent Passing the Final Compared to Pretest Grades  81

List of Figures

Figure 1. Turning Point RF clicker and receiver  35
Figure 2. An instructional design pyramid  36
Figure 3. A comparison of pretest and posttest mean scores  65
Figure 4. Mean quiz scores for eight semester quizzes. For the experimental group, n = 58; for the control group, n = 55  67
Figure 5. Mean posttest scores for eight units. For the experimental group, n = 58; for the control group, n = 55  69
Figure 6. Percentage of students passing posttest versus pretest score  82
Figure 7. Comparison of semester quizzes and posttest unit quizzes for control group  90
Figure 8. Comparison of semester quizzes and posttest unit quizzes for experimental group  91

CHAPTER 1. INTRODUCTION

Introduction to the Problem

During most of the twentieth century, the United States possessed peerless mathematical power (U.S. Department of Education, 2008). Information from the National Mathematics Advisory Panel’s final report in March 2008 paints a bleak picture of mathematics education in the United States in the twenty-first century. According to the report, the United States faces accelerating retirements in the science and engineering workforce even as job opportunities in those fields grow. These trends put pressure on the United States to sustain a workforce of adequate scale and quality. For years, the country has imported great numbers of talented individuals from other countries, but the success of global economies fostered by the Internet is keeping that talent on foreign soil. The competitive consequence of losing independence and leadership in mathematics, natural sciences, and engineering is twofold. First, it reduces the nation’s ability to compete in a global environment. Second, it threatens the foundations of national security. The report recommends that national policy ensure the healthy development of a domestic technical workforce with top-level skills. Even the individual citizen is affected, since mathematical aptitude expands career choices and college options and increases future earnings. A sound education in mathematics across the population is vital to our national interest and survival (U.S. Department of Education, 2008).

Background of the Study

Data from recent college research indicate that more than 90% of students who take the math placement tests at the community college require at least one level of developmental math (Achieving the Dream Proposal, 2007). Developmental mathematics is a noncredit college course designed to teach or refresh the skills needed to succeed in a college mathematics course. Students who are required to enroll in developmental mathematics courses are encouraged to take them during their first enrolled term at the college. Currently, most traditional college math courses are taught in the lecture format. Traditional mathematics instruction is based on a teacher-centered approach in which demonstration is followed by practice (Goldsmith & Mark, 1999). In a math class, a professor provides direct instruction through lecture and example, and the students are then given exercises that require them to repeat the task learned in class (Jackson & Neel, 2006). This traditional approach is not effective for many students, as indicated by the high level of failure in college math courses. The failure rate for developmental mathematics courses in community colleges across the country averaged 48% for the fall 2007 term (Achieving the Dream Data, 2008). The failure rate, as defined by the Achieving the Dream (AtD) initiative, is the ratio of the number of students who received a C- or lower in the course, or withdrew from the course with a W, to the original number of enrolled students.
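Expressed as a formula, the AtD definition above is:

    failure rate = (students earning a C- or lower + students withdrawing with a W) / (students originally enrolled)

For example, under this definition a section in which 6 of 25 enrolled students earned a C- or lower and 4 withdrew with a W would have a failure rate of (6 + 4) / 25 = 40%. (These numbers are hypothetical, for illustration only.)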

Although faculty, staff, and administration at community colleges are concerned about low student achievement, collectively they have not been able to achieve significant improvements in success rates. As part of the Achieving the Dream initiative, colleges across the country are making commitments to explore strategies that will improve student success (Achieving the Dream Data, 2008). It is critical that these students obtain the skills necessary to continue their math courses at the college level, since many programs require at least one college-level math course for a certificate or diploma.

Improving Student Learning

American colleges and universities have been continuously challenged to increase access to higher education, improve the quality of student learning, and reduce costs. In 2003, the National Center for Academic Transformation (NCAT, 2008) developed the Roadmap to Redesign (R2R) program, which provided a model for improving learning in higher education. Although the R2R project focused on large-enrollment introductory courses, its components were applicable to all students. The components of the program included an emphasis on the use of computer laboratories to encourage active learning while continuing the assessment of student progress (Thiel, Peterman & Brown, 2008). Twenty-five of the thirty colleges that partnered with NCAT on the R2R project showed a significant increase in student learning. Since computer laboratories are very expensive and impractical to install in every classroom, auxiliary techniques to promote student comprehension have evolved. One technique that is much less expensive is the audience polling system.

Learner-Centered Focus

Reigeluth and Carr-Chellman (2009) described a new paradigm of education focused on learning and on helping everyone reach their potential.

They discussed the shift from passive learning to active learning, and from teacher initiative, control, and responsibility to shared initiative, control, and responsibility. There are a number of methods for achieving a learner-centered focus. The method used in this study is a modification of the Cooperative Group Learning model presented by Molenda in Reigeluth (1999). Instead of putting the problem after the learning activities, this strategy places the problem at the center of the learning activity. Students can then tackle the problem as a group but respond to the question individually using audience polling devices, or “clickers.” Not only will the student get immediate feedback on the correctness of the answer selected, but the teacher will know if the concept needs more explanation. The student responses will then control the pace of the lesson, and the classroom will immediately become more learner-centered. A challenge for designers of undergraduate mathematics courses is to increase student engagement while still allowing the same amount of content to be covered (Harniss, Carnine, Silbert, & Dixon, 2002). The U.S. Department of Education (2008) has recommended regular explicit instruction for students who have difficulty with mathematics. For explicit instruction, the Department specifies that teachers provide an array of clear models for solving each problem type, that students receive extensive practice on newly learned skills, and that students be given extensive feedback on their progress. Since class time is very limited in a college mathematics course, it is impractical to assume that an instructor can implement extensive activities that require a significant amount of time without reducing the amount of material covered (Draper & Brown, 2004). Polling devices offer a possible solution.


Audience Polling Devices

“Clicker” is the generic name for a growing number of audience polling devices used by students to answer multiple-choice questions during class. The idea developed from classroom activities such as showing hands or cards. A disadvantage of showing hands or cards is that students’ responses are easily seen by the rest of the class; students who did not know the answer, or who were embarrassed by making a mistake, did not participate. To engage all students individually, classroom polling systems such as Class-Talk emerged in 1985. For a decade the original system had numerous technical problems, but continued development has eliminated most operational problems and reduced the overall cost of the systems. Currently, there are a number of commercially available systems, such as Turning Point® (Turning Technologies, 2008), Personal Response System (Interwrite, 2008) and E-instruction (E-instruction, 2008). A number of publishers have adopted one exclusive system and offer supplemental software as a teaching aid in support of their textbooks. Audience polling systems have been integrated into science, business, and some other areas of higher education (Penuel, Abrahamson & Roschelle, 2004).

Audience polling devices are used by students to answer multiple-choice questions in an increasing number of college classrooms. However, most of the use has been sporadic and limited to surveys, competitions, opinion polls, and nomenclature (Mayer, 2008). As this new technology is adopted, one challenge facing instructional designers is to create a strategy that uses the polling systems to improve achievement. The questioning technique used in the present study was modified from work originally done in the physics department of Ohio State University (Li, 2007). Unlike the original work, which was done with academically well-prepared physics students at a four-year institution, this study examined a strategy to address the needs of developmental mathematics students at a community college.

The majority of the research conducted on clickers has been qualitative, studying students’ engagement and attitudes toward using the clickers. There have been few empirical measurements of the student achievement associated with the clicker devices (Mayer, 2008; West, 2005). Most of the quantitative research on the use of clickers in the classroom has been done at large universities in college-level courses in science, business, and college algebra.

Statement of the Problem

Interactions among students and/or faculty and engagement with content are important components of the educational experience for all students (Halpin, 1990; Tinto, 1993, 2005). The research literature shows that isolation and a lack of interaction and engagement exist in traditional classes (Halpin, 1990; Tinto, 1993) in both four-year colleges and community colleges. Tinto (2005) stated that the lack of interaction in courses seemed to contribute to lower achievement in those courses. Studies show that clickers are effective in increasing both interaction and engagement in the classroom (Draper & Brown, 2004; Wood, 2004; Campbell, 2007; Li, 2007). However, the existing literature on the use of personal response systems in developmental math courses reveals little about the effects that engagement has on achievement (Li, 2007) and is anecdotal in nature (Roschelle, Penuel and Abrahamson, 2004). It is not known to what extent participation in an explicit questioning strategy using clickers affects student achievement.

This study investigated the effect of an explicit questioning strategy using clickers on student achievement. The study focused on pre-algebra courses at a branch campus of a Pennsylvania community college.

Purpose of the Study

This study determined whether an embedded strategy of explicit instruction and questioning using clickers resulted in higher mathematical achievement for developmental mathematics students in a specific community college environment. Achievement was measured by pre- and posttests and chapter quizzes.

Rationale

Mathematics literacy has been identified as a serious problem in the United States. By the twelfth grade, fewer than 23% of students in the United States are proficient in mathematics (U.S. Department of Education, 2008). Phillips (2007) reported that 78% of adults could not explain how to calculate interest on a loan, 71% could not calculate miles per gallon on a trip, and 58% could not calculate a 10% tip for a lunch bill. Upon entering college, prospective students are required to demonstrate competency through SAT scores, state exams, or college placement tests. Many of these prospective students are entering community colleges across the country as both traditional students (just graduated from high school) and nontraditional students (out of school a number of years). American colleges and universities have been continuously challenged to increase access to higher education, improve the quality of student learning, and reduce costs.

Although computer laboratories have been emphasized as a way to encourage student success at the university level, their cost and unsuitability for a large number of students have encouraged colleges to search for other strategies to accomplish the same outcome (Thiel, Peterman & Brown, 2008). Clickers offer the option of classroom interaction and engagement, and universities are rapidly adopting them to increase student achievement. In a review of the research, Roschelle, Penuel and Abrahamson (2004) stated that both student achievement and participation levels have been reported to improve when classroom response systems are implemented. However, Roschelle, Penuel, Crawford and Shechtman (2004) determined that these effects had not been tested with an appropriate experimental design. In their report on advancing research on the transformational potential of interactive pedagogies and classroom networks, they encouraged effectiveness and implementation research. Roschelle, Penuel, Crawford and Shechtman (2004) identified researchers and educators who have been studying audience participation devices. These individuals come from many different fields and do not share common methods, theories, or social networks. As a result, this community of users has not produced the effectiveness or implementation research that is currently needed. By comparing classrooms that utilize the explicit clicker questioning strategy with similar classrooms that do not, instructors can determine the effectiveness of the strategy in engaging students and increasing student achievement. This study replicated the questioning portion of the Li (2007) study using a different subject matter and a different population. The results of this study attempt to expand the body of knowledge on the effectiveness of a clicker questioning strategy with developmental mathematics students.


At the time of this study, there was no reference to this population or course level in the literature. While previous studies have indicated some benefits of using clicker technology in the classroom, there is little research on the specific use of clicker questioning strategies in developmental mathematics courses at any level (Arithmetic, Pre-Algebra, Introductory Algebra, and Intermediate Algebra). The current literature thus reveals a gap in knowledge, which this study addressed. The lack of quantitative results suggested that further study of questioning strategies using audience response devices was warranted.

Research Question

This study examined the effect of explicit instruction strategies using clickers in a developmental mathematics course. The primary research question was: To what extent does the use of clickers as an instructional strategy impact students’ level of achievement of pre-algebra skills?

Significance of the Study

Hiltz and Goldman (2004) indicated that students must be given additional opportunities to interact with the material in order to produce a deeper level of understanding than traditional lectures and rote practice assignments provide. Clickers may be able to provide students with that engagement by presenting a nonthreatening, anonymous opportunity to participate in the class discussion. Correctly answering a question builds a student’s self-esteem; incorrectly answering alerts the student and the teacher to a lack of understanding.

Since the plan of this study was to present all concepts twice, once in the lesson and once in the review, the student was instantly aware of his or her progress, or lack of progress, in a specific area. Students were encouraged to take responsibility for their learning and to ask questions in class or seek out additional resources such as tutoring. This emphasis on shared responsibility was one of the implications of the new instructional design paradigm suggested by Reigeluth (1999). The literature supports the positive effect of interaction and engagement in the classroom, but little research can be found about whether the specific intervention of an explicit questioning strategy using clickers leads to improved achievement in such courses. Fewer scholarly sources exist when the learner population is narrowed to community college students. This quasi-experimental research study focused on a developmental mathematics course to help fill this void in the literature. As with any emerging technology, teaching and learning strategies must be developed to implement the clicker questioning strategy in the classroom. According to the International Board of Standards for Training, Performance and Instruction (ibstpi, 2000), an important facet of the instructional designer’s job is to apply current research to the development of new instructional materials. The results of this study can inform the design of developmental math courses to improve student achievement.


Definitions of Terms

For the purpose of clarity and specificity, the following terms were used operationally in this study.

Achievement Gap. Refers to the observed disparity on a number of educational measures between the performance of groups of students, especially groups defined by gender, ethnicity, and socioeconomic status.

Clicker. Refers to an audience participation device with which students enter answers to questions; the aggregated class results are displayed to the group.

Developmental Courses. Refers to courses the student must take in order to bring skills up to the level needed to succeed in college-level courses.

“Easy-Hard-Hard.” Refers to a question sequence used at Ohio State University in which questions are presented in groups of three. The first question is an easy question designed to build the students’ confidence; it is followed by two more challenging questions.

Explicit Instruction. Refers to an educational approach used in teaching that combines specific design components and systematic instruction.

Non-equivalent group. A group whose assignment is not random. As a result, the groups may differ prior to the study.

Nontraditional student. Refers to a student who enters college, or returns to college, later in life. Students in this category are typically 25 to 70 years old.

Passing rate. The ratio of the number of students who obtain a C or better in the course to the number of students originally enrolled in the course.


Quasi-experimental. A design that uses existing groups rather than random assignment.

“Rapid-Fire.” Refers to a type of question sequence used at Ohio State University in which the instructor presents homework review questions quickly at the beginning of the class session.

Socioeconomic status. A family’s socioeconomic status is based on family income, parental education level, parental occupation, and social status in the community.

Traditional student. Refers to an 18- to 19-year-old student who has just completed high school.
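To make the two question sequences defined above concrete, the following is a minimal illustrative sketch in Python. The questions, answers, and structure are hypothetical; they are not drawn from the Ohio State materials or from this study’s lesson plans.

    # Hypothetical pre-algebra illustrations of the two sequences defined above.
    # An "easy-hard-hard" triple presents one concept at increasing difficulty;
    # a "rapid-fire" sequence quickly reviews several previously taught concepts.

    easy_hard_hard = [
        ("Simplify: 2 + 3 * 4", "14"),            # easy: builds confidence
        ("Simplify: (2 + 3) * 4 - 6 / 2", "17"),  # harder: parentheses, division
        ("Simplify: 2 - 3 * (4 - 6)**2", "-10"),  # hardest: signs and an exponent
    ]

    rapid_fire = [
        ("Round 3.456 to the nearest tenth", "3.5"),  # review: rounding
        ("Find 10% of 250", "25"),                    # review: percents
        ("Evaluate |-7|", "7"),                       # review: absolute value
    ]

    # Present each question, collect clicker responses, then reveal the answer.
    for prompt, answer in easy_hard_hard + rapid_fire:
        print(f"{prompt}  (answer: {answer})")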

Assumptions and Limitations

Assumptions

For the purposes of this study, the following assumptions were made:

1. Individual variables had constancy over time and setting, could be isolated and conceptualized as variables, and formed an interval scale of numbers.

2. The teachers of the classes attempted to teach each class identically, with the exception of the questioning strategy using clickers. The researcher had no contact with the experimental classes.

3. The dependent variable was student achievement as measured by the posttest. The posttest (a book publisher test) was objective, consistently scored, reliable, and valid. A sample copy of a test is included in Appendix C.


4. The mathematical section of the placement test, which all students took upon entering the college, was valid and reliable.

5. For the data analysis it was assumed that (a) the dependent variable was normally distributed, and (b) the variances were homogeneous.

Limitations

This study had the following limitations:

1. The results were generalizable only to developmental math students in a specific community college for a specific term.

2. Testing was done over the course of the term. Gains could have been influenced by familiarity with the instrument rather than by the treatment.

Nature of the Study

This research study was a quasi-experimental design that included analysis of scores on the pretest, posttest, unit quizzes, and a survey. Because the matching cohorts could not be selected at random and the groups consisted of a control group and an experimental group, the quasi-experimental study took the form of a nonequivalent control-group design. The Statistical Package for the Social Sciences (SPSS) software was used to compute descriptive statistics, t tests, and ANOVA. The results were used to determine how explicit clicker strategies impact the achievement of developmental mathematics students.
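As an illustration of these analyses, the following is a minimal sketch using Python’s open-source SciPy library rather than SPSS; the score lists are hypothetical placeholders, not data from this study.

    from scipy import stats

    # Hypothetical posttest scores for the two groups (placeholder values only).
    experimental = [72, 85, 90, 66, 78, 81, 69, 74]  # clicker group
    control = [70, 75, 80, 62, 71, 77, 64, 68]       # lecture-only group

    # Independent-samples t test comparing the group means.
    t_stat, p_t = stats.ttest_ind(experimental, control)

    # Nonparametric alternatives for when normality cannot be assumed
    # (cf. the Mann-Whitney U and Kruskal-Wallis tests listed in Chapter 4).
    u_stat, p_u = stats.mannwhitneyu(experimental, control, alternative="two-sided")
    h_stat, p_k = stats.kruskal(experimental, control)

    print(f"t test: t = {t_stat:.2f}, p = {p_t:.3f}")
    print(f"Mann-Whitney U: U = {u_stat:.1f}, p = {p_u:.3f}")
    print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_k:.3f}")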

The research design for this study was grounded in the well-established instructional design research framework described by Reigeluth (2009). A discussion of the components of this framework is presented in Chapter 2. This study was also based on three widely accepted teaching and learning principles that have arisen from research in the fields of psychology, cognitive science, developmental education, and instructional design: (a) active learning encourages student engagement, (b) formative assessment is particularly effective for students who have not done well in school in the past, and (c) explicit teaching is especially effective for teaching basic math skills and problem solving to students who have difficulty learning mathematics. These principles were integrated with current clicker technology research to determine to what extent an explicit clicker strategy impacts the academic performance of developmental students in a pre-algebra class.

Organization of the Remainder of the Study

This study is divided into five chapters. Chapter 1 provides the introduction and background to the study, along with the purpose and significance of the research. Chapter 2 presents a review of the literature relevant to the conceptual framework of the study. Chapter 3 presents the research plan: the instructional design research was grounded in Reigeluth’s framework, while the actual study was structured according to Krathwohl (1993). Chapter 3 includes the methodology and design of the study, the sample, data collection procedures, and data analysis. Chapter 4 presents the data and analysis. The results, conclusions, and recommendations are presented in Chapter 5.


CHAPTER 2. LITERATURE REVIEW

Introduction

The purpose of Chapter 2 is to present the conceptual framework of this study. Both the theory and practice of education are rapidly evolving as the country experiences rapid changes in technology, competition in the marketplace, and a shortage of technical professionals. These changes put pressure on education and training, as discussed in Chapter 1. Research has identified ways to improve the structure of courses to create more effective learning environments. This chapter begins with a discussion of the research on the implementation of clicker technology in the college classroom. The chapter then looks at modifying the strategy used in previous studies to address the needs of developmental mathematics students at a community college. Incorporated into this revised strategy for developmental students are three widely accepted teaching principles that have arisen from research in the fields of psychology, cognitive science, developmental education, and instructional design: (a) active learning encourages student engagement, (b) formative assessment is particularly effective for students who have not done well in school in the past, and (c) explicit teaching is especially effective for teaching basic math skills and problem solving to students who have difficulty learning mathematics. For each of these principles, a review of current theories and practices is discussed, followed by examples of recent clicker research.

In addition, current research on strategies for teaching developmental courses is included because it influences the clicker strategy in the developmental mathematics classroom. Finally, all of the research is pulled together under the umbrella of the instructional design model of Gibbons and Rogers (2009), relating the research to the theory of this study.

Current Clicker Research

The Li (2007) dissertation was a quantitative study conducted in the large lecture physics classes of Ohio State University. The majority of the students were freshmen taking a first-term calculus-based physics course. Students were expected to construct an understanding of the concepts through a series of lesson clicker questions that began with an easy problem and then moved to more difficult applications of the same concept. In addition, at the beginning of class, the students were presented with a daily quiz of rapid-fire questions that reviewed a number of previously taught concepts. The purpose of the study was threefold: to determine (1) whether using clickers helped students learn physics, (2) how using clickers helped students learn physics, and (3) whether students perceived that clickers had a positive effect on their own learning process. The strategy for this project was based on comparing clicker lecture sections using the new methodology to lecture sections taught without clickers in a traditional manner. The results of the Li project showed that during testing, clicker sections consistently scored higher than non-clicker sections, both on common examination questions and on post-quarter concept inventories. Results also showed that female students seemed to benefit more from the use of clickers than male students. In addition, results indicated that students at a higher academic level benefited from both the lesson sequences and the rapid-fire sequences, while students at a lower academic level seemed to benefit only from the rapid-fire sequences.

Finally, the end-of-quarter surveys indicated that students enjoyed using clickers and believed that this tool helped them learn. These surveys also suggested that attaching credit to clicker responses and/or overusing clickers may lower student enthusiasm.

The Campbell (2007) study was conducted in educational psychology classes with small sample sizes. The purpose was to determine if clickers helped students do better on both retention and transfer questions in the course. Both near and far transfer questions were examined, as well as a number of other factors that arose as a result of the data collection. Pre- and posttests and an end-of-course survey were used and analyzed. Campbell’s class consisted of both upperclassmen and freshmen. The upper-class students showed less motivation and interest toward using the clickers than the freshmen, as determined in the end-of-course survey.

Conflicting Results in Clicker Research

The Campbell results contradicted the Li results in several areas. Campbell found that females did not benefit more from the use of clickers than the male students. In addition, Campbell found that students who were inherently interested in the material from the beginning of the term may not benefit, and that students who were not comfortable with the technology may actually be hindered. This result was not found by Li. Significant technical problems, such as unreliable transmission from the input devices, hampered the clicker class section during the Campbell project. This was not the case in the Li dissertation.

Campbell warned that clickers should not be implemented in the classroom just because they are novel or fun. Instead, there should be a specific educational need that the instructor is trying to meet, and a plan should be developed to meet that need in order to see any benefit from this technology.

Clickers’ Role in Active Engagement

Wood (2004) found that clickers can be used as a strategy to increase active engagement in a number of different content areas. Draper and Brown (2004) showed, through a two-year, institution-wide, multi-departmental project, that using clickers could improve classroom dynamics in addition to being cost-effective. They identified three important features of clickers:

1. Getting feedback to learners about whether they understand the material presented.

2. Getting most students to think about the question and decide on an answer.

3. Anonymity, which is often important in achieving these benefits.

Cognitive research suggested that interest in a task may affect emotional engagement in the task and the extent to which students engage in deeper processing (Schiefele, 1996, 1999; Schraw, 1998). According to Hidi (2000), a student who was interested in a subject exhibited more focus, increased cognitive functioning, persistence, effective involvement, and a higher level of achievement. Pintrich (2003) suggested that educators provide content material and tasks that are personally meaningful and interesting to students and that include some novelty and variety.

Clickers as a Novelty

Computers as novel stimuli were originally studied when they were introduced into the classroom (Berlyne, 1960). The novelty increased student interest in the short run but wore off over time. Berlyne stated that if the novel stimulus had motivational attributes other than novelty, such as increasing students’ self-efficacy, the initial high attention rate would fade, but students would still continue to pay attention. In a study in which computers were introduced into the classroom, it was noted that even when the novelty wore off, the students still tended to be engaged, presumably because the computers had characteristics other than novelty that continued to interest them (Bergin, Ford & Hess, 1993). According to Hidi (2000), novel tasks must be implemented with pedagogical principles in mind; otherwise they might not have any effect on learning. Lepper and Malone (1987) described computer games and simulations that had been implemented in the classroom; the software seemed to encourage behavioral activity, but not much cognitive activity. The present study focused on an instructional method that employed instructional technology as a way to introduce novelty into the classroom.

Audience response systems have been implemented at many major universities, such as MIT, Harvard, Yale, and the University of California, Santa Barbara (Mayer, 2008; West, 2005). Much of the research that has been published in peer-reviewed journals focused on students’ perceptions of the system rather than on learning outcomes (Beekes, 2006; Draper & Brown, 2004; Duncan, 2005; Hatch, Jensen & Moore, 2005; Latessa & Mouw, 2005; Wit, 2003). In a postgraduate accounting course at Lancaster University, Beekes (2006) reported that the students found the polling system easy to use and that it had increased their enjoyment of the lectures.

Individual anonymous questionnaires provided the data. Wit (2003) studied the polling system in an undergraduate statistics course for psychology majors. Again through questionnaires, the author found that students had a positive perception of the device. The author also recorded an increase in class attendance from the previous year, when the polling devices were not used. Hatch, Jensen and Moore (2005) reported that the students in an environmental science course “strongly agreed” that the clickers were helping them to learn during class time.

Summary of Current Clicker Research

In summary, the majority of research on polling devices has focused on students’ perceptions of the system and how enjoyable and helpful they found it. There has been little empirical research on the student achievement associated with audience response systems (Mayer, 2008; West, 2005). This represents a gap in the literature, which was addressed by this study.

Active Engagement Research

The National Research Council’s Committee on Increasing High School Engagement and Motivation to Learn (2004) indicated that active engagement was required for learning and success in school. Engagement consisted of behaviors, emotions, social connections, and cognitive strategies. The behaviors included attention, effort, and persistence. The emotions were expressed as enthusiasm, pride, and enjoyment. The social connections included teams and extracurricular activities. The cognitive strategies were responsible both for processing information and for developing the learner’s self-esteem. Current research indicated that students need to be actively engaged in the learning process in order for meaningful learning to occur (Mayer, 2008; Bransford, Brown and Cocking, 2000; Zimmerman, 2002).

The purpose of including a review of active learning is to facilitate an understanding of the critical role of student engagement in the learning process. Mayer (2008) described two kinds of active learning. The first type was behaviorally active: the students performed a series of active tasks even if they were not cognitively engaged. The second type, described by Mayer, was cognitive engagement, with or without behavioral engagement. During cognitive engagement the student not only processed the information and remembered the strategies but also developed beliefs about himself or herself as a learner. Mayer demonstrated that achievement depended on the learner’s cognitive activity rather than behavioral activity.

Research on Active Learning in the Physics Classroom

The Physics Education Research (PER) Group at Ohio State has studied active learning in physics classes at the university since the early 1980s. During that time, the researchers found that students learn better by doing than by watching something being done (Redish, 1994). Another researcher, Gamson (1987), suggested that learning is not a spectator sport and that students must make what they learn part of themselves. Bonwell (1996) summarized some of the major characteristics associated with active learning strategies in the context of the college physics classroom:

1. Students are involved in more than passive listening.

2. Students are engaged in activities including problem discussion.

3. Student motivation is increased (especially for adult learners).

At the current time, many traditional introductory physics courses at the university level rely on “transmission-of-information” lectures, a technique that is neither highly active in class nor effective in fostering conceptual understanding (Handelsman, 2004). Handelsman suggested that classes move away from such lectures and incorporate active learning strategies. Lectures that involved active learning strategies placed a more explicit focus on problem-solving techniques and conceptual understanding than did most traditional classes, simply by the nature of the engagement (McDermott, 1991; Redish, 1997). These physics classes improved learning and knowledge retention as measured by retention and transfer questions on a posttest.

Summary of Active Learning Research

A summary of the research indicated that active engagement was required for student learning (Mayer, 2008; Bransford, Brown and Cocking, 2000; Zimmerman, 2002). Engagement consisted of behaviors, emotions, social connections, and cognitive strategies. Clickers provide the opportunity to be actively engaged in the learning process on a daily basis. Surveys from previous clicker studies emphasized the students’ enthusiasm and enjoyment from using the technology regularly in class.

Research on Formative Assessment

Although some researchers (Li, 2007; Campbell, 2007) credit the basis of formative assessment to Vygotsky (1930) and Piaget (1969), it was not widely researched and promoted until the end of the 20th century. Black and Wiliam (1998) conducted an extensive research review of 250 journal articles and book chapters to determine whether formative assessment raised academic standards in the classroom.

They concluded that classes with increased formative assessment produced significant learning gains. Black and Wiliam arrived at another important conclusion: improved formative assessment helped low achievers more than other students, and so reduced the range of achievement while raising overall achievement. Formative assessment should occur during the learning process and provide feedback to students and teachers. Clarke (2005) and other researchers argued that formative assessment is integral to effective teaching (Black & Wiliam, 1998; Heritage, 2007; NCTM, 2000). Therefore, educators must focus on all students and use formative assessment to help all students meet the course competencies (Stiggins, 2006). Although students may not yet be proficient, they are moving toward that goal. During the learning process, formative assessment provided students with opportunities to show what they were able to do without being graded; students were not penalized for their lack of understanding while they were still learning concepts. Clarke (2005) suggested that students were more likely to be motivated to improve their learning when high-stakes testing was removed. Stiggins (2006) stated that the lack of formative assessment found in classrooms was a detriment to low-performing students. Stiggins’s work also identified that overuse of summative assessments was associated with lower self-efficacy, less motivation, and higher dropout rates. Analysis of the data from the TIMSS 1995 study suggested that the lowest-performing mathematics students were exposed to the most summative testing (Beaton, Mullis, Martin, Gonzalez, Kelly, & Smith, 1997).


Research on the Role of Feedback

Sadler’s (1989) research indicated that feedback played an important role in formative assessment. It helped learners become aware of any gaps that existed between their goal and their current knowledge, understanding, or skill. The most helpful feedback on tests and homework provided specific suggestions for improvement. In addition, good formative assessment encouraged students to focus on the task rather than on simply getting the right answer (Bangert-Drowns, 1991). This type of feedback was particularly helpful to lower-achieving students because it emphasized that a student can improve as a result of effort rather than be relegated to low achievement because of a presumed lack of innate ability (Fuchs & Fuchs, 1986). Although feedback generally originated from a teacher, the learner also had an important role in formative assessment through self-evaluation. Fontana and Fernandes (1994) conducted experimental research studies and found that students who understand the learning objectives and assessment criteria, and who have opportunities to reflect on their work, show greater improvement than those who do not. Emberger (2002) indicated that implementing feedback was not always an easy task, and identified a few key attributes feedback must have in order to produce the desired effect. The feedback should be:

1. Corrective in nature. Students need to understand what they were doing correctly and incorrectly.

2. Timely. In general, the greater the delay between assignment and feedback, the less improvement occurred.


3. Specific to the criteria. Feedback was most effective when it was specific to the criteria the teacher identified and described what the student did or did not learn.

Formative Assessment in Mathematics

The McManus (2008) study discussed the implementation of formative assessment in a mathematics classroom. Formative assessment in this study was defined as a process. Based on the results of the study, the researcher identified four steps that must exist in the process.

Step 1. Identification of learning objectives and criteria for success. This identification was to be done by the teacher, or by the teacher and students together.

Step 2. Elicitation of evidence of learning. It is important that students are taught something before the formative assessment is given. This step occurs after a teaching activity but during the learning process.

Step 3. Recognition of a gap in understanding. It is important that teachers and students recognize when there is a gap in understanding the learning objective.

Step 4. Implementation of action(s) to close the gap. Without action, gaps in understanding were not closed and student learning was not affected. Action was most often exhibited in the use of descriptive feedback to students.

In another example, the Fuchs and Fuchs (1986) study, which was devoted to low-achieving students and students with learning disabilities, showed that frequent assessment feedback helped both groups enhance their learning.


Formative Assessment with Clickers

Draper and Brown (2004) noted that clickers were useful tools for implementing formative assessment in lectures, since instant feedback was provided to both instructors and students. Feedback played an important role in formative assessment: it helped learners become aware of any gaps that existed between their desired goal and their current knowledge, understanding, or skill, and guided them through the actions necessary to reach the goal. By providing polling results in real time, clickers were able to immediately help lecturers understand whether a concept had been understood. Clickers also provided feedback to the students for self-evaluation. Draper discussed one common strategy used in clicker classrooms, the re-vote: whenever a large number of students selected the incorrect answer, the instructor asked the students to discuss their answer with a neighboring student and re-vote. Results on the re-vote were significantly better than the original vote.
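The re-vote decision can be sketched as a simple rule. The following minimal Python sketch is illustrative only; the 70% threshold and the function name are hypothetical, not parameters reported by Draper and Brown.

    def next_step(responses, correct, threshold=0.7):
        """Decide the instructor's next action after one clicker poll.

        responses: list of answers students submitted (e.g., ["A", "B", ...])
        correct:   the keyed answer for the question
        threshold: hypothetical fraction of correct answers treated as "understood"
        """
        fraction_correct = sum(r == correct for r in responses) / len(responses)
        if fraction_correct < threshold:
            # Too many incorrect answers: prompt peer discussion, then poll again.
            return "discuss with a neighbor and re-vote"
        # Most students answered correctly: explain briefly and move on.
        return "continue the lesson"

    # Example: 11 of 20 students choose the keyed answer "B" (55% < 70%), so re-vote.
    print(next_step(["B"] * 11 + ["C"] * 9, correct="B"))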

Summary of Research on Formative Assessment

The research indicated that formative assessment increased learning as long as it was corrective, timely, and specific. A key factor in formative assessment was the use of descriptive feedback to the students. This is one of the components that was missing from the Li (2007) rapid-fire strategy: in that sequence, correct answers were identified, but the alternate answers were not fully explained. So although students knew they were wrong, they may not have understood why.


Research on Explicit Teaching

The National Council of Teachers of Mathematics (NCTM) has been at the forefront of promoting the reform movement and subsequent changes in instructional practices. The NCTM published four landmark documents: (a) “Curriculum and Evaluation Standards for School Mathematics” (NCTM, 1989), (b) “Professional Standards for Teaching Mathematics” (NCTM, 1991), (c) “Assessment Standards for Teaching Mathematics” (NCTM, 1995), and (d) “Principles and Standards for School Mathematics” (NCTM, 2000). Based on this work, national standards for mathematics have been established, and almost every state has integrated these standards into its general curriculum (Woodward & Montague, 2002). The NCTM suggests that classroom time be devoted to helping students develop reasoning skills, learn problem-solving strategies, and see connections among the various types of mathematics skills as well as between mathematics and other disciplines, such as science (Krulik, Rudnick, & Milou, 2003).

The Need for Explicit Instruction

As students entered college, college administrations expected them to come in with an understanding of what it meant to be an effective learner, but that was not usually the case (Rachal, Daigle, & Rachal, 2007). Instead, students needed explicit instruction in learning strategies to allow them to be successful in classes. Learning strategies referred to the methods students use to improve learning; these can be cognitive or behavioral techniques that enhance performance. Freshmen especially needed timely, accurate feedback regarding their ability to use their cognitive skills. An additional finding from this study was that the situation did not improve as students progressed through their programs.

Regardless of their academic classification, explicit instruction in learning strategies for reading, writing, and mathematics was required by most students at every level.

For more than a decade, most mathematics educators and researchers in the field of general education have supported reform-based mathematics grounded in constructivism (Troff, 2004). When used with higher-achieving students, constructivism leads to progressively fewer errors during problem-solving exercises and aids in solving real-world problems (Woodward & Montague, 2002). However, many educators and researchers who work with students with diverse learning needs continue to support more explicit teaching approaches (Gersten, Baker, & Marks, 1998; Mercer & Mercer, 2001). Jones, Wilson and Bhojwani (1997) suggested that students with difficulties learning mathematics who adopt constructivist “trial-and-error learning” in the classroom are prone to make more errors than their more able peers, which causes them to give up and withdraw from instruction. The National Research Council, cited in Sheffield and Cruickshank (2005), noted that explicit instruction often acted as the foundation for students who have difficulties in mathematics. Explicit instruction was defined as an approach that combined specific design components and systematic instruction. The teacher’s role involved designing the lesson and arranging instructional variables to promote optimal learning. The five principles of explicit instruction included big ideas, conspicuous strategies, guided practice with scaffolding, strategic integration, and judicious review. When planning a unit of study, the teacher identified big ideas, or important understandings, relevant to the upcoming lessons. The notion of big ideas came from the belief that students should be able to do a few things well rather than a lot of things poorly (Carnine, 1997).

Miller, Harris, Strawser, Jones and Mercer (1998) suggested that the principle of big ideas be expanded to cover all mathematical concepts. Once the understandings were identified, measurable objectives were established for the lesson and prerequisite knowledge was identified for review. Carnine suggested that these were the same steps experts used when working toward similar goals. In addition, conspicuous strategies, like finding an easier related problem, were developed to help all students solve the mathematics problems (Harniss, Carnine, Silbert, & Dixon, 2002). These strategies allowed students with mathematical difficulties to solve complex problems as efficiently as their higher-achieving peers. Guided practice with scaffolding was critical for student support in the early stages of learning (Miller, 2002). Since lower-performing students require more scaffolding than others, it must be built into the instructional program (Carnine, 1997). Yet it was ultimately up to the professor to act as the mediator in determining the level and degree of scaffolding necessary (Simmons & Kameenui, 1996). The professor ultimately provided the strategic integration necessary to order the introduction of new skills so as to produce a general higher-order skill. Judicious review should verify that the student has been developing fluency in the subject, accumulating more skills, and generalizing the topic (Mercer & Mercer, 2001).

Examples of Explicit Instruction

Most of the examples of explicit instruction in the mathematics classroom came from the special education literature. Explicit instruction in mathematics was teacher-delivered and structured.

There were four essential phases within the explicit teaching sequence, which were covered in each class (Hudson, Miller & Butler, 2006). In Phase 1, the lesson began with an advance organizer that prepared students for the upcoming lesson by bridging the gap between students’ prior knowledge and the new learning (Miller, 2002; Williams & Butterfield, 1992). The teacher reviewed the prerequisite knowledge students would need to succeed in the upcoming lesson. When students were proficient in the prerequisite knowledge, the lesson objective was explicitly stated, and the teacher guided students to see the importance of learning the new mathematics content.

Phase 2 of the teaching sequence was demonstration. The teacher modeled the overt actions as well as the meta-cognitive and cognitive thinking that took place while solving a problem. The teacher engaged students with questions and prompts. These questions kept students attentive and helped the teacher monitor student understanding of the content being presented. Importantly, monitoring of the student responses allowed the teacher to make adjustments and clarify misunderstandings.

Once the students had been introduced to the problem, Phase 3 of explicit teaching began. Guided practice involved providing students with opportunities to practice the new mathematics concept or skill. The teacher assisted and supported students as they began to apply what had been demonstrated. Initially, the teacher used a high level of support (e.g., verbally directing students through each step involved in solving the problem and providing many prompts and cues) and gradually withdrew support as students became proficient with the new skill (Hudson, Miller & Butler, 2006). Students were highly engaged, and the teacher used student responses to monitor performance and provided positive and corrective feedback.
corrective feedback. Once students could complete problems accurately and without teacher assistance, the teacher moved to independent practice, Phase 4 of instruction. A variety of formats could be used to facilitate independent practice, including peer tutoring, computer software programs, cooperative learning, and self-correcting materials. Varying the independent practice formats helped maintain student interest. In a recent meta-analysis, Kroesbergen and Van Luit (2003) evaluated the effectiveness of mathematical interventions for students with special needs. Specifically, they calculated the magnitude of the effect of three types of mathematics methodology: explicit instruction, cognitive self-instruction (i.e., step-by-step strategies or procedures frequently taught using explicit/direct instruction), and mediated or assisted instruction (i.e., methods that involve students discovering and developing their own math skills, with the assistance of a teacher). They found that methodologies utilizing explicit/direct instruction were more effective for teaching basic math facts and problem solving to students with learning difficulties than mediated or assisted reform-based instruction. There are several variables to consider when planning mathematics instruction for diverse groups of students. It is helpful to use mathematics examples that are age appropriate, directly linked to students' lives, and that appear to have direct application to current needs and interests. Advance organizers are used in explicit teaching to help students organize their thinking and focus on the upcoming lesson. Throughout the lesson it is appropriate to include a discussion that helps students understand and appreciate the direct application of mathematics to their areas of interest. A second commonality among all levels of students is the need for an appropriate level of challenge (i.e., not too easy, not too difficult). In explicit teaching, questioning and discussion were an integral part of the
instructional process. The notion of guided practice typically used in explicit teaching can be adapted to meet the needs of students with various ability levels. Varying the type and level of questioning, along with the amount of support provided to students, helped individualize the material. A third commonality among students is the need for adequate practice to ensure mastery and generalization of the mathematics content. Progressing through the mathematics curriculum before concepts are mastered is particularly problematic due to the hierarchical nature of the discipline. Concepts and skills build on one another. Thus, in addition to negatively affecting performance in the current unit or lesson, gaps in knowledge also negatively affect performance in subsequent units and lessons. Part of the planning process needs to account for the fact that students require different amounts of practice to reach mastery. Supplemental projects to extend the learning of high achievers are needed to allow for the additional time that lower achievers will need to master critical concepts before moving on in the traditional curriculum. All three of these guiding principles for planning and delivering math instruction (i.e., integrate student interests, provide an appropriate level of challenge, ensure mastery before progressing in the curriculum) are important. By combining explicit teaching with components which addressed the needs of both weaker students and higher achievers, all students in the general education classroom can benefit.

Background Information on Clickers

Clickers are grouped into three major types based on the hardware used. First, there are infrared (IR) devices. These clickers use infrared light to transmit signals. They
were often referred to as line-of-sight clickers because there needed to be a direct line from the receiver to the transmitter in order for the response to be received. Their range was only 50 feet, so they were useful in small classrooms but not college lecture halls. IR clickers were less expensive than other types of clickers. Because of their cost and range, the first market for the IR clickers was the public school system. The second type of clicker was called radio frequency (RF). RF clickers use radio frequency waves to transfer signals. The major advantage was that students did not have to point their device at the receiver in order to record a vote, and the range was up to 200 feet (Li, 2007). In addition, the students were connected with a two-way communications system that immediately told the students that their response was received. This particular feature was not initially available on the IR models. For some systems, the radio frequency interfered with cell phone and WiFi signals, a problem that was being addressed and remedied; the RF signals generated by Turning Point® clickers, however, did not interfere with either signal. The third type of clicker was known as a virtual clicker. This was a software package that the client installed on a WiFi-enabled laptop or PDA. The software allowed the student to communicate with the instructor's computer. Virtual clickers did not need receivers; their working range was limited by the range of the WiFi signal. Interference was eliminated by assigning an IP number to each laptop or PDA (Li, 2007). Burnstein (2006) stated that they were particularly useful in small classrooms, because the instructor was able to show text answers from students. As schools increasingly install wireless networks and students carry their laptops to class, virtual clickers may become the favored technology.

Clicker Software Platforms

There were a number of different software platforms for the clicker technology. Although some companies (such as I-clicker) used their own software, an increasing number of companies used PowerPoint® integrated software. Using a familiar program made the software more user-friendly to the instructor. A number of programs collected the student responses, which could be recorded and viewed in an Excel spreadsheet. There were many aspects to be considered when choosing clickers. Among those were the differences in the technologies, the unit costs, software costs, software ease of use, and customer service. The college ran a field test with Turning Point® devices in a number of classrooms at the college in 2006. After the field test, the college decided on RF over IR clickers because of line-of-sight difficulties, even in small classrooms. The college selected a commercially available RF clicker system called Turning Point®, as shown in Figure 1. Each keypad cost about $35. Turning Point® coupled its response software with Microsoft PowerPoint®, which was user-friendly to the instructors. Software was downloaded from the Turning Point® website, and there was no additional charge for updates. Each RF receiver cost approximately $90. One classroom typically needed one receiver and 32 transmitters. These came packaged in a carrying case and were stored in the lecture desk for the instructor's use.


Figure 1. Turning Point® RF clicker and receiver. Note: The RF clicker is a product of Turning Technologies, LLC. Retrieved from http://www.turningtechnologies.com/studentresponsesystems/studentclickers/.

Turning Point® (TP) offered the instructor a way to analyze individual responses. To do this, the instructor needed to set up a participants list. In TP, the instructor first clicked on the participants list wizard to create a new participants list. Then the instructor went to edit a participants list and input each student name and device ID. The device ID was the serial number on the back of the clicker handpad. Each time the instructor began a session, the instructor chose auto if no participant responses were to be collected, or a participants list created for the individual class. After the session, the file was saved. To view the data, the instructor clicked open session on the TP tool bar, opened the session which was saved, then went to tools → turning report and chose the turning report desired. The most common choices have been results by questions, to see summary response information by question, or results by participants, to get individual responses by student (Turning Technologies, 2008).
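Exported session reports also lend themselves to simple scripted summaries. The sketch below is illustrative only: it assumes a session has been exported to a spreadsheet with columns named Device ID, Question, Response, and Correct, which are hypothetical stand-ins rather than Turning Point's actual export schema.

    # Hypothetical sketch: summarizing an exported clicker session with pandas.
    # The column names (Device ID, Question, Response, Correct) are assumed
    # for illustration and do not reflect the vendor's real export format.
    import pandas as pd

    def summarize_session(path):
        session = pd.read_excel(path)
        session["is_correct"] = session["Response"] == session["Correct"]

        # "Results by questions": percent correct for each question.
        by_question = session.groupby("Question")["is_correct"].mean().mul(100)
        print("Percent correct by question:")
        print(by_question.round(1))

        # "Results by participants": each device's overall percent correct,
        # which could be matched to a participants list by device ID.
        by_participant = session.groupby("Device ID")["is_correct"].mean().mul(100)
        print("Percent correct by participant:")
        print(by_participant.round(1))

    summarize_session("session_export.xlsx")  # hypothetical file name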

Integrating the Research into Instructional Design

The supporting research becomes synthesized in the seven layers of instructional design described by Gibbons in Reigeluth and Carr-Chellman (2009). These layers were arranged into a hierarchical pyramid to illustrate the design and guide the reader through the structure and conceptual framework of this study. Figure 2 below represents the Gibbons and Rogers (2009) "layers of design" model. The layers are described below in general terms. Each layer was then expanded to synthesize the research presented in this chapter into a conceptual framework for this study based on the design.

Figure 2. The instructional design pyramid. Adapted from "The architecture of instructional theory" by Gibbons and Rogers, 2009. In C.M. Reigeluth & A.A. Carr-Chellman (Eds.), Instructional design theories and models (Volume III): Building a common knowledge base (pp. 305-326). Mahwah, NJ: Lawrence Erlbaum Associates.


Beginning at the base of the pyramid, the layers of the Gibbons and Rogers (2009) instructional design are listed below.

1. Content layer: The content layer provides the basis of instruction and is utilized by each of the other layers. A pyramid was chosen for this adaptation because it emphasizes the content as the course foundation.

2. Strategy layer: This layer must contain the patterns of interaction between the learner and the instructional experience.

3. Message layer: The message layer specifies how the content can be communicated to the learner in conversational form.

4. Control layer: This layer specifies how the learner interacts with the source of the instruction.

5. Representation layer: The representation layer specifies the form and composition of the material presented. It deals with coordinated delivery and synchronization of messages.

6. Media logic layer: This layer deals with the mechanisms which cause the presentations to occur.

7. Data management layer: This is the layer which addresses the method by which data is captured, analyzed, interpreted and reported.

This is where there was a modification to the Gibbons and Rogers (2009) design. Gibbons and Rogers placed data as the last layer in the design. In this adapted version of the design, the data layer was adjusted to include data collection, analysis and interpretation at every level. Based on the data collected during the course, in the form of formative assessment, the course was modified to meet the needs of the students. The use
of data and evaluation at each step is a component of other instructional models, including the Morrison, Ross and Kemp model (2005).

Content Layer

For this study, the content selected was pre-algebra. Pre-algebra, at most colleges, consists of an arithmetic review and beginning algebra skills. The content covers topics from simple topics in arithmetic, like addition, subtraction, multiplication and division of integers, to similar operations with fractions, mixed numbers, decimals and polynomials. The course outline was determined by college departments and evaluated periodically. Students were placed into this course as a result of the college placement test. The significance of this layer to the study is in both the content area and the target population. The research has demonstrated that there are a limited number of studies in the area of developmental mathematics. By focusing on this population of students, the study contributed to the base of knowledge on developmental students at the community college level (U.S. Department of Education, 2008).

Strategy Layer

The strategy layer for this study was based on active engagement in the classroom. The research suggests that students incorporate cooperative learning techniques and discuss the answers with a partner, but answer individually to increase conceptual understanding. Draper and Brown (2004) showed that using clickers could improve classroom dynamics and raise a student's confidence level. The literature suggests that a questioning strategy which incorporates clickers into the classroom can increase active engagement. Research also states that active engagement has been linked
to increases in achievement. By integrating active engagement techniques and a questioning strategy using clickers, this study addressed an area where there is currently a research gap.

Message Layer

When considering how to deliver the message, learning style and content impact the strategy. Merrill (2000) suggested that "instructional strategies should first be determined on the basis of the type of content to be taught (the content-by-strategy interactions) and secondarily, learner styles and preferences are then used to adjust or fine-tune these fundamental learning strategies" (p. 44). One of the instructional strategies implemented in this study was derived from research on students with diverse learning needs. Students experiencing difficulties in math who are presented with explicit instruction learn the fundamentals more rapidly and are more successful than students who are taught using a constructivist strategy (Gersten, Baker, & Marks, 1998; Jones, Wilson & Bhojwani, 1997). The explicit instruction was included to make sure that students understood the steps necessary to perform various mathematical operations. There was additional explicit instruction in problem solving strategies, as suggested by the National Council of Teachers of Mathematics (2008). Another instructional strategy which was implemented in this study was the use of a questioning technique similar to the one Li (2007) investigated with physics students at Ohio State. The results of Li's project showed that students in the sections which used the clicker questions consistently scored higher than non-clicker sections, both on common examination questions and on post-quarter concept inventories. In addition,
results indicated that students at a higher academic level benefited from both the lesson sequences and the rapid-fire sequences, while students at a lower academic level seemed to benefit only from the rapid-fire sequences. The literature suggests that after explicit instruction, all students would be in a better position to benefit from the lesson sequence questions. Synthesizing the research from both explicit instruction and clicker research provided a model for a questioning strategy in the study. There is currently a gap in the research literature on explicit questioning techniques using clickers in developmental mathematics classes which this study attempted to address.

Control Layer

This layer specifies how the learner interacts with the source of the instruction. In addition to the usual classroom method for asking a question, like raising a hand, the student interacted with the instruction directly with the clicker. The clicker provided an anonymous way for the student to respond to the material. Campbell (2007) suggested that in order to see any benefit of this technology, clickers should be implemented into the classroom for a specific educational need that the instructor is trying to meet. The specific need in this study was to increase the mathematical achievement of underprepared students. The audience polling devices allowed the instructional strategy of questioning to be controlled effectively in the classroom. There is a gap in the research literature indicating how a questioning strategy using clickers will control the flow of information in the classroom.


Representation Layer

The representation layer specifies the form and composition of the material presented. It deals with coordinated delivery and synchronization of messages. In this study, PowerPoint® presentations were incorporated into both sections of the course. PowerPoint® presentations are frequently used in most of the math classes at the college. The research suggested that explicit instruction and questions are most effective for teaching the developmental student. These were the focus of the PowerPoint® presentations.

Media Logic Layer

This layer deals with the mechanisms which cause the presentations to occur. The study was conducted in a room with a computer, a screen or whiteboard, and a projector. The computer needed to be able to run the audience response software and the Microsoft Office PowerPoint® program. A set of polling devices had to be available to the students, and a receiver had to be attached to the USB port of the computer. The program could also have been implemented on a portable media cart.

Data Management Layer

The research on formative assessment suggested by Clark (2005) and Stiggins (2006) provides convincing evidence of the value of such assessment for students with difficulties in a content area. Formative assessment occurs with every click, as students can get instant feedback on the correctness of their clicker responses. Emberger (2002) indicated that to be effective, feedback should be corrective in nature, timely, and specific. The use of the clickers meets all of those requirements. Research, including the work of Clark (2005), suggested that students were more likely to be motivated to
improve their learning when high stakes testing was removed. As a result of that research, the testing protocol consisted of a number of smaller quizzes.

Summary

In summary, since so many of today's college students lack learning strategies, an explicit instruction approach can be an effective way of teaching developmental students in mathematics. By incorporating techniques initially developed for students with learning disabilities, all students will be able to master the critical concepts required for a foundation in mathematics. The literature review has examined the role active engagement, formative assessment, and explicit instruction have on learning. When the technology of clickers is added to this classroom mix, there is a gap in the research literature. The majority of the studies have focused on class engagement. The results of these class engagement studies, which came from a number of different fields in both community college and university settings, have been mostly positive. The current studies on student achievement using a clicker strategy presented contradictory and inconclusive findings. There are no studies which examine the impact of a clicker questioning strategy with developmental math students at the community college level. There is also a gap in the literature on the use of clickers with explicit instruction. The Gibbons and Rogers (2009) design provided a foundation to synthesize the research and the instructional design theory to answer the question which is missing from the literature: To what extent does the use of clickers as an instructional strategy impact students' level of achievement of pre-algebra skills? Chapter 3 presents
the research methodology and procedures which were used in the study, the design of the study, the sample of the study, the data collection procedures and the data analysis.


CHAPTER 3. METHODOLOGY

Purpose of the Study

The purpose of this study was to investigate the effect of an explicit instructional strategy using clicker technology on academic achievement. After a thorough examination of the literature, a gap was identified concerning the extent to which explicit clicker strategies impact the academic performance of developmental students in a pre-algebra class. This study was based on three widely accepted principles:

1. Active learning encourages student engagement;

2. Formative assessment is particularly effective for students who have not done well in school in the past; and

3. Explicit teaching is especially effective for teaching basic math skills and problem solving to students who have difficulty learning mathematics.

These principles have arisen from research in the fields of psychology, cognitive science, instructional design and developmental education. Jonassen and Reeves (1996) suggested that these principles can be effectively combined when designing effective instructional design strategies. These principles were integrated with the current clicker technology research to determine to what extent explicit clicker strategies impact the academic performance of developmental students in a pre-algebra class. The research design for this study was
grounded in Reigeluth (2001), who noted that instructional design research frameworks should specify three things:

1. Instructional methods: these will help students achieve different learning goals.

2. Instructional conditions: these will influence when and when not to use each method to help attain a given goal.

3. Instructional outcomes: these will provide the measure of the value of alternate methods (Lewis, 2002).

The theory and practice of the discipline of instructional design also suggested that in order to implement a new instructional design based on a different theory of learning, it is usually necessary to modify not just one, but most or all, of the components of a lesson (Lewis, 2002; Reigeluth, 1983).

Research Question

This study examined the effect of using explicit instructional strategies with clickers in learning basic pre-algebra skills in a developmental mathematics course. There was one research question. The research question asked to what extent the use of clickers as an instructional strategy impacted students' level of achievement of pre-algebra skills. This question was selected to help address the problem of student achievement in pre-algebra classes. The null hypothesis for this question was:

H0: There will be no difference between the academic achievement of students who use clickers as an instructional strategy and that of the control group.


Research Design

The framework for this study was grounded in the well-established instructional design framework of Reigeluth (1983). There were three major components in this design:

1. Instructional methods: these allow the learners to achieve different outcomes/goals.

2. Instructional conditions: these influence when each method would be used.

3. Instructional outcomes: these goals provide a measure of the value of alternative methods under different conditions (Lewis, 2002; Reigeluth, 1983).

In this research design, the course presentation was designed to include an explicit questioning strategy. The questions were embedded in the presentation content for the experimental group. The homework, assessment and evaluation procedures were the same for both groups. In educational settings, the environment often prevents the formation of random groups (Creswell, 2005). Because the school's cooperation was necessary for the implementation of the experimental treatment, the researcher was not able to randomly assign students to groups. This study was a quasi-experimental study since intact groups were used. Mertens (2005) defined two types of quasi-experimental designs: the static group comparison design and the nonequivalent control group design. The two designs were similar, except that the static group comparison design did not use a pretest. Since the researcher planned to use a pretest, the design for this study was a modified nonequivalent control group design. This study incorporated Krathwohl's (2004) A-B-A-B variations between the pretest and posttest, similar to a method used by Lewis (2002). This design allowed the researcher to collect data for the groups over time. "A" represented
the clicker strategy and "B" represented a quiz. "O" represented a pretest or posttest. The sequence for the experimental group was O-A-B-A-B-A-B-A-B-A-B-A-B-A-B-A-B-O, where the first O represented the pretest and the last O represented the posttest. This sequence included one pretest, eight instructional experiences using the questioning clicker strategy, eight quizzes and one posttest. The sequence for the control group was O-B-B-B-B-B-B-B-B-O. This sequence included one pretest, eight instructional experiences without the clicker strategy, eight quizzes and one posttest.

The Sample

To recruit participants, the researcher sent out an official e-mail "request to participate" to all of the instructors of the course. It was anticipated that four instructors would be selected to participate from the possible pool of ten (or more) instructors at the college. If four instructors did not volunteer in a single term, several things could have occurred: the study could have been continued for another term with the instructors who were willing to participate, or, alternately, the instructors who volunteered could have had their schedules changed to accommodate another section, if they wanted that. The researcher provided each selected instructor with a course guide that included the lessons, the examples, the in-class work, the homework and the quizzes. Each unit of the course presented the explicit instruction part of the lesson with the same PowerPoint® presentation. The only difference between the sections should have been the questioning strategy using the clickers. The researcher discussed the study with each instructor and was available for assistance, classroom observations and guidance throughout the study. The researcher had no personal contact with the participants of the study during the term. All
of these instructors had previously taught this course and were familiar with the textbook and support materials. Participants in this study consisted of four sections of pre-algebra students at a branch campus of a community college in Pennsylvania. The students were identified at the lowest level of math by AccuPlacer, the mathematics placement exam used by the college. The classes were a blend of both traditional and nontraditional students, ranging in age from 18 to 54. Two sections were designated as the control group and two sections were designated as the experimental group. The experimental and control groups were selected by pulling a letter, E for experimental or C for control, out of a hat prior to the start of the course. The selection of the sample size and the type of analysis can increase the study's significance (Krathwohl, 1993).

The Course

The course which was studied was a 3-credit pre-algebra class which met during one term for forty-five hours, two classes per week. Students were placed in the course because they scored poorly on the math placement exam. This course was a prerequisite for the second developmental course, entitled Introductory Algebra. There were approximately 15 sections of the pre-algebra course taught by at least 10 different instructors during each term. The course syllabus was included in Appendix A, along with the list of online homework in the Course Compass program. The textbook for this course was Pre-algebra from the Mathematics Series for Higher Education from Pearson/Addison Wesley (Bittenger, Ellenbogen, & Johnson, 2008). All of the instructors were familiar with PowerPoint®, Blackboard and Course Compass. In addition to the
textbook, both groups had access to PowerPoint® lessons and a homework program which accompanied the textbook.

General Format of the Class

Both groups began class with a review of the homework. Homework was provided by the publisher on a system called Course Compass®. Homework was assigned for every class session. Course Compass allowed the student to practice and get help with the homework online, on a personal-need basis. The instructor asked if there were any questions on the homework. Homework questions could be brought to class for additional discussion. After the homework review, a lesson was presented via PowerPoint®. The lessons covered the sections of the textbook assigned for that class. The lessons were presented in an explicit format. This meant that math skills were broken down into smaller steps and students were shown how to set up and solve the problem with an example. This continued with several different types of problems each class. During the lesson, the students were then given several similar problems which reinforced and extended the concepts. The teacher and students worked on the problems together. The session ended with a slide which presented representative problems from the book reinforcing the new concepts. For approximately the last ten minutes, the students began working on the class work of the day. During this time the teacher was available to help individual students.

Experimental Group

In the experimental group, the students signed in and picked up their individually numbered clickers as they entered the classroom. At the start of each class,
the teacher showed an interactive PowerPoint® presentation to review the homework. There were usually 12-15 problems on the presentation taken from the previous class. These were referred to as rapid-fire questions. Problems were presented one at a time. Students sat at their desks with pencil and paper and worked out the correct answer, then clicked in on the individual clicker. Students were encouraged to discuss questions with their partner, and students were able to change an answer if they felt they had made a mistake before the results were made visible. After a reasonable period of time, the teacher showed the graphic of the student responses and discussed each problem thoroughly on the board, illustrating the step-by-step process used to solve the problem. A problem illustrating a difficult concept was often followed by a second similar problem so students could practice the technique and verify that they understood it by selecting the correct answer. The review session typically took 30 minutes of the class. At this time, the teacher asked if there were any additional homework problems which needed to be discussed. The review was then followed by the lesson and the introduction of new material. At this time, students were led through solutions in a step-by-step process of explicit instruction. During the next 30 minutes, new materials were presented via PowerPoint® and on the board. After each example of new material was presented, the students were given clicker problems to reinforce the new skill. Usually these questions were presented in groups of two or three questions in an Easy-Hard or Easy-Hard-Hard format. The session ended with a slide which presented representative problems from the book reinforcing the new concepts. For approximately the last ten minutes, the students worked
on the class work of the day. During this time, the teacher was available to help individual students and collect the clickers.

Control and Experimental Comparison

In the control group, there was no modification to the instructional design. Both instructors used a whiteboard to solve examples and demonstrate procedures. Both sessions ended with a slide which presented representative problems from the book reinforcing the new concepts. For approximately the last ten minutes, the students began working on the class work of the day. During this time the teacher was available to help individual students in both classes. Both the experimental and the control groups had the same lesson objectives, the same examples, the same practice problems and the same homework. To ensure lesson uniformity, all lessons were presented as PowerPoint® presentations. The experimental classes' lessons had embedded questions which were answered with the clickers, while the control groups' lessons had no embedded questions in the PowerPoint®. There were approximately three lessons per unit of study. After each unit, all students were given a unit quiz. All students were given a posttest at the end of the course. The lesson plans for the course were included in Appendix B. The experimental class used the questioning strategy with clickers for the review and throughout the lesson. A typical lesson offered the student about 30 opportunities to respond to a clicker question. The control class did not use the questioning strategy with the clickers. In both classes, students were free to ask questions during the presentation. The lesson plans
identified the review topics, the lesson information and the in-class problems which were used in both classes. The content of both classes should have been the same.

Tests

Pretests and posttests were used as a measure for comparison of the learning effectiveness of the groups (Lewis, 2002). These tests were not used to compare one student to another student. The pre/posttest was a 64-question multiple-choice test. It was originally created from the multiple-choice data bank that accompanied an older version of the current textbook, Bittenger's Pre-algebra from the Mathematics Series for Higher Education from Pearson/Addison Wesley (Pearson, 2008). Lewis (2002) stated that it could be inferred that such questions had general acceptance by expert teachers of the subject as a valid instrument by which to measure the learning of the course material, based on a history of successful use. This pretest was used as a departmental exam at a community college. The exam had been administered since 2003 as a pre-algebra review in over 70 of the college's pre-algebra sections. This exam indicated the level of students' prior knowledge of algebra concepts. It also identified students who were misplaced at the pre-algebra level. Students who scored above 75% on this pretest were immediately placed in a higher level class, so there was incentive for a student to do their best. This exam was selected as the pre/posttest because it had been administered over a five-year period. The pretest was used to verify the extent to which the control group and the experimental group were similar in their entry-level knowledge (Lewis, 2002).


All students in all pre-algebra sections took a pretest during the first week of class. The pretest was made up of eight questions from each of the required eight units of study. Students were discouraged from guessing by having a choice E (Unable to Solve) in the multiple-choice section. The students had 75 minutes to complete the exam. They were allowed to use a calculator. The answer sheets were machine scored. The students received the score on the test but were not allowed to view or keep the test. The pretest was listed in Appendix C. Students who scored over 75% on this test were moved to a higher level of developmental math; they were not part of this study. The posttest was given at the end of the course. It contained the same 64 questions as the pretest. The exam was machine scored. Students were allowed to use a calculator during the exam. The posttest was used to determine to what extent the student understood the math content at the end of the course. This test measured end-of-course achievement. It is important to note that the statistical comparisons of different groups undergoing different treatments were based on the one version of the posttest, which was taken by each participant (Lewis, 2002).

Quizzes

At the end of each unit, a unit quiz was given. These quizzes were given at the end of a class session, allowing the student a full 20 minutes to complete the quiz. The quiz consisted of eight multiple-choice questions from the publisher's databank. These questions tested the same concepts which were on the pre/posttest.


The eight unit objectives were aligned with the course objectives and the textbook sequence. These included:

1. Students will be able to solve exercises and word problems using addition, subtraction, multiplication and division with integers and whole numbers.

2. Students will be able to use order of operations and solve simple algebra expressions.

3. Students will be able to solve exercises and word problems using addition, subtraction, multiplication and division with fractions.

4. Students will be able to find the least common multiple and solve exercises and word problems using mixed numbers.

5. Students will be able to solve exercises and word problems using addition, subtraction, multiplication and division with decimals.

6. Students will be able to solve exercises and word problems, such as rates, unit prices and geometric proportions, using ratios and proportions.

7. Students will be able to solve exercises and word problems, including interest and sales tax, using percent notation.

8. Students will be able to solve exercises and word problems involving geometric shapes which include area, perimeter, and right triangle applications.


Data Collection

The method chosen for this study was a pretest/posttest design described by Krathwohl (1993) as a nonequivalent group research design. Following this design, the researcher collected the quantitative data from the pre-algebra course. These included:

1. Pretest scores.

2. Quiz scores.

3. Posttest scores.

4. Posttest scores divided into relevant chapter scores.

At the end of the term, the instructor transmitted the data to the researcher, with no student identifiers attached, in the form of an Excel spreadsheet. The researcher had no direct contact or interaction with the subjects during the study.

Data Analysis Procedures

All quantitative data was analyzed using SPSS, Version 15.0 for Windows. Descriptive statistics were used for central tendencies on the pretest, posttest and retention data. Since the research question calls for an examination of the possibility of an impact, pretest, quiz and posttest results of participating groups were compared within and between groups. The quizzes and posttest data were analyzed by using a repeated-measures analysis of variance (ANOVA) and t tests. One important reason for using ANOVA methods rather than just multiple two-group studies analyzed via t tests was that the former method is more efficient, and with fewer observations more information can be gained (Krathwohl, 1993).
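Although the analysis was run in SPSS, the same comparisons can be illustrated outside that package. The following minimal sketch uses Python's scipy library, with placeholder arrays standing in for each group's percentage scores; it illustrates the planned tests, not the study's actual code or data.

    # Minimal sketch of the planned comparisons using scipy instead of SPSS.
    # The score arrays below are illustrative placeholders, not study data.
    import numpy as np
    from scipy import stats

    experimental_post = np.array([75.0, 81.3, 68.8, 90.6, 79.7])
    control_post = np.array([71.9, 62.5, 78.1, 59.4, 68.8])

    # Between-groups comparison on the posttest: with two groups, a one-way
    # ANOVA is equivalent to an independent samples t test (F = t squared).
    f_stat, p_value = stats.f_oneway(experimental_post, control_post)
    print(f"Between groups: F = {f_stat:.3f}, p = {p_value:.3f}")

    # Within-group change from pretest to posttest: a paired t test on one
    # group's scores addresses whether that group changed significantly.
    experimental_pre = np.array([39.1, 43.8, 34.4, 51.6, 40.6])
    t_stat, p_value = stats.ttest_rel(experimental_pre, experimental_post)
    print(f"Within group: t = {t_stat:.3f}, p = {p_value:.3f}")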

Each quiz represented one unit in the course. There were eight questions on the final exam which related to each unit of the course. Eight units with eight questions each provided the 64 questions for the final exam. The performance on each quiz was related to the performance on the corresponding section of the posttest. Statistical significance, based on an alpha level of p ≤ .05, was used in order to determine:

1. Whether or not the groups changed significantly from the pretest to the posttest (within groups).

2. Whether or not there was an overall significant difference in scores between the experimental and control groups (between groups) on both the quizzes and the posttest.

3. Whether or not there was a significant difference between the two groups with regard to the amount of change in scores from the pretest to the posttest (group by time interaction).

4. Whether or not there was a significant difference between the two groups with regard to the amount of change in scores from the quizzes to the posttest (group by time interaction).

The data will be displayed in the data analysis section of chapter 4.

Threats to Validity

The study was designed with internal and external validity safeguards. Threats to internal validity may occur due to the participant experience or experimental treatment procedures (Creswell, 2005). The two main threats to this design are (a) differential selection, because the groups might differ initially on an important characteristic, and (b)
experimental mortality (Krathwohl, 1993). Since the students in this study scored similar grades on the math placement exam, the differential selection threat should be minimized. To minimize the second threat, students who withdrew from the course were eliminated from the achievement study. In many colleges, up to 30% of the students fail to complete this developmental course (Achieving the Dream Data, 2008). In addition, Creswell (2005) warned that resentful demoralization was a threat to internal validity which related to the treatment used in a study. To remedy this, all instructors presented the same lessons with the same information and the same examples. To eliminate the testing threat, different items were used on the pre/posttest and the chapter quizzes. Lewis (2002) noted that the use of different tests for the quizzes and the posttest was desirable to eliminate the possibility of remembering or copying the test. In addition, the pretest and posttest were separated by thirteen weeks.

Content Validity

The Addison Wesley questions and textbook were aligned with the scope and sequence of the college's math department curriculum. Content validity was established by matching the unit objectives to the test items (see Table 1). The tests were appropriate because they followed the lesson objectives and format of the textbook and were based on the skills necessary to be successful in the next mathematics course.


Table 1. Verification of Content Validity
________________________________________________________________________
Unit Objectives (students will be able to solve          Pre- and Posttest
exercises and word problems using...)                    Items
________________________________________________________________________
Addition, subtraction, multiplication and division
with integers and whole numbers                          Items 1-8

Order of operations and simple algebra expressions       Items 9-16

Addition, subtraction, multiplication and division
with fractions                                           Items 17-24

Least common multiple and mixed numbers                  Items 25-32

Addition, subtraction, multiplication and division
with decimals                                            Items 33-40

Ratio and proportions                                    Items 41-48

Percent notation                                         Items 49-56

Area, perimeter, and right triangle applications         Items 57-64
________________________________________________________________________

External Validity

Threats to external validity are problems which threaten the researcher's ability to make generalizations and draw accurate inferences from the sample data to other persons or settings (Creswell, 2005). To minimize the interaction of selection and treatment threat, the researcher took the following measures:

1. All students were treated the same during the study with respect to homework assignments, quizzes, tests, grading, and course expectations.

2. The study was a blind study to eliminate the Hawthorne effect. At no time were the students informed that the posttest grade was a part of a study.

3. The author did not have access to the students in the course. This eliminated any possibility of researcher bias influencing the participants' grades or perceptions of the strategy implemented.

Ethical Considerations

In order to perform this research at the community college, the researcher received permission from the institution. The institution did not have an IRB, but relied on the researcher's academic institution to provide an approved IRB. Prior to the start of the study, the researcher submitted the proposal and IRB application to Capella University for approval to conduct the study. Because of the type of study, the IRB application received an expedited review. After the review was received from Capella University, and approved by the institution used in the study, the researcher spoke to each section of students.

The Informed Consent

The researcher prepared a presentation for each section on the first day of class. During that class, the students were told that their class had been selected to be part of a college-wide study to determine the effectiveness of various instructional strategies for students in developmental math courses. A copy of the letter was read to the class. Letters of informed consent were distributed to the class. Any questions were answered by the researcher. If the student agreed to participate, the student signed and handed in the informed consent letter. At that time the student was given a duplicate copy of the letter
for his/her files. Students who refused to sign the informed consent were not included in the study. The instructor kept a list of all students who had agreed to participate in the study. If a student later changed his/her mind and decided not to participate, the student's name would be removed from the teacher's list. In no way was a student's grade influenced by participation in this study. All sections took a pretest at the beginning of the course. The pretest provided a baseline measurement of a series of skills prior to the treatment. The class instructor forwarded only those scored pretests of the students who wanted to participate in the study to the researcher. Participants' data was kept confidential at all times. The names of the school, teachers and students were withheld from the study. Participants used an identification number on the research instruments. Instruments were kept securely in the researcher's locked file cabinet. The final step in assuring an ethical study was to perform the study in an ethical manner that did not harm the participants in the study.

Summary

Chapter 3 described the research design for the study to determine the effect of an explicit questioning strategy using clickers on achievement in a developmental math class at a community college. The design was firmly grounded in the instructional design framework advocated by Reigeluth (1983). The quasi-experimental study, in the form of a nonequivalent control-group design, was based on the methodology of Krathwohl (1993) and Creswell (2003, 2005). The data collected included a pre/posttest and eight quizzes. Additional data was supplied by the college on the students' gender,
ethnic background, and full or part-time status. Chapter 4 presents the data collected in this study. Chapter 5 discusses the findings and makes recommendations based on the analysis.


CHAPTER 4. DATA COLLECTION AND ANALYSIS

Introduction

The purpose of this chapter is to present the statistical results of the data analysis. This study assessed the achievement of students enrolled in a remedial-level pre-algebra course at a community college. One hundred twenty students began the course in four sections. One hundred thirteen students finished the course. Seven students either dropped out of the course or took an incomplete in the course. In this chapter, the experimental group is referred to as Group 1 and the control group is identified as Group 2. Chapter 4 is divided into the following sections: The first section provides a description of the demographic data collected in the study. The second section of the chapter provides the descriptive statistics of the data. The third section presents all of the data analysis results. The fourth section presents the themes that have emerged from the analysis. The quiz and test data was collected from computer-scored answer sheets. The course instructors rechecked the scoring and no errors were found. The percentage of correct responses for the pretest, eight quizzes and posttest was determined. In addition, the posttest was subdivided into eight units and an individual unit score for each of the eight units was also recorded. The data file for this analysis consisted of the eighteen entries for the scored data and demographic data of gender, age, full-time/part-time status, and ethnic group.

Demographics

The students enrolled in the study represented a diverse group in terms of age, gender and ethnicity. As shown in Table 2, the students in this study were predominantly full-time, female students aged 21 and under. Caucasians represented the largest ethnic group, with Hispanics and African Americans approximately equally distributed. In some areas, such as gender and full-time/part-time status, these demographics reflected the demographics of the student population at the college. In the areas of age distribution and ethnic background, the data departed from the college population. It is important to note that developmental students often have different demographic characteristics from the regular college population, with higher percentages of minorities populating these classes (ATD, 2007).

Table 2. Demographics of Study Participants and All Enrolled Students
___________________________________________________________
Variable                Percentage of      Percentage of
                        Participants       Enrolled Students
___________________________________________________________
Male                        33.6                38.5
Female                      66.4                61.5
Age < 22                    61.9                49.1
Age 22-35                   10.6                20.3
Full-Time Student           59.3                59.0
Part-Time Student           40.7                41.0
Caucasian                   44.5                71.3
African American            23.9                 8.8
Hispanic                    24.8                11.7
Other or Unknown             8.8                 8.2
___________________________________________________________


Descriptive Statistics

The pretest scores were low for both the experimental group, Group 1, and the control group, Group 2. The minimum score on the pretest was 12.50 and the maximum score was 73.44 out of 100 points. The summary of the data is presented in Table 3 below. For both groups, measures of central tendency were comparable. This indicated that the two groups were equivalent in terms of mathematical ability at the start of the study.

Table 3. Pretest/Posttest Summary Data: Experimental (n = 58) and Control (n = 55) Groups
________________________________________________________________________
                            Pretest                    Posttest
                     Experimental  Control      Experimental  Control
________________________________________________________________________
Mean                     39.0        38.3           74.3        67.2
Median                   38.5        38.4           75.0        71.8
Range                37.5-73.44   37.5-68.75    43.75-90.63  37.5-90.63
Standard Deviation       14.5        14.4           11.8        16.9
________________________________________________________________________

The pretest mean for the experimental group, Group 1, was 39.0 and the mean for the control group, Group 2, was 38.3. Student grades improved from the pretest to the posttest. The minimum score on the posttest was 34 and the maximum score was 95.3 out of 100. Group 1's mean posttest score of 74.3 was higher than Group 2's mean posttest score of 67.2. The median grade for Group 1 was 75 and the median grade for Group 2
was 71.8. The standard deviation was 11.8 for Group 1, compared to 16.9 for Group 2. The smaller standard deviation and the higher mean in the posttest scores of Group 1 indicated a more consistent improvement for Group 1 as a whole when compared to Group 2. Figure 3 below illustrates that difference.


Figure 3. A comparison of pretest and posttest mean scores for group 1 and group 2.

Throughout the semester, eight quizzes were administered. These quizzes were each based on a specific unit in the course. The data for the quizzes is presented in Table 4. Further analysis was necessary to determine if the differences were significant. The results of that testing will be presented in the next section of this chapter.


Table 4. Descriptive Data for Quizzes
________________________________________________________________________
Quiz     Group           n      Mean    Std. Deviation   Std. Error Mean
________________________________________________________________________
Quiz 1   Experimental    58     82.7       17.39180          2.28366
         Control         55     87.3       12.03420          1.62269
Quiz 2   Experimental    58     77.8       22.83217          2.99801
         Control         55     83.5       14.63727          1.97369
Quiz 3   Experimental    58     73.9       19.48972          2.55913
         Control         55     76.5       16.60584          2.23913
Quiz 4   Experimental    58     65.7       28.05465          3.68376
         Control         55     84.8       17.44432          2.35219
Quiz 5   Experimental    58     77.2       26.32539          3.45669
         Control         55     78.3       20.07314          2.70666
Quiz 6   Experimental    58     52.8       33.95657          4.45872
         Control         55     72.0       19.41083          2.61736
Quiz 7   Experimental    58     73.7       19.26410          2.52950
         Control         55     76.9       16.80804          2.26640
Quiz 8   Experimental    58     67.0       25.29699          3.32166
         Control         55     86.3       14.28722          1.92649
________________________________________________________________________

The mean score on every quiz was higher for Group 2 than for Group 1, as illustrated in Figure 4 below. This was an unanticipated result and will be discussed in the next section of the chapter.


Figure 4. Mean quiz scores for eight semester quizzes. For the experimental group, n = 58; for the control group, n = 55. The control group scored higher than the experimental group on each quiz.

At the end of the course, the posttest was administered. The posttest was subdivided into eight sections, with each section of eight questions testing one of the units in the course. For example, questions 1-8 on the posttest were related to the first unit and first quiz; questions 9-16 were related to the second quiz and second unit, and so on. These sections were identified by the letters PTQ, for posttest quiz. Based on the mean of these posttest unit scores, it was apparent that the experimental group scored higher on seven of the eight posttest sections, as shown from the data presented in Table 5 below. Further analysis was necessary to determine if the differences were significant. The results of that testing will be presented in the next section.
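The subdivision itself is mechanical: each block of eight consecutive items contributes one unit subscore. A short sketch of that mapping follows, with a made-up response vector; it illustrates the scoring scheme described above rather than the actual scoring software.

    # Sketch of the posttest subdivision: items 1-8 yield PTQ1, items 9-16
    # yield PTQ2, and so on through PTQ8. The response vector is made up.
    def unit_scores(item_correct):
        assert len(item_correct) == 64, "the posttest has 64 items"
        scores = []
        for unit in range(8):
            block = item_correct[unit * 8:(unit + 1) * 8]
            scores.append(100.0 * sum(block) / 8)  # percent correct for this PTQ
        return scores

    # Illustrative example: first eight items correct, alternating afterwards.
    responses = [True] * 8 + [i % 2 == 0 for i in range(56)]
    print(unit_scores(responses))  # [100.0, 50.0, 50.0, ...]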


Table 5. Descriptive Data for Posttest (PT) Quizzes
________________________________________________________________________
Quiz     Group           n      Mean    Std. Deviation   Std. Error Mean
________________________________________________________________________
PTQ1     Experimental    58     89.2       11.57742          1.52019
         Control         55     79.1       21.65792          2.92035
PTQ2     Experimental    58     87.3       10.85475          1.42530
         Control         55     73.9       20.30906          2.73847
PTQ3     Experimental    58     81.5       18.32357          2.40600
         Control         55     66.4       31.26136          4.21528
PTQ4     Experimental    58     74.4       28.52577          3.74562
         Control         55     69.3       23.42469          3.15859
PTQ5     Experimental    58     84.7       12.39940          1.62812
         Control         55     51.1       26.81736          3.61605
PTQ6     Experimental    58     63.1       20.86696          2.73997
         Control         55     51.8       28.60756          3.85744
PTQ7     Experimental    58     54.1       21.63644          2.84100
         Control         55     54.3       31.84327          4.29375
PTQ8     Experimental    58     58.2       21.53024          2.82706
         Control         53     49.3       28.26839          3.92012
________________________________________________________________________

The results of the posttest sub-section quizzes presented an interesting contrast to the regular semester quizzes. A graphic which summarizes the result is included as Figure 5 below.


Figure 5. Mean posttest scores for eight units. For the experimental group, n = 58; for the control group, n = 55. The experimental group scored higher than the control group on seven of the eight quizzes. Although the mean quiz scores remained relatively constant during the semester, the posttest quiz scores showed a definite downward trend across units. This could have indicated that the material increased in difficulty throughout the semester, but no measure of difficulty was included in this study.

Inferential Statistics and Hypothesis Testing

The research question addressed in this study was: To what extent does the use of clickers as an instructional strategy impact students’ level of achievement of pre-algebra skills?


The null hypothesis was: H0: There will be no difference between the academic achievement of students who use clickers as part of an instructional strategy and that of the control group. To interpret the data, a number of statistical procedures were performed. From the descriptive statistics, it was determined that students scored higher on the posttest than the pretest. The data indicated that both groups of students generally demonstrated achievement in the course, regardless of whether the explicit questioning strategy using clickers was included in the instruction. It also appeared that the group which received the explicit questioning instruction strategy using clickers showed a higher degree of improvement. To determine if the increase in achievement was statistically significant, several one-way ANOVAs were performed on the pre- and posttest data. These tests indicated that, while there was no statistically significant difference between the groups on the pretest (Table 6, p = 0.736), there was a significant difference between the groups on the posttest (Table 6, p < 0.05). With a statistical difference between the groups established, the next step was to examine the subtests which made up the posttest to determine which, if any, of the scores on the individual unit tests were significantly different from each other.


Table 6. Repeated One-Way ANOVA Table for Pretest and Posttest Groups
________________________________________________________________________
                    Sum of Squares    df    Mean Square      F       Sig.
________________________________________________________________________
Pretest
  Between Groups          22.102       1        22.102     0.114    0.736
  Within Groups        21520.014     111       193.874
  Total                21542.116     112
Posttest
  Between Groups        1446.675       1      1446.675     6.845*
  Within Groups        23459.035     111       211.343
  Total                24905.710     112
________________________________________________________________________
*p < 0.05

Quizzes were administered throughout the term. To determine if the observed differences in the semester quiz scores were significant, an independent samples t test was performed. Quizzes 4, 6 and 8 showed significant differences at the p < 0.01 level. The results of this test are included in Table 7 below. In an attempt to determine a reason for the difference in the semester quizzes, the researcher spoke to each of the instructors. One of the control group's instructors deviated from the protocol by devoting an additional hour of instruction to fractions during the fractions unit. This could be a reason for the increased mean on semester Quiz 4 for the control group. No other deviations from the prescribed protocol were noted.
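The pairing of Levene's test with the t test in Table 7 follows the usual two-step procedure: Levene's result determines whether the pooled (equal variances) or Welch (unequal variances) form of the t test is reported. A minimal scipy sketch of that procedure, with placeholder quiz scores, is shown below.

    # Sketch of the per-quiz comparison reported in Table 7: Levene's test
    # selects between the pooled and Welch forms of the independent t test.
    # The score arrays are placeholders, not the study's data.
    import numpy as np
    from scipy import stats

    experimental = np.array([62.5, 75.0, 50.0, 87.5, 37.5])
    control = np.array([75.0, 87.5, 75.0, 100.0, 87.5])

    levene_stat, levene_p = stats.levene(experimental, control)
    equal_var = levene_p > 0.05  # equal variances assumed if Levene is not significant

    t_stat, t_p = stats.ttest_ind(experimental, control, equal_var=equal_var)
    form = "pooled" if equal_var else "Welch"
    print(f"Levene p = {levene_p:.3f}; {form} t = {t_stat:.3f}, p = {t_p:.3f}")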


Table 7. Independent Samples t test between groups for each of the quizzes
________________________________________________________________________
                                Levene's Test        t-test for Equality of Means
                                _____________   ____________________________________________________
                                                            Sig.       Mean    Std. Error   95% CI of the
                                  F      Sig.     t     df  (2-tailed) Diff.    Diff.       Difference
                                                                                            Lower    Upper
________________________________________________________________________
Q1  Equal variances assumed      4.608   .034  -1.615  111    .109     -4.56    2.828      -10.17     1.03
    Equal variances not assumed                -1.630  101.7  .106     -4.56    2.804      -10.12      .99
Q2  Equal variances assumed      5.3     .023  -1.584  111    .116     -5.75    3.629      -12.94     1.44
    Equal variances not assumed                -1.602   97.7  .112     -5.75    3.589      -12.87     1.37
Q3  Equal variances assumed      1.7     .192   -.747  111    .457     -2.55    3.414       -9.31     4.21
    Equal variances not assumed                 -.750  109.7  .455     -2.55    3.400       -9.28     4.18
Q4  Equal variances assumed     17.8     .000  -4.303  111    .000*   -19.03    4.422      -27.79   -10.26
    Equal variances not assumed                -4.354   96.0  .000*   -19.03    4.370      -27.70   -10.35
Q5  Equal variances assumed       .27    .606   -.236  111    .814     -1.04    4.421       -9.80     7.71
    Equal variances not assumed                 -.237  106.1  .813     -1.04    4.390       -9.74     7.66
Q6  Equal variances assumed     19.9     .000  -3.679  111    .000*   -19.28    5.240      -29.66    -8.89
    Equal variances not assumed                -3.729   91.5  .000*   -19.28    5.170      -29.55    -9.01
Q7  Equal variances assumed      2.8     .096   -.939  111    .350     -3.20    3.408       -9.95     3.55
    Equal variances not assumed                 -.943  110.2  .348     -3.20    3.396       -9.93     3.52
Q8  Equal variances assumed     36.4     .000  -4.966  111    .000*   -19.33    3.893      -27.04   -11.61
    Equal variances not assumed                -5.035   90.9  .000*   -19.33    3.839      -26.96   -11.70
________________________________________________________________________
* p < .01

In the previous section of this chapter, Figure 5 illustrated the subunit posttest scores for units 1 through 8. A further analysis of the posttest and the subunits which made up the posttest was warranted. A t test was performed on the posttest unit score data to determine whether these differences were statistically significant. Because Levene's Test for Equality of Variances showed that equal variances could not be assumed, the results below are reported for unequal variances. The results of that test indicated that there was a difference on Posttest quizzes (PTQ) 1, 2, 3, 5, and 6. The complete results of this test can be found in Table 8. The null hypothesis under test was that there would be no difference between the academic achievement of students who used clickers as part of an instructional strategy and the control group. Based on the t test, the data indicated a significant difference in the quiz scores of the experimental group in five of the eight subunits.

Table 8. Independent Samples t test for Posttest Unit Quizzes (equal variances not assumed)
________________________________________________________________________
                               Sig.         Mean        Std. Error   95% CI of the Difference
          t        df       (2-tailed)    Difference    Difference     Lower        Upper
________________________________________________________________________
PTQ 1    3.078    81.557      .003*        10.1332       3.29233       3.58320      16.6832
PTQ 2    4.347    81.550      .000*        13.4208       3.08718       7.27894      19.5627
PTQ 3    3.111    86.245      .003*        15.1018       4.85360       5.45363      24.7501
PTQ 4    1.028   108.81       .306          5.03527      4.89962      -4.67582      14.7463
PTQ 5    8.463    75.186      .000*        33.5619       3.96568      25.6621       41.4616
PTQ 6    2.394    98.485      .019*        11.3283       4.73152       1.93940      20.7173
PTQ 7    -.043    94.477      .965          -.22335      5.14855     -10.4452        9.99854
PTQ 8    1.631   101.40       .106          7.68966      4.71331      -1.65984      17.0391
________________________________________________________________________
* p < .05

The data were tested for normality in preparation for further testing using the Kolmogorov-Smirnov and Shapiro-Wilk tests reported in Table 9 below. For both tests, a p value of less than 0.05 indicates a distribution significantly different from normal. By this criterion, only the pretest followed a normal distribution. Based on these results, the decision was made to analyze the data using the Mann-Whitney U test, which is more appropriate for comparing groups when a normal distribution is not present. The Mann-Whitney U results, shown in Table 10 below, indicated a significant difference on Posttest quizzes (PTQ) 1, 2, 3, 4, 5, and 6 and on the overall posttest. Since the results of the Mann-Whitney test differed from those of the independent samples t test, another test was warranted.

Table 9. Tests of Normality
________________________________________________________________________
                            Kolmogorov-Smirnov             Shapiro-Wilk
           Group          Statistic   df    Sig.      Statistic   df    Sig.
________________________________________________________________________
Pretest    Experimental     .092      58    .200        .976      58    .313
           Control          .113      55    .077        .970      55    .190
Posttest   Experimental     .139      58    .007*       .960      58    .052
           Control          .116      55    .062        .944      55    .003*
Quiz 1     Experimental     .264      58    .000*       .833      58    .000*
           Control          .182      55    .000*       .885      55    .000*
Quiz 2     Experimental     .217      58    .000*       .828      58    .000*
           Control          .153      55    .003*       .898      55    .000*
Quiz 3     Experimental     .188      58    .000*       .915      58    .001*
           Control          .174      55    .000*       .931      55    .004*
Quiz 4     Experimental     .147      58    .003*       .911      58    .000*
           Control          .199      55    .000*       .809      55    .000*
Quiz 5     Experimental     .277      58    .000*       .751      58    .000*
           Control          .162      55    .001*       .888      55    .000*
Quiz 6     Experimental     .178      58    .000*       .863      58    .000*
           Control          .233      55    .000*       .906      55    .000*
Quiz 7     Experimental     .194      58    .000*       .914      58    .001*
           Control          .182      55    .000*       .895      55    .000*
Quiz 8     Experimental     .205      58    .000*       .904      58    .000*
           Control          .259      55    .000*       .823      55    .000*
PTQ 1      Experimental     .268      58    .000*       .793      58    .000*
           Control          .287      55    .000*       .824      55    .000*
PTQ 2      Experimental     .232      58    .000*       .850      58    .000*
           Control          .149      55    .004*       .924      55    .002*
PTQ 3      Experimental     .181      58    .000*       .853      58    .000*
           Control          .191      55    .000*       .877      55    .000*
PTQ 4      Experimental     .298      58    .000*       .803      58    .000*
           Control          .287      55    .000*       .756      55    .000*
PTQ 5      Experimental     .314      58    .000*       .808      58    .000*
           Control          .156      55    .002*       .911      55    .001*
PTQ 6      Experimental     .198      58    .000*       .912      58    .000*
           Control          .110      55    .096        .954      55    .035*
PTQ 7      Experimental     .168      58    .000*       .948      58    .005*
           Control          .197      55    .000*       .857      55    .000*
PTQ 8      Experimental     .166      58    .000*       .958      58    .005*
           Control          .175      55    .000*       .892      55    .000*
________________________________________________________________________
* p < 0.05
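The normality screening above can be reproduced in outline as follows; the data array is hypothetical, scipy's shapiro gives the Shapiro-Wilk test, and the Lilliefors-corrected Kolmogorov-Smirnov test (the variant SPSS reports) is assumed to be available from statsmodels.

```python
# Sketch of the normality screening reported in Table 9, on hypothetical data.
import numpy as np
from scipy import stats
from statsmodels.stats.diagnostic import lilliefors

rng = np.random.default_rng(2)
posttest_scores = rng.normal(70, 15, 58).round()  # hypothetical posttest scores

sw_stat, sw_p = stats.shapiro(posttest_scores)
ks_stat, ks_p = lilliefors(posttest_scores, dist="norm")
print(f"Shapiro-Wilk p = {sw_p:.3f}, Kolmogorov-Smirnov (Lilliefors) p = {ks_p:.3f}")
# p < .05 on either test would indicate departure from normality,
# motivating the nonparametric tests used below.
```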

Table 10. Mann-Whitney U Test Statistics. Variable: Group
________________________________________________________________________
                  PTQ1     PTQ2    PTQ3    PTQ4    PTQ5   PTQ6    PTQ7    PTQ8    Posttest
Mann-Whitney U   1177.50   976.0  1217.5  1192.0   357.5  1191.5  1479.5  1416.0  1242.0
Asymp. Sig.
(2-tailed)        .011*    .000*   .027*   .017*   .000*   .019*   .502    .297    .043*
________________________________________________________________________
*p < 0.05
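A minimal sketch of the Mann-Whitney U comparison reported in Table 10, using hypothetical posttest unit quiz scores:

```python
# Sketch of the Mann-Whitney U comparison in Table 10, on hypothetical scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
exp_ptq = rng.normal(75, 18, 58)  # hypothetical posttest unit quiz, experimental
ctl_ptq = rng.normal(62, 20, 55)  # hypothetical posttest unit quiz, control

u_stat, p_value = stats.mannwhitneyu(exp_ptq, ctl_ptq, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
```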

To test the significance of PTQ4, and to verify the significance of the other posttest unit quizzes, the Kruskal-Wallis test was run. The Kruskal-Wallis test is a nonparametric test that can be used where normality assumptions do not apply. The results of that test are included below in Table 11. Based on the Kruskal-Wallis analysis, posttest quizzes 1, 2, 3, 4, 5, and 6 had p < .05. This test is important because it confirms that PTQ4 belongs in the group of posttest quizzes on which the experimental group did significantly better than the control group.

Table 11. Kruskal-Wallis Test Results
________________________________________________________________________
              PTQ1    PTQ2    PTQ3    PTQ4    PTQ5    PTQ6    PTQ7   PTQ8
Asymp. Sig.   0.001*  0.000*  0.027*  0.017*  0.000*  0.019*  0.502  0.297
________________________________________________________________________
*p < 0.05
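The Kruskal-Wallis check can be sketched the same way; with only two groups it addresses the same hypothesis as the Mann-Whitney U. The arrays below are hypothetical.

```python
# Sketch of the Kruskal-Wallis check described above, on hypothetical data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
exp_ptq4 = rng.normal(68, 22, 58)  # hypothetical PTQ4 scores, experimental
ctl_ptq4 = rng.normal(60, 24, 55)  # hypothetical PTQ4 scores, control

h_stat, p_value = stats.kruskal(exp_ptq4, ctl_ptq4)
print(f"H = {h_stat:.3f}, p = {p_value:.3f}")
```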

Based on the results of the three analyses, the experimental group was shown to have scored significantly higher than the control group on Posttest quizzes 1, 2, 3, 4, 5, and 6. There was no significant difference between the experimental and control groups on posttest quizzes 7 or 8. The null hypothesis stated that there would be no difference between the academic achievement of students who use clickers as part of an instructional strategy and the control group. The data showed that there was a significant difference between the groups for six of the eight units. The null hypothesis was therefore rejected for subunit tests 1, 2, 3, 4, 5, and 6. Since the data showed no significance for posttest unit quizzes 7 and 8, the null hypothesis was not rejected for those two units. In an effort to further explore the significance of this finding as it applied to other factors, an ANOVA was used to examine the posttest results by group, gender, full-time/part-time status, and ethnic background. The groups were identified as experimental and control. Based on Table 12 below, the only significant factor found in this study was whether the students were in the experimental or control group. Gender, part-time/full-time status, and ethnic background produced no statistically significant differences in this study.


Table 12. Comparing group, gender, ethnic, and full-time/part-time status
_______________________________________________________________________
                         Sum of Squares    df    Mean Square     F      Sig.
_______________________________________________________________________
Group    Between Groups      19.947        69       .289       1.501    .047*
         Within Groups        8.283        43       .193
         Total               28.230       112
Gender   Between Groups      15.505        69       .225        .994    .517
         Within Groups        9.717        43       .226
         Total               25.221       112
FT/PT    Between Groups      15.241        69       .221        .789    .812
         Within Groups       12.033        43       .280
         Total               27.214       112
Ethnic   Between Groups     164.921        69      2.390       1.422    .109
         Within Groups       72.300        43      1.681
         Total                            112
_______________________________________________________________________
* p < 0.05
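The factor screening in Table 12 amounts to a one-way ANOVA of posttest score against each candidate factor in turn. The sketch below shows that pattern on a small hypothetical data frame; the column names and codings are illustrative, not the study's.

```python
# Sketch of screening several factors at once, as in Table 12: a one-way ANOVA
# of posttest score against each factor. All values here are hypothetical.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "posttest": [72, 65, 80, 58, 77, 62, 69, 74],
    "group":  ["E", "C", "E", "C", "E", "C", "E", "C"],
    "gender": ["F", "M", "F", "F", "M", "M", "F", "M"],
    "status": ["FT", "PT", "FT", "FT", "PT", "FT", "PT", "FT"],
})

for factor in ["group", "gender", "status"]:
    samples = [g["posttest"].values for _, g in df.groupby(factor)]
    f_stat, p_value = stats.f_oneway(*samples)
    print(f"{factor}: F = {f_stat:.3f}, p = {p_value:.3f}")
```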

Finally, a repeated measures General Linear Model (RM GLM) was performed to determine if there were any significant within-subject effects when the measure was "Group." Based on the results of those tests, listed in Table 13 below, there was no significant difference within the group. This additional test was added to confirm the results presented in Table 12.

Table 13. Tests of Within Subjects Effects Using the Measure Groups
_______________________________________________________________________
Source                       Sum of Squares      df       Mean Square      F      Sig.
_______________________________________________________________________
Post quizzes
  Sphericity Assumed           108700.304        7         15528.615    51.404   .000
  Greenhouse-Geisser           108700.304        6.047     17976.136    51.404   .000
  Huynh-Feldt                  108700.304        6.430     16904.584    51.404   .000
  Lower-bound                  108700.304        1.000    108700.304    51.404   .000
Error (Post quizzes)
  Sphericity Assumed           236840.321      784           302.092
  Greenhouse-Geisser           236840.321      677.255       349.706
  Huynh-Feldt                  236840.321      720.185       328.860
  Lower-bound                  236840.321      112.000      2114.646
_______________________________________________________________________
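A repeated-measures analysis of this kind can be sketched with statsmodels' AnovaRM, assuming long-format data with one row per student per posttest unit quiz; the data generated below are hypothetical.

```python
# Sketch of a repeated-measures analysis, as in Table 13, on hypothetical
# long-format data (one row per student per unit quiz).
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(5)
students, quizzes = 20, 8
long = pd.DataFrame({
    "student": np.repeat(np.arange(students), quizzes),
    "quiz": np.tile(np.arange(1, quizzes + 1), students),
    "score": rng.normal(70, 12, students * quizzes),
})

# Within-subjects factor: the eight posttest unit quizzes.
result = AnovaRM(long, depvar="score", subject="student", within=["quiz"]).fit()
print(result)
```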

Impact on Weak Students

A closer look at the data suggested that the weakest students made the most significant gains in the course. To explore the significance for the weaker students further, a histogram was created based on students' pretest scores and the percentage receiving 70 or higher on the posttest. The value of 70 was selected because it usually represents a passing grade of C. The data for the histogram in Figure 6 are recorded in Table 14 below. These data indicated that a student who began the course with a pretest score of 20% or higher was more likely to pass the course as a member of the experimental class than as a member of the control class.

Table 14. Percent Passing Posttest Compared to Pretest Grade
_____________________________________________________________
             Number          Number      Percent        Percent
Range        Experimental    Control     Experimental   Control
_____________________________________________________________
10-19              6             4           17              0
20-29             10            14           70             28
30-39             16            11           75             40
40-49             16            11           88             45
50-59              8            10           86             86
60-69              2             5          100            100
Total             58            55
_____________________________________________________________

Figure 6. Percent of students passing posttest versus pretest score
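The tabulation behind Table 14 and Figure 6 (bin students by pretest score, then compute the share of each bin scoring 70 or higher on the posttest) can be sketched as follows; the score arrays are hypothetical.

```python
# Sketch of the pass-rate tabulation behind Table 14 and Figure 6,
# using hypothetical pretest and posttest scores.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
df = pd.DataFrame({
    "pretest": rng.integers(10, 70, 113),
    "posttest": rng.integers(30, 100, 113),
    "group": ["E"] * 58 + ["C"] * 55,
})

bins = [10, 20, 30, 40, 50, 60, 70]
labels = ["10-19", "20-29", "30-39", "40-49", "50-59", "60-69"]
df["range"] = pd.cut(df["pretest"], bins=bins, labels=labels, right=False)
df["passed"] = df["posttest"] >= 70  # 70 taken as the passing grade of C

summary = df.groupby(["range", "group"], observed=True)["passed"].agg(["size", "mean"])
summary["percent"] = (summary["mean"] * 100).round()
print(summary[["size", "percent"]])
```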

This histogram illustrates that 70 percent of the students who had a pretest score between 20 and 29 passed the course as members of the experimental class, while only 35% of the students from the control group who had similar scores on the pretest passed the posttest. This trend continued for students who received less than 50 on the pretest. In contrast, students who began with a pretest score of 50 or higher performed equally well in either the control or the experimental class. The numbers used in this histogram are small, so further research is suggested to determine the significance of this trend.

Summary

The study assessed the achievement of students enrolled in a remedial-level Pre-Algebra course at a community college. The first consideration was the descriptive statistics for the data. Although the pretest was shown to have normality, equivalence of variance, and independence, much of the other data did not meet those assumptions. Because of this, a number of analyses were performed to examine the reliability of the results. The Kolmogorov-Smirnov and Shapiro-Wilk analyses indicated that the condition of normality was not met for the posttest or the posttest unit quizzes. Levene's test of equal variances indicated that equal variances were present only for the pretest. The post-quiz data were analyzed with three tests: the t test, the Kruskal-Wallis test, and the Mann-Whitney U. Based on the analysis of the three tests, post unit quizzes 1, 2, 3, 4, 5, and 6 were determined to be significantly higher for the experimental group than for the control group. The data suggested a closer look at the individual achievement of students. A histogram (Figure 6) was created based on students' pretest scores and the percentage of students receiving 70 or higher on the posttest. This histogram illustrated that a student who scored below 50 on the pretest had a better chance of passing in the experimental classroom than in the control classroom, but the numbers were low, suggesting additional research. The explicit questioning strategy using clickers resulted in a statistically significant increase in student achievement in the beginning algebra course. The null hypothesis was rejected for six of the eight units and the posttest. Chapter 5 will discuss the implications of using a strategy of explicit instruction with clickers based on the results of this study.


CHAPTER 5: RESULTS, CONCLUSIONS, AND RECOMMENDATIONS

Restatement of the Problem

In the previous chapters, the significance of the study was discussed by drawing special attention to the large numbers of community college students enrolled in developmental mathematics courses. Data from recent college research indicated that more than 90% of students who took the math placement tests at the community college required at least one level of developmental math (Achieving the Dream Proposal, 2007). Students' low achievement rate was identified as a common problem in community colleges (National Center for Education Statistics, 2004). The literature suggested that interactions among students and/or faculty and engagement with content are important components of courses for developmental students (Halpin, 1990; Tinto, 1993, 2005). Studies showed that the personal response system, commonly called a clicker, was effective in increasing both interaction and engagement in the classroom (Draper & Brown, 2004; Wood, 2004; Campbell, 2007; Li, 2007). However, existing literature on the use of personal response systems in developmental math courses revealed little about the effects that engagement with clickers had on achievement (Li, 2007). The problem addressed in this study was to determine whether the use of an explicit questioning strategy with clickers could improve academic achievement in a mathematics course. A review of the literature affirmed the importance of designing courses which incorporate strategies to help students master the fundamentals. The literature guided the course design and the implementation of technology.

Review of Literature

Literature from the fields of special education, educational psychology, and instructional design was examined in order to design a program which could be used by developmental students to increase student achievement in developmental mathematics. From the literature in special education, it was evident that students who had difficulties in a content area benefitted from explicit instruction. Explicit instruction is most often selected when students are required to master a broad spectrum of skills or to be evaluated by standardized tests (Hirsh, 1996). Rachal, Daigle, and Rachal (2007) determined that college students needed explicit instruction in learning strategies to be successful in classes. Regardless of their academic classification, most students at every level required explicit instruction in learning strategies for reading, writing, and mathematics. As noted in Sheffield and Cruikshank (2005), the National Research Council observed that explicit instruction often acts as the foundation for students who have difficulties in mathematics. The five principles of explicit instruction are big ideas, conspicuous strategies, guided practice with scaffolding, strategic integration, and judicious review. These five principles became the foundation for each lesson in the pre-algebra course. From the areas of educational psychology and instructional design came the components of active engagement and formative assessment. The National Research Council's Committee on Increasing High School Engagement and Motivation to Learn (2004) indicated that active engagement was required for learning and success in school. Engagement consists of behaviors, emotions, social connections, and cognitive strategies. Mayer (2008) described active cognitive engagement in greater detail.

During cognitive engagement, the student not only processes the information and remembers the strategies but also develops beliefs about himself or herself as a learner. Mayer demonstrated that achievement directly depended on the learner's cognitive activity. Based on this research, polling devices, commonly called clickers, were included as part of the strategy to increase student engagement. The research on formative assessment by Clark (2005) and Stiggins (2006) provided convincing evidence of the value of such assessment for students with difficulties in a content area. Reigeluth and Carr-Chellman (2009) suggested that technology can provide the formative assessments and feedback necessary for the information-age paradigm of education. When technology is available to record the student response and simultaneously give the student instant feedback, learning is enhanced. Emberger (2002) indicated that to be effective, feedback should be corrective in nature, timely, and specific. With polling devices, formative assessment occurs with every student response. Polling devices by Turning Point were selected as the technology to implement this study. The majority of research on polling devices focused on students' perceptions of the system and how enjoyable and helpful they found it. There had been little empirical research on the student achievement associated with audience response systems (Mayer, 2008; West, 2005). One study which did relate the use of clickers to achievement in the classroom came from the physics department of Ohio State University (Li, 2007). That study used a form of rapid-fire questioning at the beginning of each class and a series of conceptual questions later in the lesson. The Li study served as a model upon which to build this study, but significant modifications were necessary.

Unlike the original work, which was done with academically well-prepared physics students at a four-year institution, this study aimed for a strategy to address the needs of developmental mathematics students at a community college. Two major differences distinguished this mathematics study from Li's physics study. First, the rapid-fire questioning technique was replaced by review questions. These review questions focused on the key understandings of the previous class and were individually presented, polled, and discussed, sometimes in great detail. Second, even though Li indicated that students did not want to use the clickers for extended periods of time during the class, the current study used the clickers for every class, all period long. Students never complained, and attendance remained high for every class throughout the semester.

Review of Methodology

This study was conducted to respond to the challenge of teaching developmental mathematics at the community college level and would be defined as design and development research according to Richey and Klein (2007). The study was based on a non-equivalent group quasi-experimental design. It incorporated Krathwohl's (2004) A-B-A-B variations between the pretest and posttest, similar to a method used by Lewis (2002). This design allowed the researcher to collect data for the groups over time. "A" represented the clicker strategy, "B" represented a quiz, and "O" represented a pre- or posttest. The sequence for the experimental group was O-A-B-A-B-A-B-A-B-A-B-A-B-A-B-A-B-O, where the first O represented the pretest and the last O represented the posttest: one pretest, eight instructional experiences using the questioning clicker strategy, eight quizzes, and one posttest. The sequence for the control group was O-B-B-B-B-B-B-B-B-O: one pretest, eight instructional experiences without the clicker strategy, eight quizzes, and one posttest. A sketch of these sequences appears below.
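As a sketch under the notation above, the two session sequences can be generated mechanically; the helper function below is illustrative only, not part of the study's procedures.

```python
# Sketch of the O-A-B design sequences described above, where "O" is a pre/post
# test, "A" the clicker questioning session, and "B" a unit quiz.
def build_sequence(units: int, with_clickers: bool) -> str:
    """Return the session sequence for one group over the semester."""
    session = "A-B" if with_clickers else "B"
    return "-".join(["O"] + [session] * units + ["O"])

print(build_sequence(8, with_clickers=True))   # experimental: O-A-B-A-B-...-O
print(build_sequence(8, with_clickers=False))  # control: O-B-B-...-O
```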

This study included participants from four community college developmental mathematics sections in a program using an explicit questioning strategy with clickers. The clickers were included to keep the students engaged, guide the direction of instruction, and provide immediate feedback on student achievement to both the students and the instructor. The program was designed to cover all of the material required in the traditional one-semester developmental mathematics course of pre-algebra.

Control and Experimental Groups

All of the instructors were given the same syllabus, the same examples to use in class, and the same homework problems. The only difference in instruction was that the experimental class used an explicit questioning strategy with clickers during the class while the control class responded to instructor questions in the traditional way. Both groups completed the same amount of material and took the same quizzes and tests. There was no significant difference in the pretest scores of the two groups. However, this study revealed that the experimental group performed significantly better on the posttest than the control group. When factors like gender, ethnic background, and full-time or part-time status were analyzed, there was no significant difference between the groups.

Research Question

There was one research question for this study: "To what extent did the use of clickers as an instructional strategy impact students' level of achievement of pre-algebra skills?"

skills?” At the start of the study, there was no apparent academic difference between the groups based on the pre-test. Each class was exposed to the same lessons, examples and homework. The only difference was that the experimental class incorporated an explicit questioning strategy using clickers during every class. After each unit, there was a unit quiz. There were eight quizzes during the semester for the eight units of the course. Based on the data analyses, the control group outscored the experimental group significantly in three of the eight quizzes. At the end of the semester a post-test was given to each group. In addition to an average posttest score, the post-test was divided into eight scored sections. These question groups were identified as posttest scores for the eight units of the course. Based on the results of the data analyses, the experimental group scored significantly higher than the control group on the first six of the eight posttest unit quizzes. There was no significant difference between the experimental and control groups on posttest quizzes 7 or 8. The study raised the question, “Why did the participants in the control group score higher on the quizzes occurring throughout the semester, but lower on the questions at the end of the course?” To answer this question, the instructors were polled to see if there was any change in protocol during the semester. One instructor in the control group admitted to spending an additional day on the fraction unit. This could explain why there was a significant difference in the semester quiz for the unit four quizzes on fractions. There seemed to be no other changes in protocol during the study. In Figure 7 below, the semester quiz score is plotted along side the posttest unit score. For all units, the control student scored higher in semester quiz than in the posttest unit quiz. Further analysis did not identify a reason for the higher score of the control group on the semester quizzes. 89

Figure 7. Comparisons of semester quizzes and posttest unit quizzes for the control group

Students in the control group did not seem to retain the information throughout the semester as well as the students in the experimental group. According to the data, the students in the experimental group actually did better on six of the eight posttest unit quizzes than they had done on the semester quizzes. Figure 8 below illustrates those scores. The results support research which indicated that active engagement using technology increases academic achievement and may even increase the student's ability to learn (Lepper & Malone, 1987). There is no indication from the data or instructor interviews why the trend did not continue for the last two units in the course.


Figure 8. Comparisons of semester quizzes and posttest unit quizzes for the experimental group

The data analysis yielded another interesting fact: weak students had a better chance of passing the posttest if they were part of the experimental class. Students whose pretest scores were between twenty and thirty percent were twice as likely to pass in the experimental class compared to similar students in the control class. Here again, the questioning strategy using clickers engaged the students and increased their confidence level. When a student committed to an answer, the student not only wanted to know whether he or she was correct, but also wanted to know how to do it correctly. The clickers stimulated more class discussion and pushed the students to incorporate higher-level thinking. The results indicate that there is sufficient evidence to conclude that students enrolled in a developmental math class which used an explicit questioning strategy with clickers performed better academically than students not using this strategy.


Relating the Findings to the Research

The findings resulting from the study are summarized below:

1. Students in the experimental class using the explicit questioning strategy with clickers scored significantly higher on the posttest than the control group.

2. Students in the experimental group scored significantly higher than the control group in six of the eight posttest unit quizzes.

3. The only significant factor influencing the difference in pretest and posttest unit scores was the treatment group (experimental or control), and not gender, ethnic background, or the full- or part-time status of the student.

4. Students who scored between 20 and 49 on the pretest had a significantly better chance of passing the course as members of the experimental class rather than as members of the control class.

5. Students who scored above 50 on the pretest had similar passing rates in both the control and the experimental classes.

The findings of this research are discussed below in terms of their implications for institutional practice. The first finding was that students in the experimental class using the explicit questioning strategy with clickers scored significantly higher on the posttest than the control group. The results of this study confirmed the results of Draper and Brown's (2004) study on classroom dynamics. However, in addition to learning whether the answer was correct, each instructor in the experimental group was instructed to review every problem after taking the poll.

The process of hearing the problem spoken and solved provided additional explanation and reinforcement for students in the experimental group. The implication for instructional design is specific to developmental students. This group of students needs repetition and multimodal stimuli. Reviewing each problem allows the students to better understand it. This recommendation differs from the finding of Li (2007), who noted that advanced students found problem review to be tedious. In contrast, the developmental students appreciated problem review even when most of the class had selected the correct answer. The second finding was that students in the experimental group scored significantly higher than the control group in six of the eight posttest unit quizzes. This finding represented mixed results. In the other two posttest units, the experimental group outscored the control group, but the differences were not statistically significant. To determine the reason for the possible difference, the instructors of each section were interviewed by the researcher. There were no procedural discrepancies during the final units, so it is difficult to say why there was no significant difference in the scores. One possible explanation for the higher achievement could be that the experimental group was exposed to higher-level thinking than the control group. In a qualitative study by Nelson and Hauck (2008), the higher-order thinking which results from engaging students with clicker technology could help explain the results. Lewis (2002) confirmed Mouton's (1988) findings that success on lower-level testing, like quizzes, can be accomplished by reviewing higher-order learning questions during practice assignments. The finding of the Mouton study indicated that a more stable and durable memory trace would result for the student who used deeper cognitive processing during the encoding procedure.

The Nelson and Hauck, Lewis, and Mouton studies could explain why the experimental students significantly outscored the control group in the early units because of higher cognitive engagement. They do not explain why the effect did not persist for the final quizzes. The researcher can only speculate that the deeper cognitive processing increased the retrieval of information from long-term memory, in this case four weeks and beyond; the final two quizzes, taken during the last two weeks of class, did not rely on this long-term memory. The third finding was that the only significant factor influencing the difference in pretest and posttest unit scores was the treatment group (experimental or control), and not gender, ethnic background, or the full- or part-time status of the student. This finding contradicted the findings of Campbell (2007). Unlike the Campbell study, females benefited as much as males; gender was not a significant factor in posttest scores. In contrast, this result confirmed the Li (2007) study, which suggested that women were very comfortable with the technology. It was also suggested that women may feel more comfortable participating anonymously with clickers. The only significant difference was identified by treatment group, not ethnic background, full-time/part-time status, or gender. The students who participated in the experimental group scored significantly higher on the posttest than students who were part of the control group. The fourth finding was that students who scored between 20 and 49 on the pretest had a significantly better chance of passing the course as members of the experimental class rather than as members of the control class. This finding confirms the research of Gersten, Baker, and Marks (1998), who found that students experiencing difficulties in math who are presented with explicit instruction learn the fundamentals more rapidly and are more successful than students who are taught using a constructivist strategy.

Merrill (2000) suggested that instructional strategies should be determined primarily on the basis of the content to be taught and secondarily on learner styles and preferences. When the content is developmental math, the explicit questioning strategy using clickers has been shown to be an effective way to help students learn the information. The final finding was that students who scored above 50 on the pretest had similar passing rates in both the control and the experimental classes. This result was not surprising. These students were close to passing at the start of the semester; with a little review and practice, they were able to regain the skills needed to pass the course. It is hard to determine the effect of the explicit strategy using clickers for these students. Considering only the polling devices, research indicates that these students may have increased their enjoyment and attendance because of the clickers. Beekes (2006) reported that students found the polling system easy to use and that it increased their enjoyment of the lectures. Wit (2003) studied the polling system in an undergraduate statistics course and found that students had a positive perception of the device. Wit also recorded an increase in class attendance over the previous year, when the polling devices were not used.

Limitations

Several limitations of this study must be considered. First, the participants in this study were not randomly assigned to the four course sections. Although the pretest identified the groups as equivalent academically at the beginning of the semester, factors such as work and family responsibilities, which could influence the groups, were not considered.

Second, the results of this study can be generalized only to developmental math students in a specific community college for a specific term. Developmental students at a different community college could have different academic skills and weaknesses.

Recommendations for Further Study

The findings of this study support the general notion that instruction based on an explicit questioning strategy with clickers can increase achievement in developmental-level community college mathematics courses. The generally positive trends in these areas indicate that studies extending the questioning strategy to follow-up courses might show increased student retention and performance. While this study demonstrated that explicit instruction is very appropriate for beginning-level students, the technique may need to be modified as the student progresses to higher-level math. To take students to a higher level of cognition, new strategies may need to be incorporated into the lesson, and new features may need to be incorporated into the technology. A challenge for the instructional designer is to design instruction that transitions the developmental math student from explicit instruction on basic concepts to less explicit instruction on the abstract concepts of physics and college algebra. One area which was not addressed in this study was the students' self-efficacy. It was evident on an anecdotal level that many of the students changed during the semester. Students in the experimental classes seemed much more confident in their math skills and even looked forward to the next required math class. It would be interesting to determine if the clickers impacted the students' self-efficacy.


Other developmental areas in the community college are reading and writing. Using an explicit questioning strategy with clickers in these areas should also benefit students, though to date little research has been done there. This strategy does not have to apply only to developmental courses. All academic subjects have their own sets of vocabulary, procedures, and principles, and without a strong understanding of the fundamentals of a course, the student will not be able to master its abstract ideas. This suggests that an explicit questioning strategy with clickers has a place in most college courses today. Designers of online courses are constantly looking for ways to engage the student. The technology exists to take this strategy online for courses like developmental mathematics, where the student is hesitant to speak out for fear of being incorrect. There are currently some applications, such as ElluminateLive, where the student can participate in active polling, but participation must be synchronous. By loading the polling data from an in-class course, the online student can answer a polling question and see the class results, thus becoming as engaged as the in-class student. Reigeluth (2009) stated that instructional systems must meet the needs of their suprasystem and proposed an information-age education system which takes into account that different students learn at different rates and need different instructional methods. One way to move ahead is to incorporate the advantages of the explicit strategy using clickers into a personalized learning plan for the student based on achievement criteria, not time spent on the subject. Encouraging the student to respond to questions in an individualized environment which presents the polling results of a typical in-class section creates a community for the student while showing the student that others can respond in a similar way.

Conclusion

The National Center for Education Statistics (2004) indicated that the number of community college students requiring developmental work in mathematics was extremely high. At "Achieving the Dream" community colleges, up to 90% of students need some level of mathematical remediation to prepare them for college-level mathematics work. If students cannot advance out of developmental mathematics, their choices of study and career plans will be impacted (Singh, Granville & Dika, 2002). This study showed that a program of explicit questioning strategies using clickers in a community college developmental mathematics course increased academic achievement, providing a valuable option for community college faculty, mathematics program directors, and developmental program directors to consider in improving the performance of students in developmental-level mathematics courses. Reigeluth (2009) identified three criteria to evaluate how well a method works in achieving instructional outcomes: effectiveness, efficiency, and appeal. Effectiveness requires an appropriate academic indicator to objectively identify the learning outcomes. Efficiency requires an optimization of resources, such as time and money, to achieve the desired results. Appeal is the degree to which the learners enjoy the instruction. Based on these criteria, an explicit questioning strategy using clickers worked in achieving the institutional outcomes. While further study is suggested with regard to the lasting power of these results and the continued success of these students in other courses, the general findings of this study showed positive results.

With so many community college students currently at risk in developmental mathematics courses, the consideration to include a program with an explicit questioning strategy using clickers is warranted and may prove to have a great impact on a large population. The implication is not limited to developmental students at the community college; the strategy can be implemented in every classroom and online to reach all of today's students.


REFERENCES

Achieving the Dream Proposal. (2007). Bethlehem, PA: Northampton Community College.

Achieving the Dream Data. (2008). Bethlehem, PA: Northampton Community College.

Achieving the Dream. (2008). Retrieved December 2, 2008, from www.achievingthedream.org

Bangert-Drowns, R. (1991). The instructional effect of feedback in test-like events. Review of Educational Research, 61(2), 213-238.

Beaton, A. E., Mullis, I. V. S., Martin, M. O., Gonzalez, E. J., Kelly, D. L., & Smith, T. (1997). Mathematics achievement in the middle school years: IEA's third international mathematics and science report. Chestnut Hill, MA: Boston College.

Beekes, W. (2006). The "millionaire" method for encouraging participation. Active Learning in Higher Education, 7, 25-36.

Bergin, D. A., Ford, M. E., & Hess, R. D. (1993). Patterns of motivation and social behavior associated with microcomputer use of young children. Journal of Educational Psychology, 85, 437-445.

Berlyne, D. E. (1960). Conflict, arousal and curiosity. New York: McGraw-Hill.

Bittinger, M. L., Ellenbogen, D. L., & Johnson, B. L. (2008). Pre-algebra (Mathematics Series for Higher Education). Boston: Pearson/Addison Wesley.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7-74.

Bonwell, M. (1996). Building a supportive climate for active learning. The National Teaching and Learning Forum, 6(1), 4-7.

Bransford, J. D., Browning, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience and school. Washington, DC: National Academy Press.

Burnstein, R., & Lederman, L. (2001). Using wireless keypads in lecture classes. The Physics Teacher, 39, 8-11. http://www.replysystems.com/pdfs/benefits/24.pdf


Bryant, D., Hartman, P., & Kim, S. A. (2003). Using explicit and strategic instruction to teach division skills to students with learning disabilities. Exceptionality, 11, 34-39.

Campbell, J. E. (2007). Increasing learning in a college classroom: Is it just a click away? (Doctoral dissertation, University of California, Santa Barbara). Retrieved February 24, 2009, from Dissertations & Theses: Full Text database. (Publication No. AAT 3285821).

Carnine, D. (1997). Instructional design in mathematics for students with learning disabilities. Journal of Learning Disabilities, 30, 130-141.

Clark, R. E. (2005). Learning from media: Arguments, analysis and evidence. Greenwich, CT: Information Age.

Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage.

Creswell, J. W. (2005). Educational research: Planning, conducting and evaluating quantitative and qualitative research (2nd ed.). Upper Saddle River, NJ: Pearson.

Draper, S. W., & Brown, M. I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20, 81-94.

Duncan, D. (2005). Clickers in the classroom: How to enhance science learning using classroom response systems. San Francisco: Pearson/Addison Wesley.

E-Instruction. (2008). Retrieved November 12, 2008, from http://www.einstruction.com/

Emberger, R. (2002). Focused feedback. Maryland Classroom, 7(3), 7-12.

Fontana, D., & Fernandes, M. (1994). Improvements in mathematics performance as a consequence of self-assessment in Portuguese primary school pupils. British Journal of Educational Psychology, 64(3), 407-417.

Fuchs, L. S., & Fuchs, D. (1986). Effects of systematic formative evaluation: A meta-analysis. Exceptional Children, 53(3), 199-208.

Gamson, Z. F. (1987). Seven principles for good practice. AAHE Bulletin, 39(7), 3-7.

Gersten, R., Baker, S. K., & Marks, S. U. (1998). Teaching English-language learners with learning difficulties: Guiding principles and examples from research-based practice. (ERIC Clearinghouse: ED 427 448)

Gibbons, A. S., & Rogers, P. C. (2009). The architecture of instructional theory. In C. M. Reigeluth & A. A. Carr-Chellman (Eds.), Instructional design theories and models (Volume III): Building a common knowledge base (pp. 305-326). Mahwah, NJ: Lawrence Erlbaum Associates.

Goldsmith, L. T., & Mark, J. (1999, November). What is a standards-based mathematics curriculum? Educational Leadership, 57(3), 40-45.

Halpin, R. (1990, Spring). An application of the Tinto model to the analysis of freshman persistence in a community college. Community College Review, 17(4), 22-32. Retrieved February 12, 2009, from Academic Search Premier Database.

Handelsman, J. (2004). Scientific teaching. Science, 304, 521-522.

Harniss, M. K., Carnine, D. W., Silbert, J., & Dixon, R. C. (2002). Effective practices for teaching mathematics. In E. J. Kame'enui, D. W. Carnine, R. C. Dixon, D. C. Simmons, & M. D. Coyne (Eds.), Effective teaching strategies that accommodate diverse learners (2nd ed., pp. 121-148). Upper Saddle River, NJ: Merrill.

Hatch, J., Jensen, M., & Moore, R. (2005). Manna from heaven or clickers from hell: Experience with electronic response systems. Journal of College Science Teaching, 34, 36-39.

Heritage, M. (2007). Formative assessment: What do teachers need to know and do? Phi Delta Kappan, 140-145.

Hidi, S. (2000). An interest researcher's perspective on the effects of extrinsic and intrinsic factors of motivation. In C. Sansone & J. M. Harackiewicz (Eds.), Intrinsic and extrinsic motivation: The search for optimal motivation and performance. New York: Academic Press.

Hiltz, S. R., & Goldman, R. (2004). Learning together online: Research on asynchronous learning. Mahwah, NJ: Lawrence Erlbaum.

Hirsh, E. D. (1996). The schools we need: And why we don't have them. New York: Doubleday.

Hudson, P., Miller, S. P., & Butler, F. (2006). Adapting and merging explicit instruction within reform based mathematics classrooms. American Secondary Education, 35(1), 19-28.

International Board of Standards for Training, Performance and Instruction. (2000). 2000 instructional design competencies. Retrieved August 19, 2008, from http://stpi.org/Competencies/instruct_design_competencies.htm

Interwrite. (2008). Retrieved November 12, 2008, from http://www.interwritelearning.com/products/prs/index.html

Jackson, H. G., & Neel, R. S. (2006). Observing mathematics: Do students with EBD have access to standards-based mathematics instruction? Education & Treatment of Children, 29(4), 593-604.

Jones, E. D., Wilson, R., & Bhojwani, S. (1997). Mathematics instruction for secondary students with learning disabilities. Journal of Learning Disabilities, 30(2), 151-163.

Jonassen, D. H., & Reeves, T. C. (1996). Learning with technology: Using computers as cognitive tools. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (pp. 693-719). New York, NY: Prentice Hall International.

Krathwohl, D. R. (1993). Methods of educational and social science research: An integrated approach. Long Grove, IL: Waveland.

Krathwohl, D. R. (2004). Methods of educational and social science research: An integrated approach (2nd ed.). Long Grove, IL: Waveland.

Kroesbergen, E. H., & Van Luit, J. E. H. (2003). Mathematics interventions for children with special educational needs. Remedial and Special Education, 24, 97-114.

Krulik, S., Rudnick, J., & Milou, E. (2003). Teaching mathematics in middle school: A practical guide. Boston: Allyn & Bacon.

Lepper, M. R., & Malone, T. W. (1987). Intrinsic motivation and instructional effectiveness in computer-based education. In R. E. Snow & M. J. Farr (Eds.), Aptitude, learning and instruction: Vol. 3. Cognitive and affective process analysis (pp. 255-286). Hillsdale, NJ: Lawrence Erlbaum Associates.

Latessa, R., & Mouw, D. (2005). Use of an audience response system to augment interactive learning. Family Medicine, 37, 12-14.

Lewis, B. (2002). Learning effectiveness: Efficacy of quizzes vs. discussions in on-line learning. Ph.D. dissertation, Syracuse University, New York. Retrieved February 24, 2009, from Dissertations & Theses: Full Text database. (Publication No. AAT 3046842).

Li, P. (2007). Creating and evaluating a new clicker methodology. Ph.D. dissertation, The Ohio State University, Ohio. Retrieved February 24, 2009, from Dissertations & Theses: Full Text database. (Publication No. AAT 3275181).

Mayer, R. E. (2008). Learning and instruction (2nd ed.). Upper Saddle River, NJ: Merrill Prentice Hall Pearson.


McConnell, M., & Bhattacharya, D. N. (1999). Using the elegance of arithmetic to enhance the power of algebra. The Mathematics Teacher, 92(6), 18-25.

McDermott, L. C. (1991). Millikan Lecture 1990: What we teach and what is learned? Closing the gap. American Journal of Physics, 59, 301-315.

Mercer, C. D., & Mercer, A. R. (2001). Teaching students with learning problems (6th ed.). New York: Macmillan.

Merrill, D. (2000). Instructional strategies and learning styles: Which takes precedence? In R. Reiser & J. Dempsey (Eds.), Trends and issues in instructional technology. New York: Prentice Hall.

Miller, S. P. (2002). Validated practices for teaching students with diverse needs and abilities. Boston: Allyn & Bacon.

Miller, S. P., Harris, C. A., Strawser, S., Jones, W. P., & Mercer, C. D. (1998). Teaching multiplication to second graders in inclusive settings. Focus on Learning Problems in Mathematics, 21(4), 49-69.

MHEC: Maryland Higher Education Commission. (1996). A study of remedial education at Maryland public campuses. Annapolis, MD: Maryland Higher Education Commission.

Mouton, H. (1988). Adjunct questions in mediated self-instruction: Contrasting the predictions of the "levels of processing," "transfer-appropriate processing," and "transfer across levels of processing" perspectives. Unpublished doctoral dissertation, Syracuse University, Syracuse, NY.

National Center of Academic Transformations (NCAT). Retrieved November 2, 2008, from www.center.rpi.edu

National Center for Education Statistics. (2004). Remedial education at degree-granting post-secondary institutions in Fall 2000. Washington, DC: U.S. Department of Education. Retrieved September 4, 2009, from http://nces.ed.gov/pubs2004/2004010.pdf

National Council of Teachers of Mathematics (NCTM). (1991). Professional standards for teaching mathematics. Reston, VA: Author.

National Council of Teachers of Mathematics (NCTM). (1995). Assessment standards for school mathematics. Reston, VA: Author.

National Council of Teachers of Mathematics (NCTM). (2000). Principles and standards for school mathematics. Reston, VA: Author.


National Council of Teachers of Mathematics (NCTM). (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author.

National Research Council. (2004). Engaging schools: Fostering high school students' motivation to learn. Washington, DC: National Academies Press.

Nelson, M., & Hauck, R. (2008). Clicking to learn: A case study of embedding radio-frequency based clickers in an introductory management information systems course. Journal of Information Systems Education, 19(1), 55-64. Retrieved November 10, 2009, from ProQuest Education Journals. (Document ID: 1465593571).

Piaget, J., & Inhelder, B. (1969). The psychology of the child. New York: Basic Books.

Penuel, W., Abrahamson, L., & Roschelle, J. (2004). The networked classroom. Educational Leadership, 61(5), 50-54.

Phillips, G. W. (2007). Chance favors the prepared mind: Mathematics and science indicators for comparing states and nations. Washington, DC: American Institutes for Research.

Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in learning and teaching contexts. Journal of Educational Psychology, 95, 667-686.

Rachal, C., Daigle, S., & Rachal, W. S. (2007). Learning problems reported by college students: Are they using learning strategies? Journal of Instructional Psychology, 34(4).

Redish, E. F. (1994). Implications of cognitive studies for teaching physics. American Journal of Physics, 62(9), 796-803.

Reigeluth, C. M. (1999). Instructional design theories and models (Volume II): A new paradigm of instructional theory. Mahwah, NJ: Erlbaum.

Reigeluth, C. M. (1983). Instructional design theories and models. Mahwah, NJ: Lawrence Erlbaum Associates.

Reigeluth, C. M., & Carr-Chellman, A. A. (Eds.). (2009). Instructional design theories and models (Volume III): Building a common knowledge base. Mahwah, NJ: Lawrence Erlbaum Associates.

Richey, R. C., & Klein, J. D. (2007). Design and development research. Mahwah, NJ: Lawrence Erlbaum Associates.

Roschelle, J., Penuel, W. R., & Abrahamson, L. (2004). Classroom response and communication systems: Research review and theory. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.

Roschelle, J., Penuel, W. R., Crawford, V., & Shechtman, N. (2004). Workshop report: Advancing research on the transformative potential of interactive pedagogies and classroom networks. Retrieved February 14, 2009, from http://ctl.sri.com/publications/downloads/CATAALYST_Workshop_Report.pdf

Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119-144.

Schraw, G. (1998). Processing and recall differences among seductive details. Journal of Educational Psychology, 90, 3-12.

Schiefele, U. (1996). Topic interest, text representation and quality of experience. Contemporary Educational Psychology, 21, 3-18.

Schiefele, U. (1999). Interest and learning from text. Scientific Studies of Reading, 3, 25-29.

Sheffield, L. J., & Cruikshank, D. E. (2005). Teaching and learning mathematics pre-kindergarten through middle school (5th ed.). Hoboken, NJ: John Wiley & Sons.

Simmons, D. C., & Kameenui, E. J. (1996). A focus on curriculum design: Where children fail. Focus on Exceptional Children, 28(7), 1-16.

Stiggins, R. J. (2006). Balanced assessment systems: Redefining excellence in assessment. Educational Testing Service. Policy Brief.

Thiel, T., Peterman, S., & Brown, M. (2008). Addressing the crisis in college mathematics: Designing courses for student success. Change, 40(4), 44-49.

Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). Chicago: The University of Chicago Press.

Tinto, V. (2005, December). Reflections on student retention and persistence: Moving to a theory of institutional action on behalf of student success. Studies in Learning, Evaluation, Innovation and Development, 2(3), 89-97. Retrieved January 25, 2008, from http://sleid.cqu.edu.au/viewissue.php?id=8

Troff, D. (2004). An explicit instruction design approach for teaching students with learning disabilities to solve mathematical problems involving proportions. M.S. thesis, Utah State University, Utah. Retrieved November 10, 2009, from Dissertations & Theses: Full Text. (Publication No. AAT 1422331).

Turning Technologies. (2008). Retrieved from www.turningtechnologies.com

United States Department of Education (USED). (2008). Foundation for success: The national mathematics advisory panel final report. Retrieved November 2, 2008, from www.ed.gov/mathpanel

Vygotsky, L. (1930). Primitive man and his behavior. Cleveland: Harvester Wheatsheaf.

West, J. (2005). Learning outcomes related to the use of personal response systems in large science courses. Retrieved October 25, 2008, from http://www.academiccommons.org/commons/review/west-polling-technology

Williams, T. R., & Butterfield, E. C. (1992). Advance organizers: A review of the research, Part 1. Journal of Technical Writing and Communication, 22, 259-272.

Wit, E. (2003). Who wants to be... The use of a personal response system in statistics teaching. MSOR Connections, 3, 5-11.

Wood, W. B. (2004). Clickers: A teaching gimmick that works. Developmental Cell, 7, 796-798.

Woodward, J., & Montague, M. (2002). Meeting the challenge of mathematics reform for students with LD. The Journal of Special Education, 36(2), 89-101.

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory Into Practice, 41, 64.

107

APPENDIX A. COURSE OUTLINE/SYLLABUS
INSTRUCTOR INFORMATION: List instructor information here.

TEXT and MATERIALS: Pre-algebra, Bittinger, Marvin L., Ellenbogen, David, and Johnson, Barbara L., 5th edition, Addison Wesley. This course requires Course Compass/My Math Lab. It is included in the purchase price of a new textbook from the bookstore. If you do not have an access code, you can purchase one online at http://coursecompass.com for approximately $57. The course ID for this course is: instructorsname#####
Student Solution Manual: optional.
Calculator: scientific calculator.
COURSE DESCRIPTION FROM CATALOG: MATH 020 PreAlgebra (Cr3)(3:0). Reviews arithmetic operations on whole numbers, fractions, decimals, and integers. Introduces algebraic notation: solution of algebraic equations, inequalities, and applications. This course is intended to prepare students for MATH 022, Elementary Algebra.

LEARNING OUTCOMES FROM THE COURSE OUTLINE: Upon successful completion of this course, students will be able to:
1. Know and understand the definitions of the following terms: a. Solution of an equation; b. Exponent; c. Absolute value; d. Equivalent expressions; e. Like terms; f. Factor; g. Prime number; h. Reciprocal; i. Least common denominator; j. Ratio; k. Proportion; l. Percent.
2. Be able to perform the following operations: a. Add, subtract, multiply, and divide whole numbers; b. Add, subtract, multiply, and divide integers; c. Add, subtract, multiply, and divide numbers in fraction form; d. Add, subtract, multiply, and divide mixed numbers; e. Add, subtract, multiply, and divide numbers in decimal form; f. Add, subtract, multiply, and divide negative numbers.
3. Be able to perform the following procedures: a. Round numbers to a specified number of places; b. Estimate sums, differences, products, and quotients using rounding; c. Evaluate algebraic expressions by cross multiplication; d. Evaluate exponential expressions; e. Convert between fraction notation and decimal notation; f. Convert between decimal and percent notation; g. Convert between whole-number and percent notation; h. Simplify expressions using the rules for order of operations; i. Find the prime factorization of a composite number; j. Find the least common denominator of two or more numbers.

4. Be able to solve linear equations using the following principles: a. The Distributive Property; b. The Addition Principle of Equality; c. The Multiplication Principle of Equality.
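For illustration (an editorial addition, not part of the original course outline; the equation is invented), a short worked example in LaTeX notation that uses all three principles in one solution:

\[ 3(x + 2) = 18 \;\Rightarrow\; 3x + 6 = 18 \quad\text{(Distributive Property)} \]
\[ 3x + 6 - 6 = 18 - 6 \;\Rightarrow\; 3x = 12 \quad\text{(Addition Principle of Equality)} \]
\[ \tfrac{1}{3}\cdot 3x = \tfrac{1}{3}\cdot 12 \;\Rightarrow\; x = 4 \quad\text{(Multiplication Principle of Equality)} \]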

5. Be able to use a calculator to: a. Determine the square root of an exact square; b. Determine the approximate square root of a number; c. Solve problems with square-root terms; d. Apply the Pythagorean Theorem to determine an unknown side of a right triangle.
6. Be able to solve the following types of application problems: a. Those that can be modeled by a linear equation; b. Those that can be modeled by the Pythagorean Theorem; c. Those that can be modeled by a proportion; d. Those which involve percent increase and decrease; e. Those that involve rate; f. Those which involve comparison of product prices; g. Those that involve interest; h. Those that involve a specific formula.

POLICIES:
Class Attendance and Withdrawal: Online courses are designed to give you some flexibility in your ability to access course content, submit assignments, and interact with your instructor and fellow students. However, these courses are not self-paced. You are expected to participate fully in all class activities and to submit all assignments by their due dates. Note that if you do not participate in the class, submit assignments, or contact the professor during a consecutive two-week period, you may be withdrawn from the class on the recommendation of the professor. However, do not assume that this will happen automatically. Unless you officially withdraw, you may owe money and receive an "F" as your final grade.
Consequences of Late Work or Missed Exams: Assignment details and due dates can be found in the section SCHEDULE OF ASSIGNMENTS. All chapter homework is due by the chapter quiz date. All makeup quizzes will be done after the final exam and during the final exam period.


Violations of Academic Honesty Policy: All forms of cheating and plagiarism are serious violations of the academic honesty policy. Depending on the severity of the offense, I will assess one of the following penalties:
- A written warning, with the requirement that the assignment be redone within the specified time.
- An "F" grade for the assignment or test.
- An "F" grade for the course.
See the Course Information page (Student Responsibilities folder) and the Student Handbook for more details, including steps for appealing charges.
INSTRUCTIONAL PLAN: Assignments Required and Weight of Each in Determining Final Grade:
30% - Online Homework
50% - Chapter Quizzes
20% - Final Exam
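To illustrate how these weights combine (an editorial example, not part of the original syllabus; the scores are invented), a student averaging 90 on homework, 80 on quizzes, and 75 on the final exam would earn, in LaTeX notation:

\[ 0.30(90) + 0.50(80) + 0.20(75) = 27 + 40 + 15 = 82. \]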

OTHER RELEVANT PROCEDURES OR POLICIES:
Disability Services: The Community College encourages academically qualified students with disabilities to take advantage of its educational programs. Services and accommodations are offered to students with disabilities at no additional cost to facilitate accessibility to College programs and facilities.
Tutoring Services: The Learning Center provides free tutoring services, including real-time online tutoring. Please see the folder called Student Rights and Support Services in the Course Information page for details and applicable tutoring links. See the following website for the most up-to-date online tutoring schedule and information:
COURSE CALENDAR AND SCHEDULE OF ASSIGNMENTS: Note: I reserve the right to change topics or assignments when necessary to make classes more relevant to current events or required student outcomes. Therefore, you should not submit assignments ahead of schedule unless you have obtained permission to do so. Check Announcements in Blackboard and the Assignments page for details and/or changes to assignments.


Schedule of topics (table columns in the original: Date | Topics for Class Discussion | HW Due Date):
Intro to Math; Sections 1.1 – 1.3; Sections 1.4 – 1.6; Sections 1.7 – 1.9; Chapter 1 Quiz; Sections 2.1 – 2.3; Sections 2.4 – 2.6; Sections 2.7 – 2.8; Chapter 2 Quiz; Sections 3.1 – 3.3; Sections 3.4 – 3.6; Sections 3.7 – 3.8; Chapter 3 Quiz; Sections 4.1 – 4.4; Sections 4.5 – 4.7; Chapter 4 Quiz; Spring Break; Sections 5.1 – 5.3; Sections 5.4 – 5.6; Sections 5.7 – 5.8; Chapter 5 Quiz; Sections 7.1 – 7.3; Sections 7.4 – 7.5; Chapter 7 Quiz; Sections 8.1 – 8.2; Sections 8.3 – 8.4; Sections 8.5 – 8.7; Chapter 8 Quiz; Sections 9.1 – 9.3; Sections 9.4 – 9.6; Section 10.1; Chapter 9 Quiz; Final Review; Final Exam on scheduled day/time.
Note: This syllabus is a summary of important course information. For details, please view the contents of all folders in the Course Information page as well as the Assignments page in Blackboard. Also check the Announcements page for any changes to the syllabus.


APPENDIX B. LESSON PLANS
Instructions for Experimental Group Instructor
The enclosed lesson plans give the daily schedule of topics for the semester. The individual lessons which require the Turning Point® software will be supplied on a disk. In addition to the examples presented in the lesson, students are required to complete the Course Compass homework. Instructors can make a copy of the course "courseinstructor#####", which contains all of the homework problems, from the Course Compass site for the students. It is critical that the instructor's course ID be given to the students at the start of the semester. The scores for the pretest, quizzes, and posttest will be collected by the data recorder, machine scored, and returned to the instructor. The instructor will forward the scored answer sheets to the principal investigator for only those students who have consented to be part of this study. The pretest and posttest should not be returned to the students. Quizzes should be returned to the students and errors discussed.


Instructions for Control Group
The control group should experience the same content as the experimental group. The only difference between the groups is the use of the explicit questioning strategy with clickers. It is important to set up a Course Compass class for your students. The scores for the pretest, quizzes, posttest, and clicker evaluation will be collected by the data recorder, machine scored, and returned to the instructor. The instructor will forward the scored answer sheets to the principal investigator for only those students who have consented to be part of this study. The pretest and posttest should not be returned to the students. Quizzes should be returned to the students and errors discussed.
Note: When using PowerPoint® to present the lesson, you must open PowerPoint® first and then open the lesson. Clicking on the lesson directly will open it in PowerPoint® Reader, and much of the information will be missing.


Week One Class 1
A. Go over course syllabus.
B. Letter of Informed Consent – please have all students sign.
C. Introduction to homework on Course Compass.
D. Material to be covered (1.1 – 1.3): 1. Standard notation. 2. Addition: a. Introduction to the number line; b. Associative Law, (a + b) + c = a + (b + c); c. Commutative Law, a + b = b + a; d. Perimeter. 3. Subtraction: a. Number line; b. Subtracting whole numbers.
E. Remind students that in the next class there will be a diagnostic pretest. This test has two purposes: 1. To determine if the student is misplaced in this course. 2. To identify the areas of weakness as a class and individually. Please ask students to bring a #2 pencil; they may use a calculator if they like for this pretest. Students will be encouraged NOT to guess, but rather to answer (E) Unable to Solve for any problem which they do not know how to solve.

Week 1 Class 2
A. Lesson Sequence
1. Pretest: This test has two purposes: a. To determine if the student is misplaced in this course. b. To identify the areas of weakness as a class and individually.
2. Instructions: Please ask students to use a #2 pencil; they may use a calculator if they like for this pretest. Students should enter their names and student ID numbers on the answer sheet. Students will be encouraged NOT to guess, but rather to answer (E) Unable to Solve for any problem which they do not know how to solve.
3. Tell students to move through the pretest as quickly as possible.
4. Allow the entire class period for the pretest.
B. Assignment: Read Sections 1.4 – 1.6 for next class. Pay special attention to multiplying and dividing numbers larger than 10.
C. Collect the pretests along with the answer sheets and return to mailbox at campus.


Week 2 Class 1
Lesson Sequence (1.4 – 1.6)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Addition; ii) Perimeter; iii) Subtraction; iv) Standard notation; v) Commutative law; vi) Associative law.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Rounding; ii) Mathematical symbols >, <, =; iii) Multiplication; iv) Area; v) Division by 1, zero; vi) Division with a remainder.
E. Collect clickers.
F. In class practice: Page 55, #39, 45, 47.
G. Assignment: Read Sections 1.7 – 1.9. Complete Course Compass homework. Save session on Turning Point®.

Week 2 Class 2
Lesson Sequence (1.7 – 1.9)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Rounding; ii) Mathematical symbols >, <, =; iii) Multiplication; iv) Area; v) Division by 1, zero; vi) Division with a remainder.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Average; ii) Solving equations; iii) Cross multiplication; iv) Keywords, phrases and concepts; v) Exponential notation; vi) Order of operations.
E. Collect clickers.
F. In class practice: Page 91, #54, 55, 56, 57.
G. Assignment: Read Sections 2.1 – 2.3. Complete Course Compass homework. Quiz next class on Chapter 1. Prepare for the quiz by doing the chapter tests on Course Compass. Save session on Turning Point®.

Week 3 Class 1
Lesson Sequence (2.1 – 2.3)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Average; ii) Solving equations; iii) Cross multiplication; iv) Keywords, phrases and concepts; v) Exponential notation; vi) Order of operations.
D. Material to be covered in lesson (approximately 15-20 minutes), using board and PP lesson: i) Number line; ii) Rules for addition of negative numbers; iii) Rules for subtraction of negative numbers; iv) Simplifying expressions with negative numbers; v) Absolute value.
E. Collect clickers.
F. Quiz 1. Students may NOT use calculators. Students must write their answers, name, and ID number on both the quiz and the answer sheet. #2 pencils required.
G. Assignment: Read Sections 2.4 – 2.6. Complete Course Compass homework. Save session on Turning Point®.

Week 3 Class 2
Lesson Sequence: Using Computer Resources
A. No Turning Point® today.
B. Students sign in.
C. Review Quiz 1.
D. Demonstrate Course Compass, BlackBoard. If possible, schedule a computer room and help the students actually sign onto the programs.
E. Invite a rep from the Learning Center to speak to the class. Pass out "prescriptions."
F. In class practice: Complete Course Compass homework from Chapter 1. Locate videos, identify chapter tests, look at study plans.
G. Begin introduction of negative numbers, if time permits.
H. Assignment: Read Sections 2.4 – 2.6.

Week 4 Class 1
Lesson Sequence (2.4 – 2.6)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Number line; ii) Rules for addition of negative numbers; iii) Rules for subtraction of negative numbers; iv) Simplifying expressions with negative numbers; v) Absolute value.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Rules for multiplication of negative numbers; ii) Rules for division of negative numbers; iii) Powers of integers; iv) Working with equivalent expressions; v) Using the distributive law.
E. Collect clickers.
F. In class practice: Page 127, #70, 74, 76.
G. Assignment: Read Sections 2.7 – 2.8. Complete Course Compass homework. Save session on Turning Point®.

Week 4 Class 2
Lesson Sequence (2.7 – 2.8)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Rules for multiplication of negative numbers; ii) Rules for division of negative numbers; iii) Powers of integers; iv) Working with equivalent expressions; v) Using the distributive law.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Combining like terms; ii) Perimeter; iii) Equivalent equations.
E. Collect clickers.
F. In class practice: Page 155: #32, 33, 34, 35, 36, 41, 43.
G. Assignment: Read Sections 3.1 – 3.3. Complete Course Compass homework. Save session on Turning Point®.

Week 5 Class 1
Lesson Sequence (3.1 – 3.3)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Combining like terms; ii) Perimeter; iii) Equivalent equations.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Tests for divisibility; ii) Prime numbers; iii) Composite numbers; iv) Prime factors and just factors; v) Fraction notation.
E. Collect clickers.
F. Quiz 2. Students may NOT use calculators. Students must write their answers, name, and ID number on both the quiz and the answer sheet. #2 pencils required.
G. Assignment: Read Sections 3.4 – 3.6. Complete Course Compass homework. Save session on Turning Point®.

Week 5 Class 2
Lesson Sequence (3.4 – 3.6)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Tests for divisibility; ii) Prime numbers; iii) Composite numbers; iv) Prime factors and just factors; v) Fraction notation.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Multiplication of fractions; ii) Simplifying fractions; iii) Equality tests with cross multiplication; iv) Area of a triangle; v) Word problems with fractions.
E. Collect clickers.
F. In class practice: Page 205: #30, 34, 38, 42; Page 208: #65, 67.
G. Assignment: Read Sections 3.7 – 3.8. Complete Course Compass homework. Save session on Turning Point®.

Week 6 Class 1
Lesson Sequence (3.7 – 3.8)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Multiplication of fractions; ii) Simplifying fractions; iii) Equality tests with cross multiplication; iv) Area of a triangle; v) Word problems with fractions.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Reciprocals; ii) Division of fractions; iii) Solving equations; iv) Word problems.
E. Collect clickers.
F. In class practice: Page 225: #45-54.
G. Assignment: Read Sections 4.1 – 4.4. Complete Course Compass homework. Save session on Turning Point®.

Week 6 Class 2
Lesson Sequence (4.1 – 4.4)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Reciprocals; ii) Division of fractions; iii) Solving equations; iv) Word problems.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Least common multiples; ii) Prime factorization; iii) Addition of fractions; iv) Order of fractions, >, <, or =; v) Subtraction of fractions.
E. Collect clickers.
F. Quiz 3. Students may NOT use calculators. Students must write their answers, name, and ID number on both the quiz and the answer sheet. #2 pencils required.
G. Assignment: Reread Sections 4.1 – 4.4 and read Sections 4.5 – 4.7. Complete Course Compass homework. Save session on Turning Point®.
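For illustration (an editorial addition, not from the lesson materials; the fractions are invented), adding fractions via the least common multiple covered in this lesson, in LaTeX notation:

\[ \operatorname{lcm}(9, 12) = 36, \qquad \frac{1}{9} + \frac{1}{12} = \frac{4}{36} + \frac{3}{36} = \frac{7}{36}. \]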

Week 7 Class 1
Lesson Sequence (4.5 – 4.7)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Least common multiples; ii) Prime factorization; iii) Addition of fractions; iv) Order of fractions, >, <, or =; v) Subtraction of fractions.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Mixed numbers; ii) Addition of mixed numbers; iii) Subtraction of mixed numbers; iv) Negative mixed numbers; v) Multiplication of mixed numbers; vi) Division of mixed numbers.
E. Collect clickers.
F. In class practice: Page 294: #6, 10, 18, 22.
G. Assignment: Bring into class your questions about fractions. Complete Course Compass homework. Save session on Turning Point®.

Week 7 Class 2
Lesson Sequence (4.4, 4.6 and 4.7)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Mixed numbers; ii) Addition of mixed numbers; iii) Subtraction of mixed numbers; iv) Negative mixed numbers; v) Multiplication of mixed numbers; vi) Division of mixed numbers.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Practice session: applications with fractions; applications with mixed numbers.
E. Collect clickers.
F. In class practice: Page 300: #14-18; Page 301: #36-41.
G. Assignment: Complete Course Compass homework. Save session on Turning Point®.

Week 8 Class 1
Lesson Sequence (Intro to decimals)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Fractions to decimals; ii) Which is larger?; iii) Adding decimals; iv) Subtracting decimals.
D. Collect clickers.
E. Quiz 4. Students may NOT use calculators. Students must write their answers, name, and ID number on both the quiz and the answer sheet. #2 pencils required.
F. Assignment: Read Sections 5.1 – 5.3. Complete Course Compass homework.

Week 8 Class 2
Lesson Sequence (5.1 – 5.3)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Fractions to decimals; ii) Which is larger?; iii) Adding decimals; iv) Subtracting decimals.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Multiplication of decimals; ii) Rounding decimals; iii) Cents to dollars; iv) Evaluating decimals; v) Problems with decimals.
E. Collect clickers.
F. In class practice: Page 335: #57-61.
G. Assignment: Read Sections 5.4 – 5.6. Complete Course Compass homework. Save session on Turning Point®.

Week 9 Class 1
Lesson Sequence (5.4 – 5.6)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Multiplication of decimals; ii) Rounding decimals; iii) Cents to dollars; iv) Evaluating decimals; v) Problems with decimals.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Division of decimals; ii) Common decimal equivalents; iii) Estimating; iv) Solving equations.
E. Collect clickers.
F. In class practice: Page 360: #9-17.
G. Assignment: Read Sections 5.7 – 5.8. Complete Course Compass homework. Save session on Turning Point®.

Week 9 Class 2
Lesson Sequence (5.7 – 5.8)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Division of decimals; ii) Common decimal equivalents; iii) Estimating; iv) Solving equations.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Equations with one variable; ii) Equations with two variables.
E. Collect clickers.
F. In class practice: Page 387: #41, 42, 43.
G. Assignment: Read Sections 7.1 – 7.3. Complete Course Compass homework. Save session on Turning Point®.

Week 10 Class 1
Lesson Sequence (7.1 – 7.3)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Equations with one variable; ii) Equations with two variables.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Ratios; ii) Unit price; iii) Proportions.
E. Collect clickers.
F. Quiz 5. Students may use calculators. Students must write their answers, name, and ID number on both the quiz and the answer sheet. #2 pencils required.
G. Assignment: Read Sections 7.4 – 7.5. Complete Course Compass homework. Save session on Turning Point®.
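To make the unit-price idea from this lesson concrete (an added example, not from the original materials; the prices are invented), in LaTeX notation:

\[ \text{unit price} = \frac{\$3.00}{12\ \text{oz}} = \$0.25/\text{oz} = 25\ \text{cents/oz}. \]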

Week 10 Class 2
Lesson Sequence (7.4 – 7.5)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Ratios; ii) Unit price; iii) Proportions.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Geometric applications; ii) Other shapes in proportion; iii) Equivalent equations.
E. Collect clickers.
F. In class practice: Page 508: #21-26.
G. Assignment: Read Sections 8.1 – 8.2. Complete Course Compass homework. Save session on Turning Point®.

Week 11 Class 1
Lesson Sequence (8.1 – 8.2)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Geometric applications; ii) Other shapes in proportion; iii) Equivalent equations.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Percent notation; ii) Decimal to percent; iii) Fraction to percent; iv) Less than one percent; v) Percentage problems.
E. Collect clickers.
F. Quiz 6. Students may use calculators. Students must write their answers, name, and ID number on both the quiz and the answer sheet. #2 pencils required.
G. Assignment: Read Sections 8.3 – 8.4. Complete Course Compass homework.
H. Save session on Turning Point®.

Week 11 Class 2
Lesson Sequence (8.3 – 8.4)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Percent notation; ii) Decimal to percent; iii) Fraction to percent; iv) Less than one percent; v) Percentage problems.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Translate to a proportion; ii) Application of percent; iii) Percent increase and decrease.
E. Collect clickers.
F. In class practice: Page 579: #3-14.
G. Assignment: Read Sections 8.5 – 8.7. Complete Course Compass homework. Save session on Turning Point®.
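For illustration of the percent increase topic in this lesson (an editorial addition; the quantities are invented), in LaTeX notation:

\[ \text{percent increase} = \frac{\text{new} - \text{old}}{\text{old}} \times 100\% = \frac{50 - 40}{40} \times 100\% = 25\%. \]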

Week 12 Class 1
Lesson Sequence (8.5 – 8.7)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Translate to a proportion; ii) Application of percent; iii) Percent increase and decrease.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Sales tax; ii) Commission; iii) Discount; iv) Simple interest; v) Compound interest; vi) Interest on credit cards; vii) Interest on loans.
E. Collect clickers.
F. In class practice: Page 581: #12-18.
G. Assignment: Read Sections 9.1 – 9.3. Complete Course Compass homework. Save session on Turning Point®.
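As an editorial illustration of the difference between the simple and compound interest covered in this lesson (the standard compound-interest formula and all figures here are additions, not from the lesson materials), in LaTeX notation, for $1,000 at 6% over 2 years with annual compounding:

\[ I_{\text{simple}} = 1000 \times 0.06 \times 2 = \$120, \qquad A_{\text{compound}} = 1000(1.06)^2 = \$1123.60, \]

so compounding yields $123.60 of interest rather than $120.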

Week 12 Class 2
Lesson Sequence (9.1 – 9.3)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Rules for multiplication of negative numbers; ii) Rules for division of negative numbers; iii) Powers of integers; iv) Working with equivalent expressions; v) Using the distributive law.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Metric system; ii) Metric units; iii) Area of a parallelogram; iv) Area of a trapezoid; v) Circle measurements; vi) Circle area; vii) Roller rink (p. 616, #58).
E. Collect clickers.
F. In class practice: Page 155: #30, 31, 32, 33, 34, 35, 36.
G. Quiz 7. Students may use calculators. Students must write their answers, name, and ID number on both the quiz and the answer sheet. #2 pencils required.
H. Assignment: Read Sections 9.4 – 9.6. Complete Course Compass homework. Save session on Turning Point®.

Week 13 Class 1
Lesson Sequence (9.4 – 9.6)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Metric system; ii) Metric units; iii) Area of a parallelogram; iv) Area of a trapezoid; v) Circle measurements; vi) Circle area; vii) Roller rink (p. 616, #58).
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Volume; ii) Angles; iii) Angle types; iv) Triangle types; v) Square root; vi) Pythagorean Theorem; vii) Applications.
E. Collect clickers.
F. In class practice: Page 676: #17, 20-25.
G. Assignment: Read Section 10.1. Complete Course Compass homework. Save session on Turning Point®.

Week 13 Class 2
Lesson Sequence (10.1)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Volume; ii) Angles; iii) Angle types; iv) Triangle types; v) Square root; vi) Pythagorean Theorem; vii) Applications.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Addition of polynomials; ii) Subtraction of polynomials.
E. Collect clickers.
F. In class practice: Page 687: evens #12-20.
G. Quiz 8. Students may use calculators. Students must write their answers, name, and ID number on both the quiz and the answer sheet. #2 pencils required.
H. Assignment: Read Section 10.2. Complete Course Compass homework. Save session on Turning Point®.

Week 14 Class 1
Lesson Sequence (10.2)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Addition of polynomials; ii) Subtraction of polynomials.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Multiplying monomials; ii) Product rule for exponents; iii) FOIL.
E. Collect clickers.
F. In class practice: Page 701: #1-16.
G. Assignment: Complete ALL Course Compass homework. Save session on Turning Point®.
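An added FOIL illustration (not from the original lesson; the binomials are invented), in LaTeX notation:

\[ (x + 2)(x + 3) = \underbrace{x^2}_{\text{First}} + \underbrace{3x}_{\text{Outer}} + \underbrace{2x}_{\text{Inner}} + \underbrace{6}_{\text{Last}} = x^2 + 5x + 6. \]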

Week 14 Class 2
Lesson Sequence – Solving word problems
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Multiplying monomials; ii) Product rule for exponents; iii) FOIL.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Writing scientific notation; ii) Negative exponents; iii) Solving word problems.
E. Collect clickers.
F. In class practice: Page 723: #16-20.
G. Assignment: Complete ALL Course Compass homework by the last class. Save session on Turning Point®.

Week 15 Class 1
Lesson Sequence (Working with polynomials)
A. Sign onto Turning Point® Lesson and pull up participant list.
B. Students sign in and pick up their assigned clicker.
C. Rapid Fire Clicker review (approximately 20-25 minutes). Control group reviews questions from the following areas as needed: i) Rules for multiplication of negative numbers; ii) Rules for division of negative numbers; iii) Powers of integers; iv) Working with equivalent expressions; v) Using the distributive law.
D. Material to be covered in lesson (approximately 20-25 minutes), using board and PP lesson: i) Combining like terms; ii) Perimeter; iii) Equivalent equations.
E. Collect clickers.
F. Give the clicker questionnaire to experimental classes.
G. Assignment: Complete ALL Course Compass homework by the last class. Save session on Turning Point®.

Week 15 Class 2
Lesson Sequence – Review for final exam based on students' questions; review session.
1. Instructions: Please ask students to use a #2 pencil; they may use a calculator if they like for this test. Students should enter their names and student ID numbers on the answer sheet.
2. Administer the posttest. Students may leave when done.


APPENDIX C. PRE/POST TEST MATH 020 – Pre-Algebra

The exam begins on the next page.


PRE/POST TEST: These questions were taken from the Pearson databank for Pre-algebra, Bittinger, Marvin L., Ellenbogen, David, and Johnson, Barbara L., 5th edition, Addison Wesley.
MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. Use (E) if unable to solve.

Add. Use (E) for Unable to Solve.
1) [addends omitted]   A) 12,174   B) 12,264   C) 13,274   D) 12,274

Subtract. Use (E) for Unable to Solve.
2) 959 [second operand omitted]   A) 595   B) 1223   C) 695   D) 687

Multiply. Use (E) for Unable to Solve.
3) [factors omitted]   A) 1,198,280   B) 1,199,280   C) 1,199,380   D) 1,209,280

Find the area of the region. Use (E) for Unable to Solve.
4) [figure omitted; one labeled side: 248 ft]   A) 15,376 sq ft   B) 15,366 sq ft   C) 15,386 sq ft   D) 620 sq ft

Divide. Use (E) for Unable to Solve.
5) [expression omitted]   A) 183   B) 183 R 25   C) 186 R 5   D) 186 R 34

Solve the following. Use (E) for Unable to Solve.
6) x + 224 = 790   A) 176,960   B) 566   C) 576   D) 1014
7) David's company has to ship 3850 boxes of sprinklers. If a truck can hold 550 boxes, how many truckloads does he need to ship all the boxes?   A) 6 truckloads   B) 7 truckloads   C) 5 truckloads   D) 8 truckloads

Find the average. Use (E) for Unable to Solve.
8) Monthly checking account fees: $17, $13, $7, $11, $5, $3, $7   A) $7   B) $8   C) $11   D) $9

Add. Use (E) for Unable to Solve.
9) -25 + 31   A) 56   B) 6   C) -6   D) -56

Subtract. Use (E) for Unable to Solve.
10) -16 - (-17)   A) -1   B) 1   C) -33   D) -16

Multiply. Use (E) for Unable to Solve.
11) 12 · (-8)   A) 4   B) -4   C) 96   D) -96

Divide. Use (E) for Unable to Solve.
12) -180 ÷ 9   A) 30   B) -30   C) -20   D) 20

Simplify. Use (E) for Unable to Solve.
13) 4[-3 + 4(-8 + 3)]   A) -20   B) -24   C) -32   D) -92

Solve the following problems. Use (E) for Unable to Solve.
14) Logan sold 5 shares of stock for $31.60 each. What was the total amount of the sale?   A) $158.00   B) $158.11   C) $157.90   D) $158.10
15) Jeff borrowed $1070 from his brother. Jeff's sister wants 12 monthly payments of $100 to repay the loan. How much extra is Jeff's sister charging for the loan?   A) $1200   B) $130   C) $1300   D) $30

Simplify. Use (E) for Unable to Solve.
16) [expression omitted]   A) 25   B) -7   C) 1   D) 5

Find all the factors of the number. Use (E) for Unable to Solve.
17) 56   A) 1, 2, 4, 7, 8, 14, 18, 28, 56   B) 1, 2, 4, 7, 8, 14, 28, 56   C) 2, 4, 7, 8, 14, 28   D) 1, 2, 3, 4, 7, 8, 14, 18, 28, 56

Find the prime factorization of the number. Use (E) for Unable to Solve.
18) 259   A) 7 · 35   B) 7 · 37   C) 7 · 7   D) 7 · 7 · 37

Multiply. Use (E) for Unable to Solve.
19) [fraction expression and answer choices omitted]

Divide and simplify. Use (E) for Unable to Solve.
20) [fraction expression and answer choices omitted]

Solve. Use (E) for Unable to Solve.
21) Deborah's water bottle can hold 5/7 L. When she starts on her bicycle race, her water bottle is 2/3 full. How many liters of water does she have? [answer choices omitted]

Find another name for the given number, but with the denominator indicated. Use (E) for Unable to Solve.
22) [expression and answer choices omitted]

Simplify. Use (E) for Unable to Solve.
23) [expression omitted]   A) 7   B)–D) [choices omitted]

Solve. Use (E) for Unable to Solve.
24) 7p/4 = 105; therefore, p =   A) 19   B) 105   C) 60   D) 184

Find the least common multiple of the set of numbers. Use (E) for Unable to Solve.
25) 9, 12   A) 36   B) 12   C) 21   D) 108

Add and simplify. Use (E) for Unable to Solve.
26) [expression and answer choices omitted]
27) [expression and answer choices omitted]

Subtract and simplify. Use (E) for Unable to Solve.
28) [expression and answer choices omitted]

Solve and simplify. Use (E) for Unable to Solve.
29) [equation and answer choices omitted]

Solve. Use (E) for Unable to Solve.
30) Zory has 20/36 yards of canvas from which she is cutting strips. She has cut 12/36 yards already. How many yards of the canvas are left? [answer choices omitted]

Subtract. Write a mixed numeral for the answer. Use (E) for Unable to Solve.
31) [operands omitted]   A) 13 2/21   B) 13 [fraction omitted]   C) 12 4/7   D) 13 4/7

Divide. Write a mixed numeral for the answer. Use (E) for Unable to Solve.
32) [expression and answer choices omitted]

Round to the nearest tenth. Use (E) for Unable to Solve.
33) 8.942   A) 8.9   B) 8.94   C) 8.8   D) 9.0

Subtract. Use (E) for Unable to Solve.
34) [operands omitted]   A) 93.23   B) 92.73   C) 100.77   D) 93.33

Solve. Use (E) for Unable to Solve.
35) 17.461 + m = 29.009; therefore, m =   A) 11.541   B) 9.578   C) 11.548   D) 11.638

Multiply. Use (E) for Unable to Solve.
36) [operands omitted]   A) 79.2   B) 7.92   C) 792   D) 0.792

Divide. Use (E) for Unable to Solve.
37) [operands omitted]   A) 6.9   B) 7.1   C) 6   D) 7

Solve the problem. Use (E) for Unable to Solve.
38) Patrick's subtotal at Scramble's Electronics is $12.77. The sales tax on these items is $1.69. What was Patrick's total bill?   A) $14.46   B) $14.36   C) $13.46   D) $13.36

Solve. Use (E) for Unable to Solve.
39) 402.9 = 23.7 · x   A) 170   B) 17.0   C) 1.7   D) 18

Solve the problem. Use (E) for Unable to Solve.
40) A rectangular garden measures 9.8 feet by 55.2 feet. What is its area in square feet?   A) 5409.6   B) 108.19   C) 1081.9   D) 540.96

38)  _______ 

39)  _______ 

40)  _______ 

  Find fractional notation for the ratio. You need not simplify.  Use (E) for Unable to    Solve. 41)  In a three-point shooting contest, Maria attempted 20 shots and made   11 of them.  What is the ratio of shots made to shots attempted?   A)   B)   C)   D)                  Find the rate as a ratio of distance to time. Use (E) for Unable to Solve. 42)    60 mi,  6 hr                          

 60

41)  _______ 

42)  _______ 

                

  Find the indicated rate. Use (E) for Unable to Solve. 43)  Lauren ran 1105 meters in  3.25 minutes.  What was her rate in meters  per minute?   A)  1105 meters/min    B)  350 meters/min  C)  3491.25 meters/min   D)  340 meters/min        Solve the problem. Use (E) for Unable to Solve. 44)  An 8-oz bottle of hair spray costs $ 3.70. Find the unit price in cents per  ounce.   A)  46.25 cents/oz   B)  29.6 cents/oz  C)  37 cents/oz   D)  0.4625 cents/oz   

43)  _______ 

44)  _______ 

    Solve. Give your answer as a mixed number if appropriate.   Use (E) for Unable to Solve. 45)   

45)  _______ 

   

 

               C) 45           

 

      Use a proportion to solve the problem. Use (E) for Unable to Solve. 46)  In the rectangles below, the ratio of length to width is the same. Find the  length of the larger rectangle. 

 

142

   46)  

_______  A)  13 ft   B)  14 ft   C)  15 ft   D)  16 ft    Determine which purchase has the lower unit price. Use (E) for Unable to Solve. 47)  Remember 1 lb. = 16 oz  47)  _______  Brand A:  29 oz for $ 8.99  Brand B:  2 lb,  8 oz for $ 11.60   A)  Brand A   B)  Brand B C)  Equal value   D)  Not enough information       Solve. Give your answer as a mixed number or fraction if appropriate.   Use (E) for Unable to Solve.    48) 48)________                                                     

 

          Find decimal notation. Use (E) for Unable to Solve. 49)   49)  _______    A  6425  B)  6.425   C)  0.6425   D)  0.06425    Solve. Use (E) for Unable to Solve. 50)    50)  _______    Enrollment in a business seminar increased from  55 people to  76  people.  What was the percent of increase?   A)  72.4%     B)  27.6%     C)  38.2%   D)  61.8%        Find percent notation for the number in decimal notation. Use (E) for Unable to Solve. 51)    51)  _______  0.00 4 of all math majors at a certain university double major in music.   A)  4%   B)  0. 4%   C)  0.0 4%   D)  4                  

143

Find the simple interest. Round your answer to the nearest cent. Use (E) for Unable to Solve.
52) Principal = $2900; interest rate = 7.8%   [time period omitted]   A) $188.50   B) $18.85   C) $226.20   D) $1.89

Find percent notation for the number in decimal notation. Use (E) for Unable to Solve.
53) Sales this year were 5.2 times last year's sales.   A) 5.2%   B) 520%   C) 52%   D) 0.052%

Solve the problem. Use (E) for Unable to Solve.
54) In a clinical study, 20 of the 1000 subjects receiving a migraine medication developed side effects. What percentage developed side effects?   A) 1%   B) 2%   C) 4%   D) 14%
55) Mary borrows $7000 and agrees to pay it back in 7 years. If the simple interest rate is 14%, find the total amount she pays back.   A) $6860.00   B) $13,860.00   C) $68,600.00   D) $75,600.00

Solve. Use (E) for Unable to Solve.
56) In 2007 the snowfall was 20 inches. Calculate the percent increase in 2008, if that year had a snowfall of 30 inches.

Factor.
57) 20a - 6 =   A) 2(10a - 3)   B) 4(5a - 2)   C) 2(10a - 6)   D) 5(4a - 1)

Simplify. Use (E) for Unable to Solve.

58) [expression omitted]   A) 27   B) 29   C) 58   D) 420.5
59) [expression omitted; it involves the numbers 289 and 16]   A) 17.4   B) 21   C) 152.5   D) 174

Approximate to three decimal places. Use (E) for Unable to Solve.
60) [expression omitted]   A) 24.494   B) 24.496   C) 24.495   D) 24.485

Subtract and approximate to the tenths. Use (E) for Unable to Solve.
61) √57 - √12   A) 4.085   B) 4.09   C) 4.1   D) 4.0

Find the length of the third side of the right triangle. Use (E) for Unable to Solve.
62) [figure omitted]   A) b = 20 km   B) b = 24 km   C) b = 18 km   D) b = 25 km

63) [figure omitted]   A) c = [exact form omitted] cm; c ≈ 10.296 cm   B) c = 14 cm   C) c = [exact form omitted] cm; c ≈ 8.124 cm   D) c = [exact form omitted] cm; c ≈ 7.483 cm

Solve the problem. Give an exact answer and an approximation to the nearest tenth. Use (E) for Unable to Solve.
64) A boat travels 3 mi south and then 9 mi east. How far is the boat from its starting point? (Where A = 3 mi and B = 9 mi.)   A) 12 mi   B) 27 mi   C) 36 mi   D) 90 mi

Formula Sheet
Area of a rectangle = length × width
Area of a circle = 3.14 × radius squared
Area of a triangle = 1/2 × base × height
Area of a trapezoid = 1/2 (a + b) × height
Simple interest = principal × rate × time
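A worked example applying the simple-interest line of this formula sheet (an editorial addition; the figures are invented for illustration), in LaTeX notation, for a principal of $1,200 at 5% per year over 3 years:

\[ I = P \times r \times t = 1200 \times 0.05 \times 3 = \$180. \]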
