Top 12 Best Practices for Clickers in the Classroom


I recently had the opportunity to write a whitepaper for DYMO/Mimio Interactive Teaching Technologies about the use of high-tech Student Response Systems (aka “clickers”) in the classroom. In the interest of full disclosure: DYMO/Mimio, a Newell Rubbermaid company, is my former employer.

When I was working in the educational technology hardware industry, one of the most common questions I was asked was some variation of “Can you show me research that says that if I put this equipment in my classroom, my students’ outcomes will improve?” It was a question I got not just from customers but from colleagues as well. It was a fascinating and shocking question to me. The idea that adding a chunk of metal and plastic to a classroom would suddenly, magically transform academic performance was one that people so badly wanted to be true…an educational panacea…what better way to solve the education problem AND generate sales?

Of course, as a learning scientist, I have spent my career studying how to arrange instruction and the instructional environment to generate optimal learner outcomes. And over the years I had yet to see a film projector, overhead projector, DVD player…or yes, even an interactive whiteboard (IWB), laptop, tablet, or student response system…that, by its very addition to the classroom, produced meaningful change in learner performance. My instructional design colleagues and I scratched our heads…it was a mystery to us that some didn’t recognize this hardware for what it was…a set of tools whose meaningful use is utterly dependent upon effective instructional design and curriculum.

When research about the positive impact of using Student Response Systems started appearing more frequently in the literature, you can imagine the excitement in the industry. But an understanding of why those effects were being produced…of which instructional practices allow SRS devices to be used most effectively…was still all but absent. That challenge was the impetus for this whitepaper: to say, “Yes, using devices in the classroom can impact learner outcomes…BUT the manner in which the device is implemented instructionally is what determines the success.”

I’ve excerpted my top 12 best practice recommendations from the whitepaper here, as well as the set of 65 references I culled from the literature. Happy reading, and please share your experiences using clickers in your own classroom! What’s worked best for you?

Top 12 Best Practices for Clickers in the Classroom

1. Remember that the primary use of an SRS should be for formative assessment. Increasing opportunities to evaluate student performance allows real-time adjustment of instruction.

2. Include only those questions that are pertinent to the targeted student learning outcomes; questions that are arbitrary or irrelevant should not be used.

3. Integrate questions throughout the lesson so that student understanding can be evaluated frequently and regularly. Leaving all questions until the end of the lesson does not allow for changing the instruction along the way.

4. Endeavor to write questions that target some of the higher-level skills described by Bloom’s Taxonomy (Pear et al., 2001). Multiple-choice questions are not restricted to low-level skills if written properly.

5. When working on new skill acquisition, include enough questions with novel examples to ensure that students are getting sufficient practice and generalization opportunities.

6. Be careful not to give away an answer through irrelevant cues, such as a pattern of correct answers or the wording of the question.

7. If you include items in which the student must identify the answer that does NOT belong, write the word “NOT” in all capital letters and in bold, so that it is as clear as possible.

8. Ensure that the correct answer is clearly the best one, but do use plausible distracters. The point is not to trick the learners. The point is to use the questions to evaluate the instruction the learners have received.

9. When using Vote-Discuss-ReVote methods in class, do not show graphs of the student response distribution following the first vote, in order to avoid biased response shifting.

10. Make sure you review and analyze the data after the class is over. By examining the patterns of what worked and what did not, you can improve the instruction for next time! (For one way to do this with exported data, see the sketch after this list.)

11. If you want to increase attendance in your class, use the SRS daily.

12. Be willing to throw out or regrade questions that are unclear.
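To make practice #10 concrete, here is a minimal sketch of the kind of after-class analysis I have in mind, written in Python. It assumes a hypothetical CSV export with question_id and is_correct columns; the column names, the file name, and the 60% re-teach threshold are all illustrative, not taken from any particular vendor’s format.

```python
import csv
from collections import defaultdict

RETEACH_THRESHOLD = 0.60  # illustrative cutoff, not a research-based constant

def summarize_responses(path):
    """Tally per-question correct-response rates from a clicker data export."""
    totals = defaultdict(int)   # question_id -> number of responses
    correct = defaultdict(int)  # question_id -> number of correct responses

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            qid = row["question_id"]
            totals[qid] += 1
            if row["is_correct"].strip().lower() in ("1", "true", "yes"):
                correct[qid] += 1

    return {qid: correct[qid] / totals[qid] for qid in totals}

if __name__ == "__main__":
    # "clicker_export.csv" is a hypothetical file name
    rates = summarize_responses("clicker_export.csv")
    for qid, rate in sorted(rates.items(), key=lambda item: item[1]):
        flag = "  <- revisit next lesson" if rate < RETEACH_THRESHOLD else ""
        print(f"Question {qid}: {rate:.0%} correct{flag}")
```

Questions that most of the class missed point to instruction worth revising; questions that everyone aced may be candidates for harder, higher-level replacements (practice #4).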

To download and read the entire whitepaper, please visit THE Journal: http://thejournal.com/whitepapers/2012/03/dymomimio_student-response-systems-improve-outcomes.aspx

References

1. Babcock, R.A., Sulzer-Azaroff, B., Sanderson, M., & Scibak, J. (1992).  Increasing nurses’ use of feedback to promote infection-control practices in a head-injury treatment center.  Journal of Applied Behavior Analysis, 25(3), 621-627.

2. Balajthy, E. (1984). Using student-constructed questions to encourage active reading. Journal of Reading, 27(5), 408-411.

3. Barnett, J. (2006).  Implementation of personal response units in very large lecture classes: Student perceptions.  Australasian Journal of Educational Technology, 22(4), 474-494.

4. Beatty, I.D. (2004). Transforming student learning with classroom communication systems.  Educause Center for Applied Research, Research Bulletin, 3, 1-13.

5. Beatty, I.D. & Gerace, W.J. (2009).  Technology-enhanced formative assessment: A research-based pedagogy for teaching science with classroom response technology.  Journal of Science Education Technology, 18, 146-162.

6. Bekker, M.J., Cumming, T.D., Osborne, N.K.P., Bruining, A.M., McClean, J.I., & Leland, L.S. (2010). Encouraging electricity savings in a university residential hall through a combination of feedback, visual prompts, and incentives.  Journal of Applied Behavior Analysis, 43(2), 327-331.

7. Boyer, E., Miltenberger, R.G., Batsche, C., & Fogel, V. (2009).  Video modeling by experts with video feedback to enhance gymnastics skills.  Journal of Applied Behavior Analysis, 42(4), 855-860.

8. Brobst, B. & Ward, P. (2002). Effects of public posting, goal setting, and oral feedback on the skills of female soccer players.  Journal of Applied Behavior Analysis, 35(3), 247-257.

9. Burnstein, R.A. & Lederman, L.M. (2001). Using wireless keypads in lecture classes.  The Physics Teacher, 39, 8-11.

10. Burnstein, R.A. & Lederman, L.M. (2003). Comparison of different commercial wireless keypad systems.  The Physics Teacher, 41, 272-275.

11. Caldwell, J., Zelkowski, J., & Butler, M. (2006). Using personal response systems in the classroom. WVU Technology Symposium, April 11, 2006, Morgantown, WV.

12. Caldwell, J.E. (2007).  Clickers in the large classroom: Current research and best-practice tips.  Life Sciences Education, 6(1), 9-20.

13. Cooke, N. L., Heron, T. E., & Heward, W. L. (1983). Peer tutoring: Implementing classwide programs in the primary grades. Columbus, OH: Special Press.

14. Cossairt, A., Hall, R.V., & Hopkins, B.L. (1973).  The effects of experimenter’s instructions, feedback, and praise on teacher praise and student attending behavior.  Journal of Applied Behavior Analysis, 6(1), 89-100.

15. Crouch, C.H. & Mazur, E. (2001).  Peer instruction: ten years of experience and results.  American Journal of Physics, 69(9), 970.

16. Cue, N. (1998). A universal learning tool for classrooms? Proceedings of the First Quality in Teaching and Learning Conference, December 10-12, 1998, Hong Kong SAR, China.

17. Cutts, Q., Kennedy, G., Mitchell, C., & Draper, S. (2004). Maximizing dialogue in lectures using group response systems. Presented at the 7th IASTED International Conference on Computer and Advanced Technology in Education, August 16-18, 2004, Hawaii [accessed 30 January 2012]. www.dcs.gla.ac.uk/~quintin/papers/cate2004.pdf

18. d’Inverno, R., Davis, H., & White, S. (2003).  Using a personal response system for promoting student interaction.  Teaching Mathematics and its Applications, 22(4), 163-169.

19. Delquadri, J., Greenwood, C. R., Whorton, D., Carta, J. J., & Hall, R. V. (1986). Classwide peer tutoring. Exceptional Children, 52, 535-542.

20. Deslauriers, L., Schelew, E., & Wieman, C. (2011).  Improved learning in a large-enrollment physics class.  Science, 332, 862-864.

21. Draper, S.W., Cargill, J., & Cutts, Q. (2002). Electronically enhanced classroom Interaction.  Australian Journal of Education Technology, 18(1), 13-23.

22. Draper, S.W. & Brown, M.I. (2004).  Increasing interactivity in lectures using an electronic voting system.  Journal of Computer Assisted Learning, 20(2), 81-94.

23. Dufresne, R.J., Gerace, W.J., Mestre, J.P. & Leonard, W.J. (2000).  ASK-IT/A2L: assessing student knowledge with instructional technology (Tech. Rep. dufresne-2000ask).  University of Massachusetts Amherst Scientific Reasoning Research Institute.

24. English, D. (2003).  Audiences talk back: Response systems fill your meeting media with instant data.  AV Video Multimedia Producer, 25(12), 22-24.

25. Fagen, A.P., Crouch, C.H. & Mazur, E. (2002). Peer instruction: results from a range of classrooms.  The Physics Teacher 40(4), 206-207.

26. Fink, W. T., & Carnine, D. W. (1975). Control of arithmetic errors using informational feedback and graphing. Journal of Applied Behavior Analysis, 8, 461.

27. Hake R. R. (1998). Interactive-engagement versus traditional methods: a six-thousand student survey of mechanics test data for introductory physics courses.  American Journal of Physics, 66(1), 64–74.

28. Harris, V. W., Bushell, D., Jr., Sherman, J. A., & Kane, J. F. (1975). Instructions, feedback, praise, bonus payments, and teacher behavior. Journal of Applied Behavior Analysis, 8, 462.

29. Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory.  The Physics Teacher, 30(3), 141-158.

30. Heward, W. L., Courson, F. H., & Narayan, J. S. (1989). Using choral responding to increase active student response during group instruction. Teaching Exceptional Children, 21(3), 72-75.

31. Heward, W. L., Gardner, R., Cavanaugh, R. A., Courson, F. H., Grossi, T. A., & Barbetta, P. M. (1996, Winter). Everyone participates in this class. Teaching Exceptional Children, 5-10.

32. Johnson, D., & McLeod, S. (2004). Get answers: Using student response systems to see students’ thinking. Learning & Leading With Technology, 32(3), 2-8.

33. Judson, E. & Sawada, D. (2002).  Learning from past and present:  Electronic response systems in college lecture halls.  Journal of Computers in Mathematics and Science Teaching, 21(2), 167-182.

34. Kline, C. S. (1986). Effects of guided notes on academic achievement of learning disabled high school students. Unpublished master’s thesis, The Ohio State University, Columbus.

35. Knight, J.K. & Wood, W.B. (2005). Teaching more by lecturing less. Cell Biology Education, 4, 298-310.

36. Kosiewicz, M.M., Hallahan, D.P., Lloyd, J. & Graves, A.W. (1982).  Effects of self-instruction and self-correction procedures on handwriting performance.  Learning Disability Quarterly, 5, 72-75.

37. Lane, D. & Atlas, R. (1996). The networked classroom. Paper presented at the 1996 meeting of Computers and Psychology, York, UK.

38. Lasry, N. (2008).  Clickers or flashcards: Is there really a difference?  The Physics Teacher, 46, 242-244.

39. Lovitt, T., Rudsit, J., Jenkins, J., Pious, C., & Benedetti, D. (1985). Two methods of adapting science material for learning disabled and regular seventh graders. Learning Disabilities Quarterly, 8, 275-285.

40. Martin, T.L., Pear, J.J., & Martin, G.L. (2002).  Feedback and its effectiveness in a computer-aided personalized system of instruction course.  Journal of Applied Behavior Analysis, 35, 427-430.

41. Mazur, E. (1996). Are science lectures a relic of the past? Physics World, 9, 13-14.

42. Mazur, E. (1997). Peer instruction: A user’s manual. Upper Saddle River, NJ: Prentice Hall.

43. Mazur, E. (2009). Farewell, lecture? Science, 323, 50-51.

44. McDermott, L.C. & Redish, E.F. (1999). Resource Letter: PER-1: Physics education research. American Journal of Physics, 67(9), 755-767.

45. Munro, D.W. & Stephenson, J. (2009). The effects of response cards on student and teacher behavior during vocabulary instruction. Journal of Applied Behavior Analysis, 42, 795-800.

46. Narayan, J.S., Heward, W.L., Gardner, R., & Courson, F.H. (1990).  Using response cards to increase student participation in an elementary classroom. Journal of Applied Behavior Analysis, 23(4), 483-490.

47. Nicol, D.J. & Boyle, J.T. (2003). Peer instruction versus class-wide discussion in large classes: a comparison of two interaction methods in the wired classroom.  Studies in Higher Education, 28 (4), 457-473.

48. Pear, J.J., Crone-Todd, D.E., Wirth, K., & Simister, H. (2001). Assessment of thinking levels in students’ answers. Academic Exchange Quarterly, 5 (4), 94-98.

49. Perez, K.E., Strauss, E.A., Downey, N., Galbraith, A., Jeanne, R., & Cooper, S. (2010). Does displaying the class results affect student discussion during peer instruction?  CBE Life Sciences Education, 9(2), 133-140.

50. Rantz, W.G., Dickinson, A.M., Sinclair, G.A., & Van Houten, R. (2009). The effect of feedback on the accuracy of checklist completion during instrument flight training. Journal of Applied Behavior Analysis, 42, 497-509.

51. Rantz, W.G. & Van Houten, R. (2011).  A feedback intervention to increase digital and paper checklist performance in technically advanced aircraft simulation.  Journal of Applied Behavior Analysis, 44(1), 145-150.

52. Reichow, B. & Wolery, M. (2011). Comparison of progressive prompt delay with and without instructive feedback. Journal of Applied Behavior Analysis, 44, 327-340.

53. Roschelle, J., Abrahamson, L. A., & Penuel, W. R. (2004a). Integrating classroom network technology and learning theory to improve classroom science learning: A literature synthesis. Paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA.

54. Roschelle, J., Penuel, W. R., & Abrahamson, A. L. (2004b). Classroom response and communication systems: Research review and theory.  Paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA.

55. Seaver, W.B. & Patterson, A.H. (1976).  Decreasing fuel-oil consumption through feedback and social commendation.  Journal of Applied Behavior Analysis, 9(2), 147-152.

56. Sindelar, P. T., Bursuck, W. D., & Halle, J. W. (1986).  The effects of two variations of teacher questioning on student performance. Education and Treatment of Children, 9, 56-66.

57. Smith, S. L., & Ward, P. (2006). Behavioral interventions to improve performance in collegiate football. Journal of Applied Behavior Analysis, 39, 385–391.

58. Smith, M.K., Wood, W.B., Adams, W.K., Wieman, C., Knight, J.K., Guild, N. & Su, T.T. (2009). Why peer discussion improves student performance on in-class concept questions.  Science, 323, 122-124.

59. Stallard, C. K. (1982). Computers and education for exceptional children: Emerging applications. Exceptional Children, 49(2), 102-104.

60. Trap, J.J., Milner-Davis, P., Joseph, S., & Cooper, J.O. (1978). The effects of feedback and consequences on transitional cursive letter formation.  Journal of Applied Behavior Analysis, 11, 381-393.

61. Tudor, R.M. & Bostow, D.E. (1991). Computer-programmed instruction: The relation of required interaction to practical application.  Journal of Applied Behavior Analysis, 24(2), 361-368.

62. Van Houten, R., Morrison, E., Jarvis, R., & McDonald, M. (1974). The effects of explicit timing and feedback on compositional response rate in elementary school children. Journal of Applied Behavior Analysis, 7, 547-555.

63. Van Houten, R., & Thompson, C. (1976). The effects of explicit timing on math performance. Journal of Applied Behavior Analysis, 9, 227-230.

64. Wood, W.B. (2004).  Clickers: a teaching gimmick that works.  Developmental Cell, 7(6), 796-798.

65. Yang, F. M. (1988). Effects of guided lecture notes on sixth graders’ scores on daily science quizzes. Unpublished master’s thesis, The Ohio State University, Columbus.

About karen mahon

I am a behavior and learning scientist. I hold an Ed.D. in educational psychology and am trained as an instructional designer. I have spent more than 15 years working in education and instructional software design.

12 Responses to Top 12 Best Practices for Clickers in the Classroom

  1. “Can you show me research that says that if I put this equipment in my classroom my students’ outcomes will improve?” This is the golden question, and an amazing one at that! Technology can either enhance or cripple a student’s learning. Thank you for this great article on response systems!

  2. This technology is a tool and is only as good as the individual using it. An ineffective teacher/facilitator will not “magically” become more effective just because he/she has an SRS (or other piece of equipment) in the classroom. I can go to the store and buy the most expensive circular saw and drill set. These don’t make me a master builder. If I am not a trained (and practiced) general contractor, then the tools are useless. It is because of this concept that initial and ON-GOING high-quality training and support must be part of any budget/purchase. If you are working with a salesperson or company that doesn’t have a proven track record of delivering high-quality training and professional development, I suggest that you look to another vendor.

  3. karen mahon says:

    Erin, interesting point and thanks for your input. I would focus less on the teacher being ineffective and more on the instructional methods being ineffective, I think. Teachers (and all of the rest of us, for that matter) are only as good as our training…which I think is what you are getting at. There is a distinction to be made, too, between knowing how to USE these devices versus knowing how to INTEGRATE them into effective instructional methods. I have not seen any hardware provider do a good job of the latter in professional development. Heck, I don’t even see vendors doing that with their own sales teams! I have walked onto booths on show floors where sales reps don’t even know what instructional objectives ARE, never mind how to best use technology in support of attaining them.

    Training and PD that focus on features of the devices, and not on the benefits of integration, are a net loss for schools. And I don’t think we can expect customers, pre-purchase, to be able to evaluate a vendor’s ability to address the integration piece. I don’t blame schools for not wanting to purchase ongoing training if they don’t know what that training will actually DO for them. And I think until hardware vendors have true instructional design and curriculum experts on staff there is no reason to believe that the training will produce anything other than teachers who know how to turn devices on and off, and how to use features. And as you know, Erin, that generally results in product that sits in boxes.

    I have heard representatives of hardware vendors say things like, “It’s not our job to tell teachers how to teach.” I would argue that it IS the vendor’s responsibility to educate themselves about the best practices for the integration and then advise, encourage and lead the teachers in the implementation so that their overall instruction is improved. That is what true customer service is all about.

  4. Missy says:

    For classrooms that are equipped with iPads, there is an app called Socrative that is a free SRS. Someone demonstrated it at a meeting last month. Seems easy to use.

    • karen mahon says:

      Missy, yes, I’ve heard of that app too. I think it’s a great idea for schools with iPad implementations. One of the reasons I love iPad implementations is their versatility. Using a free or inexpensive app makes so much sense…small up-front investment and easy to upgrade when newer, better apps come along. A nice, flexible solution for the classroom.

      Anyone out there using these in classrooms? What about Android clicker apps?

  5. Pingback: Best of 2012 – These are a few of my favorite things! | disrupt learning!

  6. rachelkarel says:

    Reblogged this on rachelkarel and commented:
    Great guidelines for using clickers!

  7. octopusmommy says:

    Hi Karen. This blog is so interesting to me as a teacher, a researcher, and now a new businesswoman. I decided to start a company here in Perú focused on technology applied to education because clickers and response systems caught my attention. A year ago my husband bought this equipment to use in a classroom; he is an engineer and teaches Risk Management in Projects, and I could see how versatile these devices are and how applicable to any type of education. I completely agree with your point that “There is a distinction to be made, too, between knowing how to USE these devices versus knowing how to INTEGRATE them into effective instructional methods,” because that is why I got into this new project. The main objective is not to sell a device; it is to offer the equipment and train teachers on the software and devices, but also on how to integrate them into their classes and take advantage of the information they provide. Teachers can also use this tool to see how well students are assimilating the content, make decisions in real time based on their students’ needs, reinforce material where necessary, and explain content in new ways through real situations and practical experience that make the education system dynamic and adapted to the needs of each learning group.
    With respect to Socrative, the institute where my husband teaches is starting to use that tool, and I think it is a great idea, but the problem is how distracting a mobile device can be for our students; just receiving an email or a message in the middle of class could interfere with the learning moment. Another disadvantage is the price difference between a dedicated response device and an iPad. Response devices come at different prices, and we also offer customers different plans, such as a personal device for each student or renting the equipment as part of their fee.
    I think the most important point is what advantages I can get from using these new tools in a classroom and, as a teacher, how much information I can get about my students; I can even give each of them, in just a click, a personal report that the software provides me. Other benefits are how stimulated students feel when all of them are being heard, and how organized I can be with my assessment system.
    I would like to reference your research on my next page.
