Imagine how devastating it would have been for the caveman who spent hours and hours perfecting fire by trial and error, only to find out that his friend from the cave next door had already done it and could have taught him. It was probably so devastating that that night’s mammoth didn’t even taste good. Imagine instead if he had known about fire and could have spent that time on some fire-related apparatus or a quicker way to start one. The mammoth would never have tasted better. The moral, although maybe deeply hidden in this analogy, is that it is always important to understand what ideas are already out there, so that any new work can fill gaps or expand on previous innovations.


When examining what other solutions exist for our problem of checking for understanding and gathering student feedback, we can immediately see at least one prior solution, the genesis for this idea: thumbs. Cheap, easy and battle tested.

Thumb Pros:

– $0

– No devices required

– Quick to execute. So quick

– Immediate


Thumb Cons:

– No anonymity. Students can easily be influenced by their peers’ thumb responses, or be afraid of judgement

– Slow to quantify results.

– No lasting record.

– Black and white. Students cannot offer feedback on what facets are troubling them most.


Now, what about the ol’ clicker? Put a question and potential answers on the board, and students select a, b, c or d. Results are then collated for the teacher. Originally this concept literally did use clickers: small remotes handed out to students to answer the question with. Since then, the idea has progressed to students logging on to a specific, provided internet address and answering on their own devices.
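The collation step described above is simple to picture in code. Here is a minimal sketch, assuming anonymous responses arrive as single letters; the option labels and sample responses are illustrative, not from any real clicker system.

```python
from collections import Counter

def collate(responses, options=("a", "b", "c", "d")):
    """Return per-option counts and percentages for a batch of clicker responses."""
    counts = Counter(r for r in responses if r in options)
    total = sum(counts.values())
    return {
        opt: {
            "count": counts[opt],
            "percent": round(100 * counts[opt] / total, 1) if total else 0.0,
        }
        for opt in options
    }

# Eight anonymous responses, collated for the teacher
results = collate(["a", "c", "a", "b", "a", "d", "c", "a"])
print(results["a"])  # {'count': 4, 'percent': 50.0}
```

The teacher sees only the aggregate, which is what gives this approach its improved anonymity over thumbs.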

Clicker Pros:

– Improved anonymity. Students can answer with less influence from peers

– Quick to collate and recorded for future teacher use

– Allows a greater level of feedback, more grey than black and white


Clicker Cons:

– Cost, if physical clickers are used. Clickers cost money, and students are hardly the most responsible people on earth; losing and breaking them are common occurrences.

– Slower. The teacher needs to put up a page, students need to type an address and take time to answer. It must predominantly be completed all at once, rather than ongoing.

– Still only limited options for response: one topic question and four answer choices.


The most sophisticated idea found in searching was ‘GoSoapBox,’ a tool that students access on their personal devices and that can be used to deliver quizzes, facilitate class discussion and help the teacher gather feedback from the class on preferred activities. Included in this tool is a ‘confusion barometer,’ a function that allows students to move a barometer to show their level of confusion as the class progresses.
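To make the barometer idea concrete, here is a minimal sketch of how such continuous feedback could work, assuming each student anonymously keeps a slider value up to date and the teacher reads a live class average. This is my own illustration, not GoSoapBox’s actual implementation.

```python
from statistics import mean

class ConfusionBarometer:
    """Track each student's latest confusion level and expose a class average."""

    def __init__(self):
        self._levels = {}  # anonymous session id -> confusion level (0-100)

    def report(self, session_id, level):
        """Record a student's latest slider position, clamped to 0-100."""
        self._levels[session_id] = max(0, min(100, level))

    def class_average(self):
        """Current average confusion across all students who have reported."""
        if not self._levels:
            return 0.0
        return mean(self._levels.values())

bar = ConfusionBarometer()
bar.report("s1", 20)
bar.report("s2", 80)
bar.report("s1", 40)  # s1 moves their slider; only the latest value counts
print(bar.class_average())
```

Because each report simply overwrites the previous one, students can respond continuously through the lesson rather than all at once, which is the barometer’s key advantage over a clicker question.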

Confusion Barometer Pros:

– Anonymous

– Continuous. Students can respond through the lesson

– Part of a greater teaching tool, so already available

– Quick to implement

– Collated and stored results


Confusion Barometer Cons:

– Requires a device

– Not completely targeted to learning goals.


This final idea is quite a solid one. A fire, in our caveman analogy. So what could we add, or what could we change, to take this idea up a level? I would suggest targeting the feedback mechanism to each class’s specific learning objectives, rather than offering general feedback on progress, and adding functions that allow students to report on their understanding of assessment even outside class time. How this will be applied will be explored as we move towards designing our digital solution.
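The suggested extension could be sketched by tying each confusion report to a specific learning objective, so the teacher sees which objective most needs reteaching. This is a hypothetical illustration of the design direction; the objective names below are made up.

```python
from collections import defaultdict
from statistics import mean

class ObjectiveFeedback:
    """Collect confusion reports per learning objective instead of in general."""

    def __init__(self):
        self._reports = defaultdict(dict)  # objective -> {session id: level}

    def report(self, session_id, objective, level):
        """Record a student's latest confusion level for one objective."""
        self._reports[objective][session_id] = max(0, min(100, level))

    def summary(self):
        """Average confusion per objective, most confusing first."""
        averages = {obj: mean(levels.values()) for obj, levels in self._reports.items()}
        return sorted(averages.items(), key=lambda item: item[1], reverse=True)

fb = ObjectiveFeedback()
fb.report("s1", "balance chemical equations", 75)
fb.report("s2", "balance chemical equations", 65)
fb.report("s1", "name ionic compounds", 20)
print(fb.summary()[0][0])  # the objective the class finds most confusing
```

Because reports are keyed by objective rather than lesson time, students could also submit them outside class, covering the second part of the suggestion.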


I would invite readers to suggest where they think development of these ideas could go. Or is one of these solutions already the peak of student feedback and understanding checks?