Higdon, Reyerson, McFadden & Mummey (2011). Twitter, Wordle, and ChimeIn as Student Response Pedagogies.

Higdon, J., Reyerson, K., McFadden, C., & Mummey, K. (2011). Twitter, Wordle, and ChimeIn as Student Response Pedagogies. EDUCAUSE Quarterly, 34(1). Retrieved from http://www.eric.ed.gov/ERICWebPortal/detail?accno=EJ925904

“… cloud-based student response systems [SRS] …” (¶ 1)

[Twitter …]

“… the short-form response would encourage an economy of thought, rather than lengthy rambles. Finally, the persistence of the tweets, and the Web 2.0 capabilities of RSS, made it a great candidate for allowing us to mine the data in post-class discussion and word clouds.” (¶ 5)

“Cloud computing, as we use it here, refers to applications and data that exist on the Internet and are generally accessible from a number of devices. The end users (in this case, faculty and students) don’t need to know where the data are stored — they simply have to remember the access point to retrieve the data.” (¶ 6)

“… backchannel — the ongoing, co-constructed, meta-content discussion that can accompany live demonstrations of nearly any type …” (¶ 7)

“… word clouds — visual representations of text that emphasize and deemphasize words, excluding articles and prepositions, by showing their relative size given the frequency with which they appear in the text.” (¶ 8)
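The sizing rule the authors describe (relative size by frequency, with articles and prepositions excluded) can be sketched in a few lines of Python. The stopword list and the linear size scale here are illustrative assumptions, not the article's actual implementation.

```python
import re
from collections import Counter

# A stand-in for the articles and prepositions the authors exclude (assumption).
STOPWORDS = {"a", "an", "the", "of", "in", "on", "to", "for", "with", "at", "by"}

def word_weights(text, min_size=10, max_size=48):
    """Map each content word to a font size proportional to its frequency."""
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    counts = Counter(words)
    if not counts:
        return {}
    peak = counts.most_common(1)[0][1]
    # Scale linearly between min_size and max_size by relative frequency.
    return {w: min_size + (max_size - min_size) * c / peak for w, c in counts.items()}

sizes = word_weights("the history of the medieval city shaped the history of trade")
# "history" appears most often, so it receives the largest size; "the" and "of" are dropped.
```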

“… we set up 10 course-based Twitter accounts for use in the class exercises. We ‘followed’ our 10 accounts to one another and then made the usernames and passwords available … [sign-in using] university ID … Finally, we set the class Twitter accounts so that only approved people could follow us, and we disallowed people finding us through Twitter search to avoid unwanted Twitter crowdsourcing of our course discussion and to encourage free discussion among learners.” (¶ 12)

“We saw persistent activity among the students on Twitter, but we also wanted to determine if that activity was thoughtful. This distinction has been articulated as two dimensions of motivation: persistence and mental effort. Mental effort is particularly challenging to measure, as it is a multifaceted construct affected by a number of intrinsic and extrinsic factors, including prior experience, cognitive load, and physiological responses to psychological stimuli such as stressors related to feelings of efficacy, attributions for success, and a host of other factors. … we decided to use proxy indicators based loosely on Salomon’s amount of invested mental effort (AIME) model.” (¶ 28)


“… a simple attendance/participation grade was provided to learners who participated in a simple check (tweets not considered thoughtful by the instructors), check plus (tweets considered thoughtful by the instructors), check minus (did not tweet) system.” (¶ 35)

“A clear and consistent, if slight, majority of the students did not seem to see the learning or motivational benefits of the Twitter exercise that the instructional and evaluation teams observed.” (¶ 39)

“… update dynamically as students are chiming. This allows the bar graph and pie chart or word cloud to update as new responses come in … In addition to dynamically generating the word cloud for text-based responses, individual words can be selected from the cloud, and the text-feed will auto-sort … .” (¶ 45)
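The auto-sorting of the text feed when a cloud word is selected can be modeled as a stable partition of the responses. The function name and list-of-strings shape are assumptions for illustration, not ChimeIn's actual interface.

```python
def filter_feed(responses, selected_word):
    """Move responses containing the selected cloud word to the top of the
    feed, preserving original order within each group (sorted() is stable)."""
    key = selected_word.lower()
    return sorted(responses, key=lambda r: key not in r.lower().split())

feed = filter_feed(["trade routes", "the plague", "guild trade"], "trade")
# → ["trade routes", "guild trade", "the plague"]
```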

“We have added the ability to simply right-click and exclude any word from the word cloud, allowing the cloud to redraw itself with the remaining, content-based, words.” (¶ 54)
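The exclude-and-redraw behavior amounts to removing a word from the frequency table and rescaling what remains. This is a minimal sketch under that assumption; the names are hypothetical.

```python
def exclude_and_redraw(counts, excluded):
    """Drop excluded words, then rescale the remaining frequencies to
    relative weights in (0, 1] so the cloud can redraw proportionally."""
    remaining = {w: c for w, c in counts.items() if w not in excluded}
    if not remaining:
        return {}
    peak = max(remaining.values())
    return {w: c / peak for w, c in remaining.items()}

cloud = exclude_and_redraw({"lecture": 5, "really": 4, "medieval": 3}, {"really"})
# → {"lecture": 1.0, "medieval": 0.6}
```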

“We are also exploring the option of allowing learners to add a related efficacy indicator to their responses, indicating how sure they are of the answer …” (¶ 55)

“… allowing learners to draw connections between and among items … as well as adding relative strength-of-connection indicators to those connections. … idea of clustering individual words in the cloud into categories and subcategories to see the relative strength of those categories in the cloud.” (¶ 57)
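The proposed clustering of cloud words into categories can be sketched as summing word frequencies under a word-to-category mapping. The mapping itself would be user-supplied; everything named here is an assumption, not a described ChimeIn feature.

```python
def category_strengths(counts, categories):
    """Sum word frequencies by category to compare the relative strength
    of each category in the cloud. `categories` maps word -> category label."""
    totals = {}
    for word, count in counts.items():
        cat = categories.get(word)
        if cat is not None:
            totals[cat] = totals.get(cat, 0) + count
    return totals

strengths = category_strengths(
    {"trade": 3, "guild": 2, "plague": 4},
    {"trade": "economy", "guild": "economy", "plague": "health"},
)
# → {"economy": 5, "health": 4}
```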

“We are certainly sympathetic to student complaints that they found the Twitter activity distracting. And yet — we observed more student engagement when the Twitter project was included.” (¶ 63)

“Another possible explanation is that students are simply unfamiliar with, uncomfortable with, and/or unconvinced that they should have to engage in this type of ‘culture of learning.’” (¶ 64)

“No pedagogy is perfect. Asking students to engage in any activity, by definition, pulls them away from other activities, some of which might be as or more productive for their learning than the ones the instructor has crafted. … Further, the somewhat opportunistic use of technology platforms not designed for these specific types of activities …” (¶ 65)

“… instructors should be aware that students might resist instructional shifts that ask them to become more engaged. Similarly, technology should be invoked in a thoughtful way to minimize learner distraction; a focus on ensuring that tools and activities serve student learning must be tantamount.” (¶ 68)

Selected references

  • Dunlap, J. C., & Lowenthal, P. R. (2009). Horton Hears a Tweet. EDUCAUSE Quarterly, 32(4). Retrieved from http://www.eric.ed.gov/ERICWebPortal/detail?accno=EJ877812
  • Oblinger, D. G., & Oblinger, J. L. (Eds.). (2005). Educating the net generation. EDUCAUSE e-Books. Retrieved from http://www.educause.edu/educatingthenetgen/5989
  • Are they students? Or “customers”? (2010, January 3). The New York Times. Retrieved from http://roomfordebate.blogs.nytimes.com/2010/01/03/are-they-students-or-customers/