The Great Mobile Devices Debate: A False Binary (Part 3: Procedural Approaches for Structured Use of Technology)

Wright (2016) argues that technology-use articles published through 2016 had not addressed the varying contexts in which ICTs are used. He suggests, for example, that the differences between large lecture halls and smaller classrooms, as well as those among different content areas, are significant, such that technology use policies and procedures may not transfer among these different settings. I concur with Wright on this point, and suggest that he may not have gone far enough in recognizing the variance among contexts. That is, even two like-subject, like-size courses may have different goals, barriers, and needs related to ICTs (e.g., Internet-capable smartphones, laptops, tablets).

However, where specifics are difficult to transfer across contexts, principles for technology use may lend themselves to a policy and procedure framework that can be applied concretely in various higher education settings. To this point, I have identified and articulated support for two interrelated principles:

    1. The “extreme” positions of banning technology and of unstructured technology use are both known to fail across contexts. Therefore,
    2. ICTs must be used intentionally and appropriately to maximize effectiveness.

What is left is to articulate how an instructor (or department, etc.) may design ICT use so as to apply the principles. In this section, I will address that important question. 

A Framework for Success

Two education design frameworks may be utilized to shape this discussion: Universal Design for Learning (UDL) and Understanding by Design (UbD; also known as “backward planning”). What these two frameworks have in common is a recognition of the importance of first identifying a learning outcome or objective, and using this clearly explicated outcome to shape all other pedagogical and learning material decisions (Meyer, Rose, & Gordon, 2014; Wiggins & McTighe, 2005). A simple expression of the overlap between these two frameworks is a progressive design that flows from clearly stated student outcomes, to the development of assessment methods that demonstrate those outcomes (not a focal point in this discussion), to the identification of barriers diverse students may face in achieving (or demonstrating) those outcomes, to the intentional selection of instructional methods and materials that will enable students to minimize barriers and maximize accomplishment (and demonstration) of the learning outcomes (see Figure 3).

[Figure: backward design continuum from goals to assessments, barriers, and methods & materials]

Figure 3. Chronological design procedures using the Universal Design for Learning and Understanding by Design frameworks.
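For readers who think procedurally, the flow in Figure 3 can also be expressed as an ordered checklist. The following Python sketch is purely illustrative; the class name, fields, and example content are my own and are not part of either framework:

```python
# A purely illustrative sketch, assuming nothing beyond the prose above:
# the UDL/UbD backward-design flow modeled as an ordered checklist.
from dataclasses import dataclass, field


@dataclass
class LessonDesign:
    objective: str                                   # 1. clearly stated student outcome
    assessments: list = field(default_factory=list)  # 2. ways to demonstrate the outcome
    barriers: list = field(default_factory=list)     # 3. obstacles diverse students may face
    methods: list = field(default_factory=list)      # 4. methods/materials chosen to minimize barriers

    def is_complete(self) -> bool:
        """Backward design asks that each step be filled in, in this order."""
        return all([self.objective, self.assessments, self.barriers, self.methods])


# Hypothetical content drawn from the economics example later in this section.
design = LessonDesign(
    objective="Analyze and evaluate long-term effects of economic policies",
    assessments=["essay", "presentation", "economic model demonstration"],
    barriers=["missing vocabulary", "disengagement", "sensory impairments"],
    methods=["multiple means of representation", "structured response opportunities"],
)
print(design.is_complete())  # True once every step has been considered
```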

 

When using this framework for the purpose of implementing ICTs well, it is worth noting that the barriers I recommend instructors address are those that would occur without ICTs, rather than those caused by ICTs. The framework is used to create the kind of intentional ICT use that the literature shows mitigates ICT-caused problems. Thus, using the framework, we may simultaneously reduce negative ICT use and increase positive ICT use for the benefit of all. It is also worth noting that the analysis of goals, barriers, and strategies may lead to decisions in which non-technological solutions serve as well as or better than technology-based solutions. In those cases, intentionally not using technology for a given activity or learning experience may well be appropriate. Below, I break down the process.

Explicating objectives

According to Meyer and colleagues (2014), clear learning objectives “are the foundation of any effective curriculum” (p. 131). Among other things, research has shown that students are more likely to seek feedback and close the gap between their current understanding or skill and the target when they have clear, explicit learning objectives (e.g., Hattie & Timperley, 2007). Using the framework proposed herein, it is vital to begin by dwelling on the learning objectives before doing anything else. Note that for our purposes, objectives are not the same as broad course outcomes or discipline standards, though they are closely related inasmuch as instructor-created learning objectives should explicitly bring students closer to course outcomes and mastery of discipline standards.

An effective learning objective for a given lesson, unit, or class will be clear and explicit, and will separate how one achieves the outcome from the outcome itself. For example, in a 300-level economics course, the objective:

“students will write an essay in which they identify the long-term effect of specific economic policies on global financial markets” 

is ineffective because essay writing is unrelated to what the instructor actually hopes the students will achieve (analysis and evaluation). This mismatch is likely to cause confusion when the assessment doesn’t match the objective for which students prepared, and it may also hinder the instructor’s ability to make good decisions about materials and methods of instruction. Here is a more effective iteration:

“students will analyze and evaluate long-term effects of specific economic policies on global financial markets.” 

As you may have noticed, the major differences between these two objective statements are found in the key verbs (“write” and “identify” versus “analyze” and “evaluate”). In the ineffectively stated objective, the instructor entangled the learning objective with an unrelated skill (essay writing) that would not be taught in the course. Further, the verb “identify” signals a lower-order skill (Adams, 2015), which may not be appropriate for the rigorous expectations of an upper-division economics course. In contrast, the more effective objective directs higher-order cognitive tasks for the students, who will “analyze” and “evaluate” real-world scenarios using content learned in the lesson, unit, or course. It does not, however, specify how students will do so, and thus leaves open the opportunity for creative expression of this analysis and evaluation, which may take the form of an essay, presentation, economic model demonstration, etc.; any of these may be appropriate for students to demonstrate their skill in analyzing and evaluating using the course material.

Identifying barriers

Once clear, specific, disentangled objectives are identified, the next step is to determine what kinds of barriers diverse students may experience in achieving the stated objective. For example, in the economics class in which the above learning objective may be couched, it is possible and probable that some students lack the fundamental economic vocabulary necessary to comprehend and compose economic texts (in whatever form). It is possible and probable that some students will find the material dry and disengaging, which makes learning far more difficult if not impossible (Meyer et al., 2014). It is possible and probable that some students will find abstract application of economic concepts difficult and will need cognitive supports until they master the skill. It is further likely that at some point, students with sensory impairments (long-term or temporary) will take the class and will struggle with traditional forms of information presentation (e.g., lecture, single-screen PowerPoint display).

One need not have a class roster or be given a letter from the office of disability services to recognize the possibility and/or probability of these kinds of barriers manifesting. For support in considering barriers, please see the “diversity profiles” document found in Appendix A. 

Identifying methods and materials

Here we come full circle to the use of ICTs. While it may seem that we have taken an unnecessarily long route to arrive at their use, I argue that this intentional design path is the best way to achieve the structured use of ICTs that not only reduces distraction, but maximizes the effect that ICTs, or any other tool, pedagogical method, or material, may have on supporting diverse students’ achievement of the learning goals.

Continuing the example of the 300-level economics class, let’s consider that the goal is to help students develop the capacity to analyze and evaluate real-world economic scenarios. To accomplish this meaningfully, students may have to overcome disengagement, cognitive difficulty, and/or sensory impairments (e.g., visual impairments). Is it possible to intentionally use technology during the lecture in such a way as to address all three barriers and increase the quality of instruction toward the stated objective? Yes! One solution in this case may be the use of Nearpod.

Nearpod enables instructors to create slide decks much like PowerPoint. Students may connect to the presentation in real time from their own devices and watch the slides advance as the instructor moves them (presentations can also be set so that students move freely through the slides, if desired). Additionally, Nearpod builds in opportunities for students to respond (and for the instructor to collect real-time data), for example by answering multiple choice questions, responding to polls, or even drawing. The instructor may, in this case, use the instructional time to lecture about a concept using multiple means of representation (e.g., verbal explanations balanced with graphic/visual depictions of concepts), which may be viewed by students either on the main projection screen or on their own devices. Students who follow along on their own devices do not need to see across the room. The opportunities for students to respond may both provide cognitive breaks and give students a chance to practice with a concept just delivered. This serves to reinforce understanding and immediately highlights for the instructor when students are struggling to understand a concept, so that he or she can re-explain it before pressing onward.

After establishing a key concept through lecture supported by the Nearpod app, the instructor may ask students to put their devices away, form small groups of four, and collaboratively seek to apply the concept to a given situation. This process of lecture, feedback, and application could be repeated throughout the lesson to establish several key concepts in multiple ways for different students. In this scenario, the technology was used (and not used) intentionally and appropriately to minimize barriers and maximize access to the learning.

Barriers and Solutions

To support instructors in applying the concepts in this report, I now turn to several common barriers faced in higher education classrooms, including and especially large lecture halls, which are often among the most difficult classrooms to manage. Recall that the first step in the process is to identify goals, which I cannot reasonably do in a document intended for use by many. Therefore, the strategies below are based solely on reducing or removing barriers, and should be taken, adapted, or left depending on whether the solutions provided here also enhance access to the learning objective in the applied context.

Barrier: Students are often disengaged in large classes

Disengagement is among the most pervasive barriers to learning in large classrooms, lecture halls, and general education courses; however, there is evidence that instructional decisions can improve engagement in such settings, often through the structured use of ICTs (e.g., Apple & Nelson, 2002; Gill, 2011; Farmer-Dougan, 2011; Hoekstra, 2008).

Addressing disengagement is sometimes difficult, as there are a multitude of definitions of engagement, ranging from affective to social to behavioral engagement (Kuh, 2009; Lawson & Lawson, 2013; Zepke, 2015). Please see Lawson and Lawson (2013) for a thorough review of these constructs. For the purpose of this report, I collapse the three types into one construct (engagement is affective, social, and behavioral) and attempt to answer the question: how can ICTs be used to improve engagement in the higher education classroom?

To improve student engagement, instructors may choose to use ICTs to improve students’ opportunities to respond (OTR), something that is generally lacking in larger, more lecture-reliant classes (e.g., Cashin, 1985; Day, 1980; Frederick, 1999; Omelicheva & Avdeyeva, 2008; Renner, 1993). Some of the strategies discussed earlier, such as Twitter (p. 10) and cloud-based response systems (p. 12), may be effective to this end. In the Twitter example below, notice a few themes for intentional technology use:

  1. The instructor identified a barrier for which ICTs could be useful and matched an excellent, research-based tool to the barrier at hand. 
  2. The instructor did not assume that her students knew how to use the technology, certainly not in an educational capacity; she took the time to offer basic training.
  3. She made it routine. The tool was used regularly and intentionally in the course. What takes a lot of time to get going the first time can become much faster and smoother with routine. 
  4. She provided boundaries (ground rules, in this case). The boundaries were necessary and helped ensure a positive experience for all who opted in. 

Twitter Example

In a large history classroom, a professor realizes that many of the 100-level students are likely to feel that the content is irrelevant to their future majors (affective disengagement). They also feel faceless and nameless in the large hall, which reduces both their sense of responsibility and their sense of personal value (social disengagement). As a result of these factors, they are prone to using their ICTs for playing games, working on homework for other classes, chatting with friends, etc. (behavioral disengagement). If left unstructured, the presence of ICTs is almost guaranteed to become a distraction.

To prevent this from happening, the instructor begins the semester by sharing a unique hashtag related to the course and term with her students, provides some basic ground rules for using the Twitter feed during and outside of class (e.g., keep it on topic, no disparaging of individuals, no hateful rhetoric), and shares a Twitter feed used by another professor as an example. She gives students a handout with instructions for joining and using Twitter, for those who are not yet connected. Finally, she uses the live Twitter feed to ask the students some sample questions to warm them up to the idea, showing the live stream on the overhead projector for the benefit of students who do not opt in.

Over the course of the semester, the instructor frequently uses Twitter intentionally: to garner questions before class, which helps shape the lecture; to ask students to sound off on questions during class; and to extend conversations about controversial and poignant events in history. To extend the tool further still, in a report on historical figures, students are given the option to create a Twitter account role-playing a historical figure (what would Sun Tzu, Abe Lincoln, FDR, or Gandhi tweet as key events in history, for which they were alive, unfolded?).
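The “boundaries” theme lends itself to a concrete illustration. The sketch below is a hypothetical helper, not drawn from the example above or from any real course, that screens a batch of already-retrieved tweets against a course hashtag and a blocklist before they are projected; the hashtag, data shape, and terms are all invented:

```python
# Hypothetical sketch of the "boundaries" idea: screen a batch of tweets
# (already pulled from the course hashtag) against basic ground rules
# before projecting them. Hashtag, data shape, and terms are invented.

COURSE_TAG = "#hist101fall"               # hypothetical course hashtag
BLOCKED_TERMS = {"badword1", "badword2"}  # stand-ins for prohibited rhetoric


def on_topic(text: str) -> bool:
    """Ground rule: keep it on topic (must carry the course hashtag)."""
    return COURSE_TAG in text.lower()


def respects_rules(text: str) -> bool:
    """Ground rule: no hateful rhetoric (no blocked terms present)."""
    return BLOCKED_TERMS.isdisjoint(text.lower().split())


def classroom_feed(tweets: list) -> list:
    """Return only the tweets safe to show on the projector."""
    return [t for t in tweets if on_topic(t) and respects_rules(t)]


sample = [
    "Why did the Marshall Plan work so well? #HIST101fall",
    "lunch was amazing today",  # off topic: filtered out
]
print(classroom_feed(sample))  # only the on-topic question survives
```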

 

This same process could be followed if the instructor wished to use Socrative, Kahoot!, ClassKick, or Pear Deck as tools to facilitate opportunities for students to respond in the large classroom. Each has its own strengths and limitations; all provide opportunities for students to respond. Students who have a voice, who can express their ideas and be heard, and who hear their names called out as the instructor draws from their ideas and questions are students who are more likely to engage affectively, socially, and behaviorally. This is a prime example of a situation where the technology provides opportunity that would not be possible in a low-tech or no-tech environment.


Barrier: Instructor doesn’t know if the students are learning

In higher education, student assessment is often relegated to a small number of major assessments (e.g., projects, papers, reports), which assess cumulative learning up to a point in time. Such assessments are broadly known as “summative” assessments and are intended to be assessment of learning, as opposed to “formative” assessments, or assessments for learning (Griffin, 2014). The distinction is important. Formative assessments take place during the learning process and are worth few or no points; rather, they inform the instructor and the students themselves about the degree to which students have mastered a given concept or skill (Sadler, 1989). Left unchecked, small misunderstandings at the beginning of a course can compound into conceptual difficulties, which ultimately may lead to poor summative performance; had students been given feedback early, they might have achieved greater success. According to Black and Wiliam (2010), formative assessments are useful because they allow:

  • instructors to make adjustments to teaching and learning in response to assessment evidence;
  • students to receive feedback about their learning with advice on what they can do to improve; and 
  • students to participate in the process through self-assessment.

Socrative Example

In an introductory calculus lecture, the instructor recognizes that concepts build upon one another. Failure to grasp foundational ideas will ultimately lead to increasing failure to understand each succeeding advanced concept. If the class were smaller, the instructor would collect, grade, and provide substantive feedback on each key point across the term. As it is, such marking and feedback seems impossible.

To address this problem, the instructor decides to develop a routine of using Socrative in the classroom. He takes 10 minutes to set up an account and class on Socrative and invites the students to join through their school email addresses. Early in the course, the instructor walks the students through the use of Socrative and provides resources in case they have questions or need to review how to use it (though it’s very simple). As part of class preparation, the instructor reviews the main points in the lecture, thinks about how they could be converted into quick questions, and saves a few to the question bank so that they are ready to launch as soon as he is ready. Sometimes, he also prepares an exit ticket that calls for the students to reflect on what they learned.

In class, the instructor makes a habit of having the students respond to a question or set of questions at appropriate times (usually every 20 minutes, or whenever a point has been established) as a way to gauge understanding. The visual data makes it very easy to see how well students have understood the concept. The instructor establishes further routines for different situations. When the vast majority of students are correct, he offers a brief explanation for the sake of those who missed it and moves on. When a more substantial number of students are wrong, he pauses to take questions and/or explains the concept in another way before progressing. In this way, he nips misunderstandings in the bud and improves the outcomes for everyone.
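The instructor’s routine is essentially a threshold rule, which the following sketch makes explicit. The 80% cutoff and the sample responses are illustrative assumptions, not figures from the example:

```python
# A sketch of the decision routine above as an explicit threshold rule.
# The 80% cutoff and sample answers are illustrative assumptions.
from collections import Counter


def next_move(responses: list, correct: str, threshold: float = 0.80) -> str:
    """Choose a teaching move from the share of correct quick-question answers."""
    share = Counter(responses)[correct] / len(responses)
    if share >= threshold:
        return "Briefly explain for those who missed it, then move on."
    return "Pause for questions and re-explain the concept another way."


answers = ["B", "B", "C", "B", "A", "B", "B", "D", "B", "B"]  # hypothetical data
print(next_move(answers, correct="B"))  # 7/10 correct -> reteach
```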

In many ways, these benefits correspond well with the concept of mastery-oriented feedback (Dweck & Sorich, 1999), a construct embraced in the UDL framework. In brief, mastery-oriented feedback is that which “orients students toward mastery and that emphasizes the role of effort and practice rather than ‘intelligence’ or inherent ‘ability’” and it “is an important factor in guiding students toward successful long-term habits of mind” (National Center on UDL, para. 1). A long history of research supports this practice for increasing student outcomes, despite the fact that it is not often employed in higher education (e.g., Bandura, 1986; Craven, Marsh, & Debus, 1991; Deci & Moller, 2005; Lee & Lee, 2008; Schunk, 1983).

That being said, it is important to acknowledge that assigning formative assessments and interpreting the results becomes increasingly cumbersome as class size increases. For example, requesting written homework responses to demonstrate conceptual mastery of a point is feasible for a class of 10-20 students, but becomes unreasonable to mark and unwieldy to interpret for a class of 60, 80, or 100 students, especially on a regular basis.

Here, again, ICTs may come to the aid of the teaching and learning process and address a barrier without becoming one. As an extension of the concept of opportunity to respond (and thus sharing some of the same tools), ICTs may be used for rapid formative assessment, and applications may provide ways to instantaneously organize and process student responses. This strategy may be approached using specialized tools other than ICTs (such as TurningPoint clickers), but may also take advantage of students’ own devices. Tools such as Socrative, Kahoot!, Nearpod, and Google Forms, for example, all include opportunities to collect student data by way of responses to polls, quick questions, or exit tickets in different formats (e.g., multiple choice, true/false, short answer) and styles (e.g., game-like, poll-like, quiz-like); additionally, all of these work on laptops, tablets, or smartphones with ease. See the Socrative example above for an idea of how these tools may be effectively used.

Notice that the instructor in this example used the same best practices that the instructor in the previous example (with Twitter) did: 

  1. The instructor identified a barrier for which ICTs could be useful and matched an excellent, research-based tool to the barrier at hand. 
  2. The instructor did not assume that his students knew how to use the technology, certainly not in an educational capacity; he took the time to offer basic training and offer resources for further support as needed.
  3. He made it routine. The tool was used regularly and intentionally in the course. What takes a lot of time to get going the first time can become much faster and smoother with routine.
  4. He provided boundaries (ground rules, in this case). The structure was necessary and helped ensure a positive experience for all who opted in.  (The tool is already structured such that additional boundaries are probably not necessary; further structure could be added later, if needed). 

Barrier: There are concerns about the effectiveness of using ICTs for note taking

Some recent studies have argued that longhand note taking is superior to note taking done on a laptop or tablet (e.g., Mueller & Oppenheimer, 2014). In Mueller and Oppenheimer’s analysis, students who used laptops to take notes wrote significantly more content and had significantly more verbatim overlap with the lecture than those who took longhand notes on paper. Students who took longhand notes ultimately outperformed their laptop note-taking peers in factual and conceptual recall.

However, the findings of this study have not been consistent in the literature. Murray (2011), for example, explicitly addresses this question in her study. In the context of classes in which students were explicitly instructed as to how they ought to take notes (whether by hand or on laptops), Murray found that a substantial majority (70.5%) of participants took notes that reflected the explicated best practices (e.g., listening, processing, recoding key ideas; Suritsky & Hughes, 1991). Likewise, Chiu, Wu, and Cheng (2013) found that prompting students to summarize content in their (shared) notes resulted in improvements in recorded facts and concepts and improved test scores compared to a questioning method in a similar setup. While this study does not compare digital to handwritten notes, it corroborates the hypothesis that the explicit structures provided and encouraged for note taking, and for digital note taking specifically, do matter.

Additionally, Thomson (2009) argued that there is never going to be a one-size-fits-all best practice for note taking. Some students who struggle to listen, process, and organize simultaneously, for example, may benefit from the generally discouraged verbatim style of notes, which they may review and organize later, outside of class. Murray (2011) suggests that using ICTs may facilitate both options.

Google Docs Example

To address this problem, the instructor decides to support the students in capturing content explored in class by facilitating a shared note taking experience. The students are already clustered in groups based on their future area of teaching (e.g., elementary, high school science, deaf ed), so she decides to take advantage of these groups for shared note taking. She begins by creating a Google document for each group and shares the document with the relevant students, giving them “editing” privileges. Early in the semester, the instructor presents the students with some of the research evidence for shared note taking, explicates what she expects of quality notes (e.g., not just verbatim recording, but synthesis, summary, questions, connections...), and provides an example.

Each class, she prompts the students to use the shared notes for their group. Occasionally, she pauses and has the students discuss a concept or question and record their conclusions in their notes, or otherwise provides a chance to capture information there. Once per week (at first) and then once per month (later), she reviews each group’s notes and provides feedback and participation points. If some students prefer to take notes by hand or otherwise do not wish to participate in the shared notes, alternative provisions are made, but the majority of her students use this collaborative method and express approval and enjoyment of it, both for the current course and as an approach to use with their own future students.
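For instructors with many groups, the document creation and sharing step could be scripted. The sketch below uses the Google Drive v3 API to create one Google Doc per group and grant members edit (“writer”) access; the key file, group names, and email addresses are hypothetical placeholders, and this automation is my assumption, not part of the instructor’s workflow described above:

```python
# An assumed automation (not described in the example): create one Google
# Doc per group and grant members edit access, via the Google Drive v3 API.
# The key file, group names, and addresses are hypothetical placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # hypothetical credentials file
    scopes=["https://www.googleapis.com/auth/drive"],
)
drive = build("drive", "v3", credentials=creds)

groups = {
    "elementary": ["student1@school.edu", "student2@school.edu"],
    "hs-science": ["student3@school.edu", "student4@school.edu"],
}

for group, members in groups.items():
    # Create an empty Google Doc named for the group.
    doc = drive.files().create(
        body={"name": f"Shared notes - {group}",
              "mimeType": "application/vnd.google-apps.document"},
        fields="id",
    ).execute()
    # "Editing" privileges, as in the example above: Drive's "writer" role.
    for email in members:
        drive.permissions().create(
            fileId=doc["id"],
            body={"type": "user", "role": "writer", "emailAddress": email},
        ).execute()
```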

 

Taken together, these studies against and in favor of using ICTs for note taking reprise the debate over ICTs in general. It appears that unstructured use of ICTs for note taking (i.e., providing no instructions or supports for note taking using digital technology) leads to the negative outcomes demonstrated by Mueller and Oppenheimer (2014), whereas even a small provision of structure, such as brief direct instruction, leads to what is generally perceived to be a more effective note taking style, regardless of the medium. Thus, if the disadvantage can be neutralized with some intentionality, the question that remains is whether there are any distinct advantages of using ICTs for note taking.

Additional literature reveals significantly more possibility afforded by digital notes than by those taken by hand. Using digital devices to take notes enables students to utilize multiple media (e.g., audio, video, images) in their notes (e.g., Anzai, Funada, & Akahori, 2013; Barrs, 2011; Watfa, Rota, Wadhwani, & Kupessa, 2016), a practice which extends the concept of providing multiple means of representation (Meyer et al., 2014) from the original instruction to the students’ notes for review. Digital notes also allow students to easily rearrange, add information, insert graphics, and otherwise manipulate notes in ways that are far more restricted when notes are taken with pencil and paper.

Additionally, much has been written about the power and effectiveness of using an intentional collaborative (shared) note taking system (e.g., Bry & Pohl, 2017; Kaminski et al., 2016; Miyake & Masukawa, 2013; Reyna, 2010; Valtonen, Havu-Nuutinen, Dillon, & Vesisenaho, 2011). Shared note taking occurs when students (in small groups or as a single large group) have access to a cloud-based document in which collaboration may occur in real time (e.g., Google Docs, Evernote, Word Online). For example, in Valtonen and colleagues’ (2011) study, the authors found that when students are able to collaborate in note taking, some may record verbatim content while others offer summaries and main-point highlights, and others still raise questions or make connections, all in the same document. By thus collaborating, students are able to compile a far richer set of notes in far less time than would otherwise be possible. See the Google Docs example above for an idea of how shared note taking may be effectively used.

Notice once more that the instructor in this example used the same best practices that the instructors in the previous examples (Twitter, Socrative) did: 

  1. The instructor identified a barrier for which ICTs could be useful and matched an excellent, research-based tool to the barrier at hand. 
  2. The instructor did not assume that her students knew how to use the technology, certainly not in an educational capacity; she took the time to offer basic training and offer resources for further support as needed.
  3. She made it routine. The tool was used regularly and intentionally in the course. What takes a lot of time to get going the first time can become much faster and smoother with routine.
  4. She provided boundaries (basic instructions, feedback in this case). The structure was necessary and helped ensure a positive experience for all who opted in. 

The Intentional Intersection: Seamlessly Blending High-Tech and No-Tech Experiences

In Part II of this report, I concluded that the most important aspect of successful ICT use is intentionality. Part III has thus far focused on intentional use of specific applications and tools available through ICTs. To conclude, however, I would like to demonstrate how use may be interwoven with non-use in a seamless application. To do so, I will focus on the method of “peer instruction.”

In what is now considered a classic and widely circulated article, Chickering and Gamson's (1987) “Seven Principles for Good Practice in Undergraduate Education” included two principles that explicitly stressed pedagogies of student collaboration: “cooperation among students,” and “active learning” (the latter of which is variably defined, but generally means that students must participate and actively construct information rather than passively receive instruction). In more recent years, these two constructs have been merged into the practice of peer instruction (e.g., Caldwell, 2007; Fagen, Crouch, & Mazur, 2002; Judson & Sawada, 2002; Meltzer & Manivannan, 2002). 

Peer instruction is a method whereby students assume the role of teacher, attempting to explain concepts to their peers, which focuses attention on underlying concepts. By concluding this part of the report with a review of peer instruction, my purpose is to demonstrate the merger of intentional use and non-use of ICTs in concert. To be most effective, instructors utilizing this method will move seamlessly from data collection to social activity (generally with low or no tech). For example, Crouch (1998) applied this method in a large lecture hall to teach physics. After lecturing on an abstract concept for some time (perhaps 10 minutes), the instructor provided a concrete scenario that applied the concept being taught and asked the students to solve a simple problem, selecting the best response from among five multiple choice items. After students committed to an answer, they were told to turn to a peer and try to convince them as to why their own answer was correct (and to find a peer who disagreed if their neighbor was in agreement). After a few minutes, the instructor had the students re-submit their answers. In this anecdote, the instructor noted that while only 43% of the class was correct in the first iteration, 80% were correct after the brief peer instruction. Crouch speaks of using this method repeatedly to establish key concepts, improve engagement, and clue the instructor in on what students have and have not mastered, using these data to inform instructional decisions. Other studies have found similar positive outcomes of peer instruction such as that employed by Crouch (1998) in various settings (e.g., Fagen, Crouch, & Mazur, 2002).

Given the technological limitations of the time, Crouch’s approach to gauging correct responses among the 150 students in the lecture hall was to use color-coded flashcards, which students held up. As creative and innovative as this solution clearly was, there are significant limitations to such a low-tech approach. It may provide rough estimations of student positions, but it is difficult to rapidly interpret when responses are varied, and becomes more difficult the larger the classroom. Further, such an approach requires students to reveal their responses very publicly: something some students may feel uncomfortable doing in such a setting (especially if faculty were polling regarding sensitive issues). Finally, it would be exceedingly difficult for the instructor to meaningfully tie responses (and therefore progress and trends) to individual students for record keeping.

More recent studies (e.g., Brady, Seli, & Rosenthal, 2013; Kappers & Cutler, 2015; Mayer et al., 2009; Williams & Carvalho, 2015) have demonstrated how ICTs can bring peer instruction as a form of collaboration into the 21st century in a way that enhances the original method and mitigates the challenges of data collection and interpretation that occur with more analog approaches like that employed by Crouch (1998). Brady and colleagues explicitly compared the use of low-tech response cards (cf. Crouch) to the use of digital devices and found that the enhanced flexibility and potential of digital devices (including the ability to quickly collect and display student responses) led to practically and statistically significant improvements in academic and engagement outcomes. However, the authors corroborate the findings of others (Crouch; Mayer et al., 2009) that the use of student responses in the large lecture hall is significantly enhanced by following up with strategies that facilitate metacognition and reflection (such as peer instruction or other forms of peer dialogue using the collected data).

TurningPoint Example

In a large calculus lecture hall, required for all computer science, engineering, and physics majors, the instructor recognizes that the complexity of the concepts requires significant cognitive activity from students above and beyond listening to a lecture and taking notes. They must actively apply abstract concepts to practical applications. While it’s easy enough to assign such work as homework, the delay between when students respond and when they receive feedback severely restricts how much they learn from these experiences. To address this barrier to learning, the instructor decides to apply the peer instruction method in class. Having anticipated using this approach, the instructor took a full class early in the term to teach and practice the routine.

In practice, the instructor partially “flips” the class (moving the lecture content mostly outside of class and the hands-on practice usually done as homework mostly into class time). During class, the instructor extends the lecture (which was delivered as a podcast) and pauses at intervals to informally quiz students on practical applications of the concepts. He chooses to use TurningPoint, given its flexible features for types of student responses and data representation, building questions (and subsequent result displays) directly into his PowerPoint. When there is disagreement and/or a substantial number of incorrect responses, the instructor begins the peer instruction routine.

He has students turn to “shoulder buddies” or form small groups of 3-4 with at least one member who disagrees with at least one other member of the group, and provides five minutes for discussion and debate regarding which answer is correct and why. After the five minutes, a second opportunity to submit an answer is made via TurningPoint; the data are presented as a juxtaposition of “before” and “after” responses. Depending on the outcome, the instructor may add clarity, ask for justification from those who answered “A” or “C” (perhaps directing responses to the class Twitter feed), and address misunderstandings that remain. The learning becomes far more dialogic, despite the venue.
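The “before and after” juxtaposition at the heart of this routine can be sketched in a few lines. The code below is illustrative only; the response distributions are invented to echo Crouch’s (1998) reported shift from 43% to 80% correct:

```python
# Illustrative sketch of the "before vs. after" juxtaposition; the response
# distributions are invented to echo Crouch's (1998) 43% -> 80% shift.
from collections import Counter


def juxtapose(before: list, after: list, correct: str) -> None:
    """Print side-by-side counts per choice, plus the share correct per round."""
    b, a = Counter(before), Counter(after)
    for choice in sorted(set(before) | set(after)):
        print(f"{choice}: before={b[choice]:3d}  after={a[choice]:3d}")
    print(f"correct: {b[correct] / len(before):.0%} -> {a[correct] / len(after):.0%}")


# Hypothetical 150-student responses before and after peer discussion.
before = ["B"] * 65 + ["A"] * 40 + ["C"] * 25 + ["D"] * 20
after = ["B"] * 120 + ["A"] * 15 + ["C"] * 10 + ["D"] * 5
juxtapose(before, after, correct="B")  # prints 43% -> 80%
```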

 

Numerous web, computer, and/or mobile applications could be used to facilitate the data collection and representation so positively explored in the literature as the first step of peer instruction (see “Barrier: Instructor doesn’t know if the students are learning”), including TurningPoint (which can be fully implemented with ICTs rather than clickers, if desired). After discussion in pairs or small groups, instructors may also wish to facilitate a qualitative “sound off” from such groups. Doing so may be enhanced if tools like Twitter, Socrative, Nearpod, or, again, TurningPoint are used to collect “key takeaways” from student discussions. See the text box above for an example of how TurningPoint (with ICTs) can be merged with non-technological discussion to facilitate peer instruction.

As you read the scenario, consider: does the same basic structure seen with the use of ICTs in other situations hold? Did the instructor…

  1. Identify a barrier for which ICTs could be useful and match an appropriate tool to address that barrier?
  2. Offer basic training and resources for further support as needed?
  3. Create a routine?
  4. Provide boundaries (procedural steps and structures, in this case)?

Summary and Interim Conclusion 

There is persistent evidence, both in the literature and anecdotally, to support a systematic framework for using ICTs intentionally. This process broadly includes using backward planning to clearly identify learning objective(s), recognizing barriers that may restrict access to those objectives for diverse students, and utilizing tools and methods (including, but not limited to, ICTs) to intentionally address those barriers in order to maximize access. With further analysis, the procedure that may most effectively allow for the structured use of ICTs after identifying objectives involves four pragmatic steps:

  1. Identify a barrier for which ICTs could be useful and match an appropriate tool to address that barrier. 
  2. Offer basic training and offer resources for further support as needed.
  3. Create a routine.
  4. Provide boundaries to keep ICT use and student activity appropriately focused.

By shifting focus from the tools as an end unto themselves to the tools as a means to an end (the removal of barriers and improved access to the learning outcomes), this approach may provide guidance to support the notion of “structured use” of ICTs in the higher education classroom.

Holistic Conclusions

In this three-part report, I examined the nature and outcomes of the use of ICTs in higher education, evaluated approaches to addressing distracting use of ICTs, and articulated a framework for effective use (and non-use) of ICTs, generally and in terms of specific barriers and corresponding digital tools to address them. Each part resulted in an interim conclusion, and together these inform the position statement iterated at the beginning of this report:

  • Part I: Banning technology and allowing unrestricted, unstructured use of technology are both non-solutions, each with its own set of pragmatic and ethical problems. A middle ground is needed, and there is ample research-based evidence to suggest that such a middle ground is likely to achieve the best results in terms of student outcomes as well as student and faculty satisfaction.
  • Part II: The “middle ground” may be better articulated as “structured use” of ICTs. This means critically identifying when an ICT is the right tool for the right job at the right time, even if that sometimes means not using it at all. When a warranted technology tool calls for students to use their ICTs, instructors need to provide clear, explicit instructions and guidelines so that students can use their devices effectively to further their learning.
  • Part III: The guidelines that may most effectively structure positive use (and non-use) of ICTs include using backward planning through the identification of learning objectives, reflection on barriers, and determination of ways to address those barriers. When ICTs are to be used, provision of training, establishment of routines, and development of boundaries for use may be necessary in many cases in order to maximize benefit.

