
Composition Forum 48, Spring 2022
http://compositionforum.com/issue/48/

Synchronicity over Modality: Understanding Hybrid and Online Writing Students’ Experiences with Peer Review


Jennifer M. Cunningham, Natalie Stillman-Webb, Lyra Hilliard, and Mary K. Stewart

Abstract: This study includes interviews with 70 undergraduate students enrolled in online or hybrid first-year composition (FYC) classes at one of four universities in the United States and analyzes students’ perceptions of digital peer review. Arguing that the Community of Inquiry (CoI) Framework is a logical heuristic for examining writing studies research, this study finds that synchronicity might be more significant than modality with respect to the ways that peer review is able to achieve social, teaching, and cognitive presence. Overall, this study suggests that synchronicity is a common thread woven throughout each of the CoI presences and a potential way of alleviating negative evaluations of peer review and achieving a learning community. Data further suggest that hybrid and online students conceptualize relationships as creating a sense of community that is work-based rather than friendship-based, that students might not be aware of or able to foresee ways that peer review applies to other writing contexts or classes, and that instructors could better prepare students for peer review in classrooms and beyond.

Introduction

Peer review is foundationally accepted among writing scholars and composition instructors as important in theory and practice, given the underlying premise that writing is a social act and that peer review, as part of the revising process, produces better thinking and writing (Bruffee; Faigley and Witte; Gere; Murray). With the increasing number of hybrid and online classes, in addition to the use of tools like Google Docs, peer review continues to shift to a digital activity, irrespective of course modality. While the ways that students might accomplish peer review continue to change in terms of modality and available tools, the nature of peer review remains social and collaborative (Brammer and Rees).

With the intent of including student voices, this study consists of interviews with students at four universities across the United States, conducted to better understand their estimations of and experiences with peer review in hybrid and online first-year composition courses. Examining peer review through the Community of Inquiry (CoI) Framework provides insight into the ways that cognitive presence, social presence, and teaching presence apply directly to peer review. Additionally, since these data were collected pre-pandemic, they speak to the ways peer review was already digital while adding to future research as peer review continues to shift in a digital direction.

Peer Review

Generally understood in the context of writing studies and first-year composition (FYC) as the process by which classmates exchange texts and provide feedback about how to improve their writing, peer review is a common pedagogical activity incorporated into FYC classes (Brammer and Rees). The benefits of peer review are well researched and include improving elements of student writing (clarity, organization, focus, documentation, etc.) through a reciprocal process (Baker; Neumann and Kopcha) and learning to identify strengths and weaknesses in writing while crafting quality feedback (Stewart, Bridging Instructor Intentions). As Jason Wirtz explains, “Peer review is small group, collaborative work that is central to the writing classroom because of its emphasis on teaching writing as a process” (5). Speaking directly to the effectiveness of peer review, Kimberly M. Baker finds that students who participate in peer review make “more meaning-level changes to their drafts than surface-level changes” (188). Charlotte Brammer and Mary Rees find that students who regularly participate in peer review are “more confident in their ability to review peers’ papers” and consider the peer review process to “usually” or “always” be helpful (77). Peer review continues to be accepted as a good and necessary practice while shifting to electronic platforms and tools, based on the availability of digital devices and, thus, digital composing.

Peer Review is Already Digital

Given the ubiquity of digital devices like laptops, tablets, and smartphones, paired with increasing numbers of online and hybrid courses, students are composing texts electronically and, in many cases, peer reviewing those texts digitally, even when face-to-face. Scholars have investigated the use of peer review in different modalities. Over twenty years ago, Beth L. Hewett, one of the foundational online writing instruction (OWI) scholars, collected data from two upper-level writing courses in order to compare peer review feedback, what she called “peer talk,” between a traditional classroom and a networked computer classroom that used DOS-based software. While the study employed now-outdated software, its findings support a social constructivist view of peer review in that both groups were able to achieve “interdependence in their thinking about the world” and generate new ideas when interacting with their peers (284). Conducting a more recent comparison study, Ruie Jane Pritchard and Donna Morrow have investigated peer review among sixteen K-12 teachers, asking whether participant behaviors and feedback change when peer review is online or face-to-face. Overall, they find that “a group that works well f2f also works well online and vice-versa,” demonstrating, like Hewett, that peer review can be effective in either/both modalities (99).

While peer review can be effective in multiple modalities, Lee-Ann Kastman Breuch asks, “Does our understanding of peer review change when peer review is conducted in virtual environments?” (3). Breuch’s question is now, in many ways, even more important given the use of digital technology like Google Docs during face-to-face peer review and the shift to remote and online teaching during the COVID-19 pandemic. Breuch argues that although one might assume that peer review easily transfers from a face-to-face to a virtual environment, it does not. According to Breuch, “virtual peer review is fundamentally different in terms of practice” because peer review, which has traditionally been associated with spoken communication, is often associated with written communication in virtual environments. She calls for a revision of our understanding of virtual versus non-virtual peer review, which “is a matter of understanding differences of the environments and adjusting appropriately to those differences” (144).

Breuch also discusses the potential affordances of virtual peer review. For example, asynchronous peer review can eliminate the problem of time limits, benefit shyer students by including fewer nonverbal cues, and encourage students to offer more directive feedback (146). Both Breuch and Hewett maintain that synchronous peer review is better suited for brainstorming, while asynchronous peer review is better for close reading and detailed responses. While Breuch discusses virtual peer review as an asynchronous activity, our understanding of digital peer review is broader. Digital peer review can take place in face-to-face, synchronous online, and/or asynchronous online modalities. In that way, digital peer review can and does involve a combination of asynchronous and synchronous interactions that provide opportunities for students to construct knowledge. The affordances of digital peer review, paired with the social constructivist nature of peer review, make the Community of Inquiry Theoretical Framework a logical fit for understanding digital peer review in writing studies.

Community of Inquiry and Peer Review

The Community of Inquiry (CoI) Framework, a social constructivist model of online learning processes (Garrison and Anderson; Garrison, Anderson, and Archer), is becoming more recognized and applied among online writing instruction (OWI) scholars as a heuristic for understanding and assessing effective online and hybrid pedagogy (Stewart, Communities of Inquiry; Hilliard and Stewart; Stewart et al.). Because the CoI Framework is a validated heuristic for assessing student-centered online classes and shares many of writing studies’ theoretical assumptions, it is well suited as a heuristic from which to examine digital peer review. OWI scholars have discussed the CoI Framework as it relates to teacher feedback (Cox et al.; Grigoryan), discussion forum design (Seward), the student-teacher relationship (Dockter), tools to promote community (Cunningham), and asynchronous and synchronous interactive activities in online courses (Stewart, Communities of Inquiry); our work contributes to those conversations by employing the CoI Framework to examine peer review.

Cognitive Presence

Cognitive presence is defined as “the extent to which the participants in any particular configuration of a community of inquiry are able to construct meaning through sustained communication” (Garrison, Anderson, and Archer 89). Cognitive presence relates to knowledge co-construction, which applies to the feedback students provide during peer review and whether/how they attend to it when revising. The ability to not only provide constructive feedback but to revise one’s own writing in light of new ideas is the ultimate goal of cognitive presence realized through peer review. For example, Yu-Fen Yang and Shan-Pi Wu note the effectiveness of peer review reciprocity, in that the act of reading and responding to peers’ writing results in more extensive revisions of students’ own writing. This kind of collaboration creates critical thinking and fosters learning through application.

While cognitive presence is associated with knowledge construction, it also involves applying knowledge to new and different contexts. In that way, cognitive presence in a first-year composition (FYC) course also relates to transfer. While transfer remains difficult to study empirically, Elizabeth Wardle maintains that one of our responsibilities as writing instructors and scholars is to engage with the issue of transfer. As it relates to peer review, cognitive presence could be understood as the ability to apply writing concepts and peer review techniques to classes and contexts beyond FYC. Kathleen Blake Yancey, Liane Robertson, and Kara Taczak call for instructors to explain the purpose and applicability of writing processes and techniques more clearly, which further underscores the importance of articulating the function of peer review beyond FYC.

Social Presence

Social presence plays an important role in enabling cognitive presence. As defined by D. Randy Garrison, Terry Anderson, and Walter Archer, social presence is “the ability of participants in the CoI to project their personal characteristics into the community, thereby presenting themselves to the other participants as ‘real people’” (89). Social presence is often conflated with the idea of establishing relationships, given that Short, Williams, and Christie first introduced and defined social presence as the “degree of salience of the other person in the interaction and the consequent salience of the interpersonal relationships” (65). Traditionally understood, social presence holds that online and hybrid classes must establish a feeling of collaboration, belonging, or trust in order to create a true community of learning.

As Wirtz observes, peer review is a pedagogical endeavor that aims to create a sense of community in a writing class. It is this sense of community created through collaboration that relates directly to social presence. Suggesting a clear connection between the CoI Framework and writing studies, Jennifer M. Cunningham has investigated whether Voki avatars create a sense of social presence in online FYC course activities, finding that tools mattered less than instructor feedback and immediate interaction and that students consider avatars “less ‘real’ than directly communicating with other students ... via ... peer workshops” (45). The students seemed to be more concerned with completing required activities than establishing relationships with their peers. In her investigation of whether the CoI Survey is a valid assessment tool for writing studies, Mary K. Stewart (The Community of Inquiry Survey) has found that students are “comfortable” online but are not necessarily learning how to write as a result of interacting with peers. Stewart suggests that students might not establish the kinds of relationships that writing instructors associate with collaborative learning. These findings corroborate CoI research that differentiates between social presence as a sense of belonging and trust versus social presence as the negotiation of multiple perspectives (Armellini and De Stefani; Peacock and Cowan). The ways that students conceptualize relationships in order to collaborate effectively may differ from the ways that CoI has traditionally interpreted social presence, which has implications for the efficacy of digital peer review.

Teaching Presence

According to Garrison, Anderson, and Archer, teaching presence includes both the design and facilitation of the education experience. Investigating students’ perceptions of teaching presence via video lectures in online classes, John Paul Steele, Sarah Nicole Robertson, and B. Jean Mandernach found that students in online courses may experience a stronger sense of community if instructors include personalized, supplemental videos. Similarly, Anna Grigoryan investigated teaching presence as it relates to student preferences for text-only feedback versus combined text and audio-visual feedback, finding that most students prefer the combination and consider audio-visual feedback to be more personal.

While not using the CoI Framework or terminology, Beth Brunk-Chavez and Shawn J. Miller’s research speaks to both teaching presence and social presence in terms of collaboration in online portions of composition courses that are either hybrid or face-to-face, taught in a computer classroom. Differentiating collaboration from cooperation, they argue that collaborative learning empowers students because “the authority over both the process and the product is transferred to the groups” (n.p.). They note further that collaboration “takes several forms in composition courses including ... detailed critiques of each other’s writing.” In their surveys of six sections of composition students, they observe that students’ online “posts were thoughtful and interesting, but there is little or no indication that they were aware of their fellow classmates’ postings or even their existence.” They explain this shortcoming in what can be understood as teaching presence, writing, “unless the instructor purposefully sets out to design it, the course will lack a space for genuinely collaborative activities.” Specific to peer review, Pritchard and Morrow argue “that training students HOW to respond, whether f2f or online, is essential, and most of the rules that apply to f2f peer groups also [apply] to online response” (101). Understood through a CoI lens, these findings highlight the role of teaching presence—particularly through course design and delivery—in creating opportunities for students to collaborate, thus establishing social presence which can lead to cognitive presence.

In what follows, we examine digital peer review by applying the CoI Framework as a heuristic to make sense of the diversity of experiences reported by students.

Methods

We delivered the CoI in Writing Studies Survey, a modified CoI survey with quantitative and qualitative questions specific to writing classes, to 50 sections of hybrid and online FYC in Fall 2017 and 64 sections in Spring 2018. This research includes data from students who chose to participate in follow-up interviews. This study examines interviews with 70 students in order to determine how peer review directly relates to the three Community of Inquiry (CoI) presences. While this study is focused on perspectives of online and hybrid FYC students, our findings have broad implications for peer review, considering that peer review is often already digital regardless of course modality.

Research Sites

After obtaining IRB approval, we delivered the CoI in Writing Studies Survey to hybrid and online first-year composition (FYC) students across four public universities in the United States throughout the 2017-2018 academic year. Institution A is a four-year Mid-Atlantic university with over 12,000 undergraduate and graduate students. Institution B is a regional campus of a large, Midwestern university with an eight-campus system, with an enrollment of about 5,000 students. Institution C is a large, research university in a metropolitan area of the Rocky Mountains Region, with over 32,000 students. Institution D is a research university on the East coast with over 40,000 students. Institutions A, B, and C require a two-course composition sequence, while Institution D offers a one-course composition sequence. A variety of composition courses are represented in this study, including hybrid and online first- and second-semester composition taught by graduate students and by part-time, non-tenure-track (NTT), and tenure-track (TT) instructors. Throughout the manuscript, we will use the term “first-year composition” or “FYC” to refer to all of the courses in this study.

Data Collection

We delivered an IRB-approved CoI in Writing Studies Survey to 114 sections of online and hybrid FYC courses among the four institutions in the fall of 2017 and spring of 2018. The online courses were fully asynchronous; the hybrid courses involved one or two days of face-to-face instruction each week, with the rest of the course activities taking place online—either synchronously or asynchronously.

Students completing the survey were invited to participate in an individual follow-up interview, which is the focus of this current study. Interview questions (see Appendix) were not part of the original CoI in Writing Studies Survey and were devised to augment quantitative and qualitative data from the survey. Interviews were conducted and audio-recorded online by a researcher from an institution different from the student participant’s; they were then transcribed, with data de-identified prior to analysis.

Participant Demographics

Of the 2306 students invited to participate in the original survey, 669 students (29%) completed it, and 81 of those students chose to participate in follow-up, audio-recorded, online interviews. The following table includes demographic information for the interviewees.

Table 1. Interview Participant Demographics

Institution

22 (27%) A

5 (6%) B

14 (17%) C

40 (49%) D

Course Modality

44 (54%) Hybrid (H)

37 (46%) Online (OL)

Prior Hybrid/ Online Experience

35 (43%) none

15 (19%) 1 H/OL course

23 (28%) 2-4 H/OL courses

6 (7%) 5 or more H/OL courses

2 (3%) not indicated

n=81

As part of our confidentiality protocol, we separated the survey from the interview data, such that the interviews in this study are attached only to pseudonyms. For that reason, we are not able to report on these participants' exact demographics, but we can report on the general demographics for the survey respondents and confirm that our interviewees seemed to be a representative sample of the larger population. Sixty-three percent of respondents identified as female, 36% as male, and 1% as gender non-conforming or preferred not to say. Almost all of the respondents were ages 18-22 (91%) and reported English as their first language (89%). We regrettably did not ask about racial identity when we delivered the survey in the fall, but the students who completed the survey in the spring were asked to describe their racial identity in an open-ended question: 64% identified as Caucasian or White; 20% as Asian; 9% as Black or African American; 6% as Southeast Asian; 5% as Hispanic, Latina/o/x, or Mexican; 3% as bi- or multi-racial; 2% as Middle Eastern; and 1% as Indian. We also asked the spring respondents about their course load and first-generation status and found that 94% were enrolled full time and 18% were the first in their families to attend college.

Analysis

To analyze the interviews, all excerpts related to peer review were uploaded to Dedoose, an application that can be used collaboratively for analyzing qualitative data. One researcher engaged in open and axial coding of excerpts, compiling a list of initial core categories related to peer review: Structure/Design, Evaluation, and No Peer Review. This researcher also identified initial subcategories that were discussed and revised with the rest of the team. Two researchers negotiated the final subcategories and used this coding scheme to re-code the entire dataset individually. After recoding, they noted all disagreements and negotiated remaining differences. Overall agreement, the percentage of agreements between the two raters with regard to entire excerpts, was .78. The following results report on the frequency with which the students discussed a particular topic.
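For readers unfamiliar with this reliability measure, overall agreement is simply the proportion of excerpts for which the two raters applied the same codes. The following is a minimal sketch of that calculation, using hypothetical code names and made-up data; it is not the authors' actual Dedoose workflow:

```python
def overall_agreement(rater_a, rater_b):
    """Proportion of excerpts where two raters assigned identical code sets.

    rater_a, rater_b: lists of code collections, one per excerpt,
    in the same excerpt order.
    """
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must code the same set of excerpts")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if set(a) == set(b))
    return matches / len(rater_a)

# Hypothetical coding of four excerpts (subcategory names for illustration):
a = [{"Modality"}, {"Unhelpful Design"}, {"Insufficient Feedback"}, {"Relationships"}]
b = [{"Modality"}, {"Unhelpful Design"}, {"Too Polite"}, {"Relationships"}]

print(overall_agreement(a, b))  # 0.75 (raters agree on 3 of 4 excerpts)
```

An overall agreement of .78, as reported above, would thus mean the two raters applied identical codes to 78% of the excerpts before negotiating the remaining disagreements.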

Limitations

While we mitigated potential negative effects on validity by ensuring confidentiality via pseudonyms and anonymizing data, we also recognize the potential impact of social desirability because students speaking with professors know that, for us, the “correct” response might be that peer review is important or helpful. Also worth noting is that interviewees’ comments about peer review occurred organically, and questions directly related to peer review were not included among the interview questions.

Results

A total of 81 FYC students participated in audio-recorded, online interviews, with 44 (54%) enrolled in a hybrid course and 37 (46%) enrolled in a fully online course. When asked questions about how they learned about writing by interacting with their classmates, the majority of students responded by discussing peer review. Among all 81 participants, 70 (86%) discussed peer review in some sort of evaluative way as helpful or unhelpful, and 62 (77%) participants discussed the structure or design of their peer review. Only 5 (6%) participants did not discuss peer review. The 70 students who discussed peer review evaluatively are the focus of this research.

While comments related to the Structure/Design category demonstrate the different techniques associated with incorporating peer review, there are so many ways that instructors include peer review and that students participate in peer review that parsing out or generalizing hybrid and online experiences is beyond the scope of what is possible with these data. Instead, we have chosen to take an approach that looks beyond tools or modalities and uses a heuristic (i.e., CoI) to make sense of the diversity of experiences that these students report. Subsequently, we focus on the Evaluation category, which provides insights into students’ experiences and perceptions of digital peer review in relation to CoI.

Evaluative Student Comments about Peer Review

In total, we identified 11 subcategories associated with the evaluative comments made by 70 student participants: Helpful in General, Relationships (Established and Not Established), Produces Better Writing, Insufficient Feedback, Different Perspective, Design (Helpful and Unhelpful), Modality, Peers’ Essays as Examples, Reciprocity, Honest Feedback, and Instructor Feedback. While 63% of the total comments provide information related to students’ positive evaluations of peer review (e.g., that peer review was generally helpful, resulted in better writing after revising, and provided opportunities to view peer essays as models), the subcategories comprising Positive Comments replicate previous findings regarding students’ positive perceptions of peer review (Stewart, Bridging Instructor Intentions; Brammer and Rees). The 37% of total comments that include negative evaluations of peer review shed more light on how peer review can directly relate to yet fall short of achieving a community of inquiry. Therefore, we chose to further examine Negative Comments as they relate to the CoI Framework, which helps us find constructive recommendations for how to improve when designing, introducing, and implementing peer review. Thus, rather than include all subcategories, we organize our discussion according to four specific subcategories that offer more insight into each of the three CoI presences and students’ negative evaluations of peer review: Modality, which applies to all three presences; Design, which speaks to teaching presence; Relationships, which aligns with social presence; and Insufficient Feedback, which offers insights into cognitive presence. The following table includes those subcategories associated with Negative Evaluations; the number and percentage of participants (out of the 70 who discussed peer review evaluatively) who discussed each subcategory; the description or definition of each subcategory; and example excerpts from each subcategory.

Table 2. Peer Review Negative Evaluation Subcategories, Frequencies, Descriptions, & Examples

Subcategory

Description

Example


ALL PRESENCES

Modality

n=16 (23%)

Discussed whether peer review is more effective online or face-to-face.

Prefer f2f (n=9; 6O, 3H)

“Because you read the feedback on the screen, it’s kind of cold, and you sometimes feel like they’re being mean about your paper. I’m sure if it was in person it wouldn’t sound—you wouldn’t feel as defensive as when you read it online”

Prefer OL (n=7; 5H, 2O)

“I feel like the lack of face-to-face interaction when you’re peer reviewing someone else’s essay makes it easier to judge other people’s essays without, like, making them feel bad about their work or get a bad review for their essay”


COGNITIVE PRESENCE

Insufficient Feedback

n=23 (33%)

Discussed ways that peer feedback was insufficient, too polite, or uncritical in some way. This included the additional child codes:

Peers Not Invested - Indicated that feedback was insufficient because their peers didn't care about peer review or take the assignment seriously.

Too Polite or Uncritical - Indicated that feedback was insufficient because peers were too nice or polite and not critical or helpful with their feedback, including giving vague or general comments.

Peers Lack Writing Experience - Indicated that feedback was insufficient because peers were inexperienced writers.

No Instructions - Indicated that feedback was insufficient because the instructor did not provide any directions or instructions about how to provide good feedback.

Peers Not Invested (n=16)

“People look at [peer review] as just an assignment to get over with and don’t really take time to read”

Too Polite or Uncritical (n=10)

“When you’re writing a paper, you don’t just want to hear, ‘Oh, this is a great paper, and I like what you did with this.’ If you’re going to change anything, from the rough draft to the final draft, you need some more criticism than that”

Peers Lack Writing Experience (n=8)

“I’ve been writing a lot longer than they have, a good five years longer than they have. So their remarks weren’t necessarily as pertinent as my instructor’s”

No Instructions (n=3)

“They usually weren’t that helpful [...] there was no requirement about what you had to do when you peer reviewed”


SOCIAL PRESENCE

Relationships Not Established

n=16 (23%)

Discussed that peer review was unhelpful in creating or maintaining relationships with peers or creating a sense of community.

“Interaction was weird because I would, like, give people comments on their paper and then see them in class the next day and no one would talk to each other”


TEACHING PRESENCE

Unhelpful Design

n=14 (20%)

Evaluated the overall peer review design or way peer review was conducted in their class.

“A lot of my peer reviewers weren’t even in my same section [...] they weren’t really my classmates”

n=70

Modality consists of comments from 23% of the 70 participants who discussed whether they prefer peer review sessions that are face-to-face (n=9) or online (n=7). Some students asserted that online peer review was impersonal and that their peers’ comments could come across as harsh, while others opined that it was easier to provide constructive criticism at a distance. For example, John explained, “I mean because you read the feedback on the screen it’s kind of cold and you sometimes feel like they’re being mean about your paper. I’m sure if it was in person it wouldn’t sound—you wouldn’t feel as defensive as when you read it online.” Several students—enrolled in both online and hybrid sections—commented on the usefulness of discussions and debriefing conversations between writers and reviewers that are possible with face-to-face sessions yet lacking online. Sarah, for instance, described “an in-person class where everybody is contributing to the discussion and then, hopefully, we all walk away with a new idea for our papers or we think about something differently. I think it’s a little bit more difficult in an online setting than in the in-person class.” Interestingly, those students criticizing the lack of interactivity in online peer review focused solely on asynchronous feedback and not on synchronous digital workshops, which did not seem to occur in these participants’ online or hybrid classes. This aspect of Modality and the ways that synchronicity applies to each of the CoI presences will be a focus throughout our discussion.

Insufficient Feedback includes 33% of the 70 participants who discussed peer review evaluatively. Students often asserted that their peers were not invested in the peer review process (n=16). As Kayla stated, her peers “look at it as just an assignment to get over with and don’t really take time to read.” Others expressed that the feedback was too polite or uncritical (n=10). A few noted that their peers lacked writing experience and thus were incapable of providing quality feedback (n=8). And some placed responsibility on the instructor, noting that the instructor did not provide directions or instructions sufficient to enable quality feedback (n=3).

Relationships Not Established includes 23% (16) of the 70 participants. Some students associated the lack of success during their peer review with relationships and trust, like Kadijah, who said, “If I knew them and they knew where I was coming from, I would have trusted [peer review], but I have no idea who those people were, so I didn’t trust it that much.” More students, however, were not concerned with whether they established relationships with their group members in order to provide feedback. Fatima, for example, explained that the success of peer review relied very little on whether she established relationships (or, more directly, friendships) with her group members, saying, “Like, I might, if I got their peer review I would, like, I would know their name, but if I pass them on campus I wouldn’t be, like, ‘Oh, hi, John.’”

Design includes 26% of the 70 students evaluating the overall design of their peer review, with the majority (14 out of 18) indicating that the design of their peer review was lacking or problematic. Related directly to teaching presence, comments coded as Unhelpful Design demonstrate opportunities for instructors to rethink the ways they organize peer review in their classes. For example, Justin said, “I think in that way if we had an established—if there was an established framework already for us being used to talking and working together, it would make the peer review process a little better.”

In what follows, we explore in more qualitative depth the primary negative evaluations of peer review that became evident in our coding and how those evaluations relate to each of the CoI presences and to synchronicity.

Negative Evaluations of Peer Review and CoI

Notably, participants’ negative evaluations of peer review relate more to synchronous versus asynchronous design than to hybrid versus online modality. Taken together, our data suggest that synchronicity is a common thread woven throughout each of the CoI presences—cognitive, social, and teaching—and a potential means of addressing negative evaluations in an attempt to achieve a learning community through peer review.

Cognitive Presence: Insufficient Feedback

A clear pattern of evaluation emerged among discussions of insufficient feedback. Although half of the 70 interviewees were students in hybrid classes, the majority of students who discussed insufficient feedback (22 out of 34, or 65%) were in asynchronous online classes. Sixteen of those 22 asynchronous online students mentioned that their group members were not invested in the peer review. Lauren, for example, said, “A lot of times, people would just get through it so they would get the participation grade of peer reviewing someone else’s essay. I never felt like I got constructive feedback.” While Breuch suggests that asynchronous peer review encourages students to offer more directive feedback, this might not have occurred among these participants.

Pritchard and Morrow argue that hybrid courses might be most beneficial to writers, and our responses related to Insufficient Feedback could reiterate that claim, although we suggest interpreting this finding through a lens of synchronicity. Similar to the data around uninvested peers, of the eight participants who mentioned that their peers lacked writing experience, six were in online classes, which had no synchronous component. It may be that the students who were most positive about peer review were hybrid students whose instructors designed peer review to integrate both face-to-face and online modalities, which, in these hybrid classes, tended to translate to synchronous and asynchronous interaction, respectively.

Students working fully asynchronously (i.e., online in this study) also mentioned more often than students with a synchronous component (i.e., hybrid in this study) that feedback was insufficient because it was too polite or uncritical (n=7) and that their peers lacked writing experience (n=6). Forty-three percent of participants who mentioned insufficient feedback articulated that their peers lacked writing experience. John explained this shortcoming, saying, “When you’re writing a paper, you don’t just want to hear, ‘Oh, this is a great paper, and I like what you did with this.’ If you’re going to change anything, from the rough draft to the final draft, you need some more criticism than that.” Violet noted, “I just noticed there was a lot of difference between writing capabilities, I feel like. So put those students into a peer review situation, maybe they’re not bringing everything to the table that other people are.” She continued, “I actually ended up submitting my second paper into the writing lab and got more in-depth feedback from that.” A couple of students closer to graduation felt their peer reviewers lacked experience with writing. Catherine, for example, said, “I’ve been writing a lot longer than they have, a good five years longer than they have, so their remarks weren’t necessarily as pertinent as my instructor’s.” At other times this lack of expertise was discipline specific, as when Nicole noted that her peer reviewers lacked knowledge of the engineering topics she was writing about. This lack of investment or experience might be somewhat mitigated by offering students opportunities to work together synchronously to establish relationships and better understand each other’s positionality, which speaks to social presence.

Social Presence: Conceptualization of Relationships

When these students mentioned modality, we noticed a difference in how online and hybrid students conceptualized relationships, which points to ways that FYC students have different expectations related to social presence in their courses. While some hybrid students expressed that getting to know classmates through peer review gave them a level of comfort in the class by facilitating relationships, the perceived presence or absence of peer relationships tended not to influence the success of peer review feedback. Perhaps most pointedly, Sanna said, “I don’t really talk to them outside of class, but I still do talk to them online [through our LMS]; we have to respond to each other and talk and stuff. Then we actually talk about our topics together. It’s not friendship, I guess, but it’s like a coworker.”

When hybrid students critiqued the lack of relationships with their peers, it was often within the context of face-to-face class sessions, where they wished they had more opportunities to communicate synchronously. Dylan, for example, mentioned that he wanted more time to establish relationships during face-to-face sessions: “I wish I had more time. Because we did a lot of the peer review and that sort of stuff online, a lot of the time in class had to spend with just lecturing. We just didn’t really seem to have time in class to get to know one another. I’m sure that there are ways that we could have, but we just didn’t.” Kiana echoed this sentiment, saying, “Interaction was weird because I would, like, give people comments on their paper and then see them in class the next day and no one would talk to each other.” Fully online students also commented on not having relationships with classmates—like Amy B, who commented matter-of-factly of her classmates, “I wouldn’t say I’m that connected with them,” or Farrah, who noted of her interactions with peers, “It’s only if I have to reply to their draft, or something.” However, the online students did not express the same regret as students with synchronous opportunities about not getting to know peers better, nor did they evince an expectation of relationships as part of the asynchronous course.

Hybrid students discussed relationships more overall than online students, pointing to the implicit association of face-to-face classes with relationships, which further speaks to synchronicity. In other words, hybrid students were more easily afforded the opportunity to collaborate synchronously—however briefly—even if their peer reviews were conducted mostly asynchronously online. Hybrid students were not only more likely to mention the establishment of relationships in connection with peer review, but they also appeared more likely to have expectations of relationships with their peers that went unmet.

Huahui Zhao, Kirk P. H. Sullivan, and Ingmarie Mellenius, who studied asynchronous online peer review, maintain that “a significant degree of social presence” is necessary to support cognitive development “because social presence develops learners’ awareness of each other’s existence and contributions” (808). This assertion echoes Anne Jelfs and Denise Whitelock’s conclusion that online learning performance improves when “a strong sense of social presence” exists, with social presence creating “a strong sense of physical presence, promot[ing] a feeling of teamwork and [leading] to effective collaboration” (806). Our study suggests that the type of social presence necessary for peer review that results in cognitive presence might require some synchronicity to encourage thinking about and valuing relationships and community. This relates directly to teaching presence and the kinds of peer review design available to instructors when incorporating peer review in their courses.

Teaching Presence: Peer Review Design

Our data indicate that instructors could better prepare students for peer review. For a quarter (n=18) of the 70 students, peer review was not designed in a way that supported their success. For some, instructions for using technology to exchange texts were unclear or missing; for others, expectations for peer review were not clearly communicated, making peer review come across as a low priority or an afterthought in the course curriculum.

In addition, three students mentioned that they did not receive any instructions or directions related to how to provide feedback during their peer review. While two of those students were in fully online classes and one was in a hybrid class, all were referencing asynchronous, online peer review in their comments. Thomas, for example, explained, “... [our instructor] didn’t really go into how to do a good peer review and so a lot of comments were kind of like just discussion comments of just highlighting a bit of text and saying, ‘Hey, I like this.’” Pete explained that the first peer review session did not work but that the second was more successful: “I don’t know if that’s because [our instructor] made a post saying the peer review, this is explicitly how you do it this time, but the first time it was not a great thing.” Although more data are needed to draw a confident conclusion, these findings suggest that we can do more to model peer review processes and effective feedback; this may be particularly important when the peer review is asynchronous, but it likely applies to all classes regardless of modality. As Pritchard and Morrow argue, teaching students how to provide feedback “is essential” whether face-to-face or online (101). And, as Bedore and O’Sullivan have pointed out, writing instructors have a lot of work to do “in helping students understand” that peer review is “collaborative learning” and not “proofreading” (79).

Perhaps providing students with opportunities to work synchronously with peers would prompt more attention to the activity of peer review and inspire more robust feedback. As Scott Tunison and Brian Noonan suggest, students might find it “difficult to communicate complex ideas in an online environment, with their ability to question and comprehend detailed explanations limited by the lack of face-to-face interaction” (qtd. in Wilson, Diao, and Huang 18). This relates directly to teaching presence in that providing these synchronous opportunities in online courses is a matter of course organization and design—teaching presence can create an opportunity for social presence (i.e., a synchronous meeting) that can lead to cognitive presence (i.e., critical, thoughtful feedback). Similar to what Stewart (Cognitive Presence) found when studying whether peer interaction facilitated knowledge construction in FYC courses, for cognitive presence to be achieved “teaching presence need[s] to more directly guide students toward a social presence that support[s] cognitive presence” (n.p.). Also worth noting, however, is that many online instructors do not have a synchronous option and are required by their universities to design fully asynchronous online classes. The students in this study who were enrolled in hybrid courses were, by design, offered opportunities to work in multiple modalities and synchronicities, whereas the online students often were only able to work online and asynchronously. This lack of synchronicity could be one reason why peer review in online classes received more negative evaluations than in hybrid classes. Following the move to remote instruction during the pandemic, when synchronous online instruction became nearly ubiquitous and generally accepted, we are optimistic that synchronous options will remain in place, giving instructors more agency and options with regard to synchronicity when designing their courses; this study might serve to support the use of synchronous peer review sessions in otherwise asynchronous classes.

Conclusion

Overall, this study finds that synchronicity matters more than modality in terms of the ways that peer review is able to achieve cognitive, social, and teaching presence. Our hope is that following the technological adaptations during the pandemic, institutions of higher education will allow instructors to have more agency regarding the kinds of pedagogies available when teaching online. Specifically, we are optimistic that more fully online courses will be able to incorporate opportunities for both asynchronous and synchronous interaction.

When considering cognitive presence, this study finds that adding a synchronous component to otherwise asynchronous peer review might encourage more constructive feedback. Providing students an opportunity to work asynchronously affords them more time to construct and reflect on their feedback, whereas providing an opportunity for synchronous interaction affords them the ability to discuss and clarify their critiques. We therefore recommend designing peer review that includes both synchronous and asynchronous interactions, whether the course is hybrid, online, or face-to-face.

We also recommend that future research investigate the relationship between cognitive presence and transfer. We originally anticipated including a coding subcategory “Useful Beyond FYC” whenever students discussed how their peer review experience applied beyond their composition course, but we ultimately eliminated that subcategory after identifying only one comment with that code. Of course, we did not ask a question of interviewees related to transfer, nor were we conducting, as Wardle did, a more longitudinal study that would follow up with them later. Since one of our responsibilities, as FYC instructors, is to articulate this transferability to students to help them understand the generalizability and applicability of peer review beyond FYC, we advocate for additional research to ascertain whether students intend to and/or do apply peer review techniques in other classes and contexts.

As instructors plan for social presence, providing opportunities for students to debrief synchronously with peer review group members might also create the sense of collaboration that could otherwise be lacking. That more hybrid students than online students discussed relationships in this study speaks more to synchronicity than to modality, not only in terms of the implicit association of face-to-face classes with relationships and collaboration, but also in terms of the opportunities for synchronous interaction afforded to hybrid students more readily and consistently than to online students. With that in mind, we encourage instructors to consider that students can create a sense of community that fosters relationships, not necessarily of friendship but of collaboration and critique.

In building teaching presence, we recommend that instructors provide clear instructions and model peer review practices. Instructors can provide specific guidelines for peer review, including guiding questions or instructions about which elements of writing to focus on, as well as suggestions for commenting, whether in marginal annotations of their peers’ texts or in a paragraph response. Instructors can also provide students with models of peer review, both in the form of effective feedback from past students (shared with permission) and the instructor’s own formative feedback on student drafts. Because technology can be a barrier for some students, technology tutorials can be included with the peer review assignment; for example, instructions on how to compose a new comment in a Google Doc or how to access an assigned peer’s draft within the learning management system. Moreover, we suggest that instructors not assume one mode of interaction is best and, instead, consider designing opportunities for students to interact both asynchronously and synchronously during peer review.

This work, in some ways, is a snapshot of pre-pandemic online and hybrid teaching, since many institutions are now experimenting with different models, some of which might include more synchronous peer review in online classes. Future studies should examine the structure and design of peer review across multiple course modalities. While our study found many different strategies employed by first-year writing instructors to carry out peer review, we did not find definitive trends of students declaring a particular modality or tool more successful than another. It is useful for instructors teaching in a growing number of modalities to know that peer review can be implemented in a number of ways. Our responsibility as writing instructors is to reconceptualize the kinds of community-building necessary for successful digital peer review, to better prepare students for participating in peer review, and to help students understand the applicability of peer review beyond FYC; designing peer review that offers both asynchronous and synchronous interaction is an advisable strategy for achieving those goals.

Acknowledgements: This project was funded by a Conference on College Composition and Communication (CCCC) 2018 Emergent Researcher Grant as well as internal grants from Kent State University and Indiana University of Pennsylvania.

Appendix

CoI in Writing Studies Survey Follow-Up Student Interview Questions

  1. Background Questions

    1. Have you taken blended or online courses before?

    2. Why did you choose to take this course online/blended?

    3. What is your typical week like as an online/blended student?

  2. Teaching Presence Questions

    1. Tell me a little bit about your instructor.

    2. How do they participate in the course?

    3. How often do they direct you to interact with classmates?

  3. Social Presence Questions

    1. How do you interact with classmates?

    2. How would you describe your relationship with your classmates?

    3. What is the purpose of interacting with your classmates? Why do it?

  4. Cognitive Presence Questions

    1. How would you describe what you are learning in this course? What are you getting out of it?

    2. How does your instructor support your learning in the course?

    3. How do your classmates support your learning in the course?

End-of-interview question: What motivated you to complete the survey and volunteer for the interview?

Works Cited

Armellini, Alejandro, and Magdalena De Stefani. Social Presence in the 21st Century: An Adjustment to the Community of Inquiry Framework. British Journal of Educational Technology, vol. 47, no. 6, 2016, pp. 1202-1216.

Baker, Kimberly M. Peer Review as a Strategy for Improving Students’ Writing Process. Active Learning in Higher Education, vol. 17, no. 3, 2016, pp. 179-192.

Bedore, Pamela, and Brian O'Sullivan. Addressing Instructor Ambivalence about Peer Review and Self-Assessment. Writing Program Administration, vol. 34, no. 2, 2011, pp. 11-36.

Brammer, Charlotte, and Mary Rees. Peer Review from the Students’ Perspective: Invaluable or Invalid? Composition Studies, vol. 35, no. 2, 2007, pp. 71-85.

Breuch, Lee-Ann Kastman. Enhancing Online Collaboration: Virtual Peer Review in the Writing Classroom. Online Education: Global Questions, Local Answers, edited by Kelli Cargile Cook and Keith Grant-Davie, 2005, Baywood Publishing, 2017, pp. 141-156.

Breuch, Lee-Ann Kastman. Virtual Peer Review: Teaching and Learning about Writing in Online Environments. SUNY Press, 2004.

Bruffee, Kenneth A. Collaborative Learning and the ‘Conversation of Mankind.’ College English, vol. 46, no. 7, 1984, pp. 635-52.

Brunk-Chavez, Beth, and Shawn J. Miller. Decentered, Disconnected, and Digitized: The Importance of Shared Space. Kairos: A Journal for Teachers of Writing in Webbed Environments, vol. 11, no. 2, 2007.

Cox, Stephanie, Jennifer Black, Jill Heney, and Melissa Keith. Promoting Teacher Presence: Strategies for Effective and Efficient Feedback to Student Writing Online. Teaching English in the Two-Year College, vol. 42, no. 4, 2015, pp. 376-391.

Cunningham, Jennifer M. Mechanizing People and Pedagogy: Establishing Social Presence in the Online Classroom. Online Learning, vol. 19, no. 3, 2015.

Dockter, Jason. The Problem of Teaching Presence in Transactional Theories of Distance Education. Computers and Composition, vol. 40, 2016, pp. 73-86.

Faigley, Lester, and Stephen Witte. Analyzing Revision. College Composition and Communication, vol. 32, no. 4, 1981, pp. 400-414.

Garrison, D. Randy. E-learning in the 21st Century: A Framework for Research and Practice. Routledge, 2003.

Garrison, D. Randy, Terry Anderson, and Walter Archer. Critical Inquiry in a Text-Based Environment: Computer Conferencing in Higher Education. The Internet and Higher Education, vol. 2, no. 2-3, 1999, pp. 87-105.

Garrison, D. Randy, Terry Anderson, and Walter Archer. Critical Thinking, Cognitive Presence, and Computer Conferencing in Distance Education. American Journal of Distance Education, vol. 15, no. 1, 2001, pp. 7-23.

Gere, Anne Ruggles. Writing Groups: History, Theory, and Implications. Southern Illinois University Press, 1987.

Grigoryan, Anna. Audiovisual Commentary as a Way to Reduce Transactional Distance and Increase Teaching Presence in Online Writing Instruction: Student Perceptions and Preferences. Journal of Response to Writing, vol. 3, no. 1, 2017, pp. 83-128.

Hewett, Beth L. Characteristics of Interactive Oral and Computer-Mediated Peer Group Talk and Its Influence on Revision. Computers and Composition, vol. 17, 2000, pp. 265-288.

Hilliard, Lyra, and Mary K. Stewart. Time Well Spent: Creating a Community of Inquiry in Blended First-year Composition Courses. The Internet and Higher Education, vol. 41, 2019, pp. 11-24.

Jelfs, Anne, and Denise Whitelock. The Notion of Presence in Virtual Learning Environments: What Makes the Environment ‘Real.’ British Journal of Educational Technology, vol. 31, no. 2, 2000, pp. 145-152.

Murray, Donald M. Making Meaning Clear: The Logic of Revision. The Journal of Basic Writing, vol. 3, no. 3, 1981, pp. 33-40.

Neumann, Kalianne L., and Theodore J. Kopcha. Using Google Docs for Peer-then-Teacher Review on Middle School Students’ Writing. Computers and Composition, vol. 54, 2019, pp. 1-16.

Peacock, Susi, and John Cowan. Promoting a Sense of Belonging in Online Learning Communities of Inquiry. Online Learning, vol. 23, no. 2, 2019, pp. 67-81.

Pritchard, Ruie Jane, and Donna Morrow. Comparison of Online and Face-to-Face Peer Review of Writing. Computers and Composition, vol. 46, 2017, pp. 87-103.

Seward, Dan E. Orchestrated Online Conversation: Designing Asynchronous Discussion Boards for Interactive, Incremental, and Communal Literacy Development in First-Year College Writing. ROLE: Research in Online Literacy Education, vol. 1, no. 1, 2018.

Short, John E., Ederyn Williams, and Bruce Christie. The Social Psychology of Telecommunications. Wiley, 1976.

Steele, John Paul, Sarah Nicole Robertson, and Jean B. Mandernach. Fostering First-Year Students' Perceptions of Teacher Presence in the Online Classroom via Video Lectures. Journal of the First-Year Experience & Students in Transition, vol. 29, no. 2, 2017, pp. 79-92.

Stewart, Mary K., Jennifer M. Cunningham, Lyra Hilliard, and Natalie Stillman-Webb. How and What Students Learn in Hybrid and Online FYC: A Multi-Institutional Survey Study of Student Perceptions. College Composition and Communication, 2022, forthcoming.

Stewart, Mary K. Bridging Instructor Intentions and Student Experiences: Constructing Quality Feedback, Evaluating Writing Features, and Facilitating Peer Trust as Goals of Peer Review. Journal of Response to Writing, vol. 5, no. 2, 2019, pp. 72-102.

Stewart, Mary K. The Community of Inquiry Survey: An Instrument for Assessing Online Writing Courses. Computers and Composition, vol. 52, 2019, pp. 37-52.

Stewart, Mary K. Communities of Inquiry: A Heuristic for Designing and Assessing Interactive Learning Activities in Technology-Mediated FYC. Computers and Composition, vol. 45, 2017, pp. 67-84.

Stewart, Mary K. Cognitive Presence in FYC: Collaborative Learning That Supports Individual Authoring. Composition Forum, vol. 38, 2018.

Tunison, Scott, and Brian Noonan. On-line Learning: Secondary Students’ First Experience. Canadian Journal of Education, vol. 26, no. 4, 2001, pp. 495-511.

Wardle, Elizabeth. Understanding ‘Transfer’ from FYC: Preliminary Results of a Longitudinal Study. Writing Program Administration, vol. 31, no. 1-2, 2007, pp. 65-85.

Wilson, Michael John, Ming Ming Diao, and Leon Huang. ‘I’m not here to learn how to mark someone else’s stuff’: An Investigation of an Online Peer-to-Peer Review Workshop Tool. Assessment & Evaluation in Higher Education, vol. 40, no. 1, 2015, pp. 15-32.

Wirtz, Jason. Writing Courses Live and Die by the Quality of Peer Review. Collaborative Learning and Writing: Essays on Using Small Groups in Teaching English and Composition, edited by Kathleen M. Hunzer, McFarland, 2011, pp. 5-16.

Yancey, Kathleen Blake, Liane Robertson, and Kara Taczak. Writing Across Contexts: Transfer, Composition, and Sites of Writing. Utah State UP, 2014.

Yang, Yu-Fen, and Shan-Pi Wu. A Collective Case Study of Online Interaction Patterns in Text Revisions. Educational Technology and Society, vol. 14, no. 2, 2011, pp. 1-15.

Zhao, Huahui, Kirk P. H. Sullivan, and Ingmarie Mellenius. Participation, Interaction and Social Presence: An Exploratory Study of Collaboration in Online Peer Review Groups. British Journal of Educational Technology, vol. 45, no. 5, 2014, pp. 807-819.
