
Composition Forum 16, Fall 2006
http://compositionforum.com/issue/16/

Embracing the Exit: Assessment, Trust, and the Teaching of Writing


Joseph Eng

Introduction

In the summer of 2002, my e-mail post on the WPA-listserv inquiring about published overviews of exit assessment practices at different colleges and universities yielded little response, suggesting that perhaps no such documentation existed. A search of CompPile did not turn up any record either. At Richard Haswell’s suggestion, I decided to conduct a brief survey on the list and then, based on the initial responses, follow up by contacting WPAs directly for more specifics about their practices, locations, and histories. Seeking to be better informed for our mid-size composition program in eastern Washington, I was interested in situated practices: essentially, how something has been done, is being done, and why. More specifically, I wanted to implement an exit writing portfolio and help new teachers understand the context of writing assessment from the very beginning, at the pre-fall teaching workshop. I got much more than I expected.

Among the responses to the three-question survey (Appendix A), one returned e-mail stood out from the rest; answering “no” to all three questions, this program coordinator (a nationally recognized, well-published figure) added, “we at the U. of —— trust our teachers and their teaching” (italics mine). The implication was that those of us interested in exit practices may not trust our colleagues. I sank into deep thought, revisiting my own assessment experiences and the “thorniness” my previous chairs and deans had mentioned. Has the assessment of writing instruction remained the same thorny, political, and politicized issue after two decades of research and discussion? Or, to be more focused, is an exit measure ever possible among different teachers and classes of teachers? Is an exit exam, while required at many institutions and in many states, ever politics-free? Naively, I asked: could we ever embrace exit exams? Needless to say, I took the unsolicited survey comment seriously.

In the beginning, I was interested in practices at other institutions as I helped our program transition from a single timed essay to a multiple-essay writing portfolio system. Because of the comment above, the need to balance program uniformity against teacher autonomy quickly came to mind.

I had more questions to ask. How are these practices situated in, and how have they evolved from, different contexts? How does a certain exit measure, for instance, make sense (or not) because of its local factors? And, returning to that unique survey response, could “trust” be better understood within contexts of assessment and accountability, of teaching and training, of writing and evaluating? By implication, might all these questions point to sets of contraries: how we view the teaching of writing as an ongoing practice (assessing?) while the learning of writing is judged as an end-of-term performance (evaluating?); how program administrators make program (progress) reports while deans want to see end-result (budget) justifications; and, perhaps from students’ perspective, how writing is taught as a process with ungraded drafts and yet the paper is given a grade once it is done as a product? Peter Elbow’s 1983 College English article “Embracing Contraries in the Teaching Process” certainly points to the fine line between teaching and evaluation; a similar fine line exists, it seems, between assessment and evaluation, as we in eastern Washington have sought to benefit from the complex relationships among writing, teaching, and exit assessment. In addition to student writers and their advisors and mentors being “stakeholders” in portfolio use (Huba and Freed 247), I argue that, in many university contexts, students’ graduate-student “professors,” non-tenure-track lecturers, and other program administrators should be active participants in the process, since specific portfolio designs can indeed serve both course and program assessment purposes (233-268).

If the best teachers must embrace both their loyalty to students as allies/hosts and their commitment to knowledge and society as guardians/bouncers (Elbow 328), shouldn’t we, as committed instructors and WPAs, view writing assessment, pursued both formatively (as teaching) and summatively (as grading), the same way? Couldn’t contextualized views and practices, situating formative and summative assessments as different but closely related practices within specific contexts, be emphasized to new teachers who are about to face teaching contraries? My answer is a definite yes. Arguing for assessment as research, Brian Huot maintains that “writing teachers and administrators should see writing assessment as part of their responsibility and should initiate assessment efforts in the same way as they might revise curriculum, supervise instruction, or attend to other tasks important to effective educational programs” (178). Such a perspective entails a few predominantly “qualitative” guiding principles for the “new generation of assessment programs,” which, as Bob Broad underscores, should be “Site-based; Locally controlled; Context-sensitive; Rhetorically based; Accessible …” (qtd. in Broad 13).

Currently, our program offers such a localized and encompassing opportunity by overlapping assessing, evaluating, and teaching, thereby bridging interests and practices between teaching and evaluation, course and program, and training and mentoring. For our pre-service teaching workshop, we have learned to situate our exit portfolio and its expectations within the entire quarter’s assessment experience, involve more people in our portfolio subcommittee, collect and publish program research data, and share our practice at different forums (including an assessment workshop at the 2003 4Cs). This article, then, recounts the transition in our exit measure, its history and local context, and its place in our pre-service teaching workshop. As a reference, our program’s practice may be viewed contextually, carrying implications for the development of similar practices in mid-size writing programs.

Context

The Pre-Fall Teaching Workshop

At Eastern Washington University, a comprehensive university in Cheney, WA, the English Composition program offers over 150 sections of writing annually in a quarter system, employing an average of 40 instructors, including graduate teaching assistants, full-time lecturers, and adjunct instructors. Each late summer the Program arranges a mandatory 3-Day Pre-Fall Teaching Workshop (Appendix B) for all incoming graduate teaching assistants and instructors. The Workshop consists of an average of 10 hourly sessions per day, including brief presentations, hands-on sessions, and large- and small-group discussions facilitated by volunteer administrators, faculty, and experienced graduate teaching assistants. The audience averages 25 people, mostly new graduate teaching assistants, with a few interested faculty and adjunct instructors.

Rationale and Session Description: E101 Writing Assessment

While the writing portfolio has been widely adopted as an integral part of the composition curriculum, as discussed in the professional literature (especially by Brian Huot, Wolcott and Legg, Kathleen Blake Yancey, Edward M. White, and others), its application and efficacy as an exit measure in First-Year Composition have not been explored in much depth. Nationally, many writing programs either have no exit measure or base the entire exit exam on a single high-stakes, mostly argumentative writing sample (again, see Appendix A). Despite its apparent popularity in instruction, the writing portfolio does not seem to be universally adopted as a holistic exit measure. Among institutions with exit requirements, few adopt a writing portfolio with clearly defined common requirements; the few exit portfolios in place are certainly not trade-read by all instructors, much less with the detailed rubrics usually associated with such practices. (Related but beyond the scope of this paper, Bob Broad reports on a rubric-free portfolio assessment program at a large research university, a model he calls Dynamic Criteria Mapping.)

A 2003 discussion thread on the WPA-L, posted by a writing program administrator from the California State University system, solicited help in devising a scoring rubric tailored to the exit portfolio; local community colleges in eastern Washington have exit portfolios based on anchor portfolio samples but use no rubric. Further, discussions of FYC exit assessment seldom explore the intricate relationships among first-year composition curricula, the training of teaching assistants, and writing program goals. Our training session on assessment, a major component of the pre-fall teaching workshop, therefore includes the following components:

  1. Defining inherent challenges (see next section “History”) regarding the English 101 exit exam of a mid-size composition program, supported by 28 TAs and 8 lecturers, at a regional comprehensive university in the northwestern United States.
  2. Introducing, through hands-on scoring sessions, two different but closely related assessment tools currently embraced in our new curriculum, namely the Shared Criteria (Appendix C), an analytical scoring tool, and the Exit Portfolio Scoring Guide (See “Discussion” section), a holistic descriptive rubric, for different purposes.
  3. Sharing assessment results from pilots: one summer term and one full academic term.
  4. Situating the need for both formative and summative assessments in effective writing instruction.
  5. Arguing for an overlap between assessing and evaluating.

History of Exit Assessment in Our Program

Historically, the Composition Program required a single essay sample from each composition student as the final exit exam; in practice, a student passed or failed the course based on an in-class argumentative essay written in three consecutive class periods. The practice had enjoyed some success, reflected in part by an average program pass rate of 85% across roughly seventy E101 sections annually since 1999. Within the curriculum and teaching contexts, however, many instructors, including teaching assistants, lecturers, and successive program directors, continued to express pedagogical concerns, especially regarding the wide-ranging pass rates of individual classes, the privileging of classic argumentation as the sole genre, and the growing interest in incorporating a common writing portfolio.

In order to further articulate the relationships between learning and teaching, genre and voice, and staff training and program administration, I began gathering assessment information in 1999 from a variety of venues: graduate seminars, in-service workshops, scoring sessions, and a few informal gatherings each quarter. As a result, I drafted a transition plan in Spring 2000, beginning the process of instituting the Exit Portfolio: we completed a small-scale pilot in Summer 2001, with preliminary data collected and analyzed, and implemented a larger-scale pilot in Spring 2002, together with its data analysis. Throughout that one-year period, the Portfolio Planning Sub-Committee (three lecturers and I) created the necessary instructional and support documents.

Thus far, program research data (including student and instructor surveys and pass rates) continue to indicate that the Exit Portfolio satisfies the main goals of the Program: to teach writing as social discourse that values both process and product, enhance student ownership of text, underscore the holistic nature of revision, and embrace various genres in public and academic writing (largely based on the WPA Outcomes Statement published by the Council of Writing Program Administrators in 2002). The following excerpt from the Exit Information Packet highlights key components for instructors:

1. The Program/Exit Portfolio and the Class Portfolio

The Program Portfolio is the end-of-term, across-the-board Exit Portfolio (see its contents in the “Discussion” section); the Class Portfolio is your class requirement (for which you, as instructor, specify the contents, in addition to the required entries in the Exit Portfolio). For practice’s sake, the Exit Portfolio has a Midterm version.

2. The Exit Portfolio and Your Class Syllabus/Calendar

In your syllabus, specify the time windows for students to prepare the Exit Portfolio and your Class Portfolio by stating their contents, expectations, and due dates for the academic term.

3. Formative Assessment and Summative Assessment

The Program seeks to communicate student writing needs and instructor teaching needs transparently via two different but complementary kinds of assessment. Students need to concretize their strengths and weaknesses, through practice on each paper, and instructors (especially new instructors) need to meet those identified needs, again paper by paper, as their teaching takes place during the term. At the end of the term, however, student work must be evaluated in order to make placement decisions. The Program therefore practices both kinds of assessment: throughout the term, the Shared Criteria are used to assess individual papers; at the term’s conclusion, the Exit Portfolio Rubric is used to assess Exit Portfolios. The Shared Criteria, an analytic scoring guide, helps student writers formatively by providing scores in each of four categories (focus, development and support, organization, and mechanics), while the Exit Portfolio Rubric, a holistic scoring guide, serves the summative purpose of placing students into E201 based on their final Portfolios as end-of-term performance involving preparation, overall achievement, and reflection.

4. Exit Portfolio Entries and You

With the exception of the Reflective Essay, all entries in the Exit Portfolios, including Midterm and Final rounds, will have been scored by you as the class instructor, using the Shared Criteria. You may be able to help your students further by familiarizing them early in the term with the Shared Criteria as formative assessment, while reminding them of exit expectations.

5. Grading and the Exit Portfolio

The Composition Program calls for an Exit Writing Portfolio of three pieces, as specified, while leaving instructors freedom and flexibility in devising their Class Portfolio for end-of-term course evaluation purposes. The Program also provides examples of course grade computation without requiring that the Class Portfolio take a particular shape for final course assessment. Once the requirements of the Exit Portfolio are met, the student will pass the course with a grade point of 2.0 or higher. For the course grade, therefore, the instructor will compute a given student’s grade by further assessing student performance based on the Class Portfolio, together with, typically, other major and minor writings assigned throughout the quarter. (Note: a student who does not meet all class requirements specified by the Program and the class syllabus may still fail the course despite passing the Exit Portfolio.)

6. Timing

During midterm week, the in-class argumentative essay, one of the required entries in the Midterm Portfolio, will be completed in three class periods, as done historically in the Department, with the in-class prompt distributed the day or weekend prior. Near the end of the quarter, another in-class argumentative essay will be completed the same way for the Exit Portfolio, leaving several days (before University Finals Week) for students to complete a reflective piece, for you to do more instruction, and for the Portfolio scoring, a trade-read community event, on Friday. Using the Shared Criteria, you will need to finish scoring the in-class argumentative essays before each scoring Friday. Write scores for folder entries on the back of each essay or on a separate sheet (and initial your scores) to be included in the folder. Your job is to make sure that ALL entries, with the Reflective Essay as the only exception, are scored by you before Friday.

7. Support

The Program maintains a high level of support through different resources and forums via graduate seminars, in-service workshops, the Portfolio Planning Sub-Committee, and the Composition Committee.

During the year of implementation, several assessment initiatives were addressed in our assessment report. I have prepared a succinct version of the report, including the Program goal, its assessment instrument, scores, and recommendations.

Connections to Teaching and Learning

Based on teacher and student survey responses (Appendices D and E), we believe that the exit portfolio implementation has been a success. Both surveys covered, in their respective academic terms, the entire student population enrolled in E101 and the entire instructor population assigned to teach the course.

In the student survey, 51% of respondents reported the “most positive” experience pursuing the exit portfolio and another 35% a “positive” experience, compared to 11% who considered their experience “less than positive.” These reports correspond to responses about students’ overall writing challenges: 58% rated their overall writing challenges most positively and 28% positively, while only 7% reported a less-than-positive experience. Perhaps more importantly, on the question of student preference, an overwhelming 85% of students would prefer the writing portfolio to a single argumentative essay if given the choice. Most students also responded positively in the comment section, supporting the changes for more instructor feedback, more opportunities to revise, and the inclusion of a student-chosen entry.

These student responses parallel instructor responses in the second survey, titled “Relationship between the New Exit Portfolio and Instruction.” While only half of the instructors (mostly TAs) were pursuing the exit writing portfolio for the first time, their overall experience, considering its potential impact on their teaching, fell at mid-range, from 3 to 4 on the 5-point scale, suggesting a moderate shift in pedagogies to accommodate the exit requirement. The areas of instruction most affected included one-on-one conferences, assessment of student work, expectations, and timing. Narrative responses indicate that more work, not less, is needed in each of the affected areas with the exit portfolio in place, but instructors did not seem to mind. Furthermore, an overwhelming 96% of instructors would prefer the current exit practice to the single-essay exit. Several instructors commented on how the two assessment tools helped them address student needs: while the Shared Criteria emphasized needed attention in certain skill categories, the Scoring Guide for the Exit Portfolio offered a holistic assessment for end-of-quarter placement purposes. Instructors’ major concerns included the need for a time window and a pedagogy for guiding students to complete the required pieces without teaching to the test, and for a revision of the reflective assignment as a critical new genre (Huba and Freed; Black et al.; White et al.; Yancey and Weiser).

For our upcoming Pre-Fall Teaching Workshop, we will have a team of instructors, including TAs and faculty colleagues, help explain our system of assessment and share teaching and assessment results. In the spirit of authenticity and collaboration, we plan to include a few E101 student writers in future workshops so that, in addition to survey responses, new and seasoned instructors can benefit from interacting with students in person before the beginning of the term. All these initiatives should further inform a program that commits itself to articulating teaching, learning, and assessment as closely related and productive work.

Endnotes

At the half-day workshop we conducted at the 2003 Conference on College Composition and Communication (Appendix F), the audience gave us strong encouragement by noting the strengths of our presentation, the hands-on segments, and the rubrics. The only reservation, or query, concerned our seeming lack of accommodation for ESL/FL student writers. We must acknowledge this weakness because we do not have the data to identify these students’ specific needs; our only excuse, despite a nationally recognized ELI program and a rigorous ESL reading and writing program on our campus, is that we already designate two ESL/FL E101 sections annually, staffed by specialist instructors, based on a soft enrollment of around 14 students (compared to the average of 20 students enrolled in most regular sections). In another sense, these students are not particularly visible in the statistics offered in this article. Because of the workshop feedback, we are now looking into the design of the assignments themselves first, then the ranking rubric, and then our overall curriculum, from ESL/FL perspectives. More articulation among Composition, academic advising, and our MA-TESOL program seems warranted. (One bridging effort resulting from this initial discussion, for example, led to tutoring opportunities for our ESL/FL students involving M.A. candidates in the TESOL graduate program. At the time of this writing, tutors meet with the program directors of Composition and TESOL periodically in order to address pedagogical, tutoring, and assessment needs. Future plans may include a formal tutoring component within the current MA-TESOL practicum.)

In sum, our program desires a dialogic practice by embracing the many contraries as coexisting pairs. We answer administrative concerns for uniformity, measurability, and accountability by instituting an exit exam with common assignments and a trade-reading protocol, while maintaining classroom autonomy by underscoring the formative Shared Criteria, the student-chosen portfolio entry, the instructor-designed Class Portfolio, periodic surveys, and committee maintenance work. The most recent report regarding our portfolio practice, dated Dec. 3, 2005, is cited within the full report of an external Composition Program review. In particular, the reviewer comments that:

“[t]he portfolio basis of English 101 provides a model of how to perform this strategy [leading to “excellent results”]. The results … indicate that not only do a very high percentage of students pass, but that the introduction of the portfolio program has increased pass rates in [the course]…. In addition, the graduate program offers what is obviously an effective seminar in pedagogy…. [S]tudents in this program have excellent attitudes about teaching and knowledge of how to teach…. [The Director] indicated that he plans to introduce the exit portfolio into English 201. Granted the success in English 101, this seems a reasonable move. The portfolio causes students to reflect on their work, thereby increasing the probability that they will internalize the lessons learned through the work in the class, enabling them to apply those lessons in other writing situations later in their academic or professional careers…. The Composition Program has an excellent balance of currency and future plans…. The pedagogy seminars clearly develop knowledgeable teachers, and the exit portfolio is an excellent strategy for obtaining long-lasting change in student writing attitudes and skills” (6).

While we are certainly happy with the reviewer’s comments, we continue to stress the importance of new and seasoned teachers being stakeholders. We will need to offer development opportunities that are both meaningful and rewarding, for all kinds of instructional staff and for all profiles of student writers. That is, the promise of rigorous pedagogical projects, conferences, and publications as end-result “deliverables” may be enticing enough for tenure-track faculty, but additional monetary and/or service-record incentives definitely help non-tenure-track lecturers and part-timers participate more fully in workshops and seminars supporting further development of assessment projects. From the student writer’s standpoint, the relationship between the Shared Criteria and the Portfolio rubric needs to be transparent for their understanding and practice. Frequent discussions of the Reflection essay as a self-evaluative assignment, in which students show evidence of their learning and ownership in a developing sense, and of the instructor’s role in valuing the assignment, both empirically and empathetically, need to occur in order to ensure an effective student-teacher partnership (Huba and Freed 252-255). To encourage more focused reflection based on learning, we continue to improve the Program’s reflective essay prompt, incorporate in-class workshops on the reflective piece, and support individual instructors’ designs of the Class Portfolio.

Assessment and Trust

Currently, we ask our new recruits, from the very beginning at our Pre-Fall workshop, to take charge of the exit initiative by planning early, documenting progress, and sharing concerns and victories with us via graduate seminars, in-service workshops, and the Composition Committee. We invite seasoned instructors to participate in our mentoring program, facilitate workshops, and seek faculty development venues within and beyond Composition. In the future, we may seek additional resources available at other university offices or programs, such as our Graduate Programs Office, the Teaching and Learning Center, and our Writers’ Center, which, as stand-alone campus-wide units, could support our teaching and assessment initiatives through thematic workshop series via funded cooperatives or honorariums.

We at EWU have clearly benefited from the multiple overlaps naturally existing within any implementation of assessment practices. This otherwise local journey of ours began seven years ago when, as a new WPA at the school, I informally heard graduate instructors’ and their experienced colleagues’ concerns (and complaints) about the in-class timed argument serving as the exit exam. Because of our interest in connecting assessment and evaluation, as the former deals with teaching and the latter with placement, we implemented both the Shared Criteria and the Exit Portfolio, situating these practices in various workshops, seminars, and mid-term assessment, and publishing results periodically. We continue to expand our practices by identifying future stakeholders and resources. When we, graduate students, lecturers, and tenure-track faculty (and sometimes mid-level administrators), get together, formally and informally, we have a common language that includes particular writing skill areas, teaching or tutoring techniques to address those areas, and even administrative tools such as teaching observation and program review. That is, with the Shared Criteria and the Portfolio guidelines as situated and reciprocal learning and instructional tools, we can now document and promote best practices by communicating openly the program and course expectations as common goals, accessible and attainable by our students and their instructors. Collaboratively and reflectively, then, assessment and trust can indeed coexist; in fact, where we are, each is maintained and reinforced by the other.

Appendices

Appendix A: Exit Assessment in First-Year Composition: A Brief Survey on the WPA-Listserv

Returned E-mail Responses as of Sept. 23, 2002
School Name | Program-Wide Exit Measure | Portfolio Design | Rubric in Place
Clemson | N | N | N
CSU Chico | N | N | N
CSU Humboldt | Y | Y | Y
East Conn. SU | Y | Y | Y
Eastern Wash. U. | Y | Y | Y
U. of Findlay | Y | Y | Y
Gonzaga U. | N (dropped, 2001) | N | N
Hannibal-LaGrange | N | N | N
Highline CC (WA) | Y | Y (no common assignments) | Y
Illinois State U. | Y | Y | Y (not trade-read)
Louisiana SU | N | N | N
Lake Superior SU | N (but “in process of developing one”) | N (“could be”) | N
Moravian College | N | N | Y (for individual classes)
Morehead State U. | N | N | N (“pre/post” tests)
North Harris C. (TX) | N (“course before fyc does”) | N | N
Ohio State, Mansfield | Y | Y | Y (adopted WPA outcomes)
Quinnipiac U. | N | N | N
Rutgers U. | Y | Y (“supervisory component for teachers”) | Y (on website)
San Juan College | Y | Y | Y
Stanford U. | N | N | N
Texas Tech | N | N | N
U. of Maine | Y | Y | Y (a “rough” rubric)
UMass-Amherst | N | N | N (except in classes)
U. Michigan, Dearborn | Y | Y | Y
U. Texas, Commerce | N | N | N
U. Texas, Corpus Christi | N | N | N
U. Texas, Pan American | Y (“dept. exam”) | N | N
Utah Valley SC | N | N | N (except in some classes)
U. of Utah | N | N | N
Villanova U. | N | N | N
Washington SU, Pullman | N | N | N
U. Washington, Seattle | N | N | N

Notes:

  1. 11 out of 37 responses said YES to all three questions.
  2. 10 WPAs volunteered additional information or remarks.
  3. 6 WPAs were followed up with additional questions off-list.

Appendix B: Pre-Fall Teaching Workshop: Sample Agenda

Days Two and Three are more assessment-specific. The three-day Workshop adjourns after the Friday sessions.

Day Two: Getting Started: Teaching Writing, Class Scenarios, Campus Culture

Thursday, September 15, 2005

1. Breakfast/social: Coffee, 9:00 a.m.

2. Teaching the Personal Essay: Additional Activities (Danborn, Stokes), 9:30 a.m.

3. Assessment: The Shared Criteria & the Holistic Scoring Guide (Coy, Eng), 10:00 a.m.

4. Assessment: Hands-on Session (Coy, Eng), 10:30 a.m.

5. Teaching the Point-of-View Paper: Brief Suggestions (Corrick, Eng), 11:30 a.m.

Lunch on your own, Noon

6. The Weekly Assignments, hands-on (Corrick, Marr), 1:00 p.m.

7. Library Instruction (S. Milton), 2:00 p.m. (place to be announced, JFK Library)

8. Graduate Studies reminders (B. Donahue), 3:00 p.m.

9. Anxiety Break (Stokes, Sterner, Eng), 3:30 p.m.

Workshop adjourns at 4:00 p.m.

Day Three: Special Interest Sessions

Friday, September 16, 2005
(Special Topics based on Audience needs via sign-ups)

1. Special Break-out Sessions (Bankston, Brown, Buckingham, Lee, Sterner, Eng), 9:30 a.m.

Topics to be determined by participants. Previous years’ topics included: (a) Assessment; (b) Class and Time Management; (c) Surviving Graduate School.

2. Using the Smartboard (J. Williams, Griffin), 10:30 a.m.

3. Using Blackboard, a courseware (P. Lordan, Eng), 1:30-3:00 p.m.

Appendix C: Assessment Criteria

Each of the following criteria is scored on a 6-point range, with 1 being the lowest in the “Low” category and 6 the highest in the “High” category. An essay sample that receives scores of 4, 4, 4, and 5, for instance, reflects middle mastery in Focus, Development and Support, and Organization, with high mastery in Mechanics. Since analytic scoring aims at revealing performance in specific skill areas for student learning purposes, the Composition Program does not recommend adding or averaging the scores.

Criterion 1: Focus

High: The paper has a clear central point. One main point clearly controls the entire paper, and the scope is manageable given the length of the paper and the nature of the assignment. The point is meaningful because it deals with an issue that the audience would likely consider important.

Middle: The paper is not completely controlled by one central point. A central point is evident, but not all of the essay is consistent with that point. The paper contains occasional digressions or irrelevancies. The paper might not stand out in terms of having a point in which readers would be engaged.

Low: The paper is not clearly controlled by one central point. The main topic of the paper is too broad given the length of the paper, or the central point is simply not clear. The paper may be fragmented, with multiple points receiving equal attention.

Criterion 2: Development and Support

High: The paper's major ideas are clearly and logically developed. The paper reflects sound reasoning, and the information is accurate. Readers should respect if not agree with the paper's logic. Major ideas are clearly explained through concrete, specific details. The support is tailored to suit the audience.

Middle: The paper's major ideas are unevenly developed. Major ideas are well developed as a whole, but occasional problems in support, explanations, or accuracy are likely to confuse readers or cause them to question the writer's logic. It is not altogether clear that the support is based around the designated audience.

Low: The development of the major ideas is lacking and/or confusing. Readers would likely find significant flaws in logic, accuracy, or explanations. Major ideas are barely supported or merely repeated. Generalizations are used when more specific evidence is needed.

Criterion 3: Organization

High: The presentation order is clear and logical. Paragraphs and sentences follow a reasonable, coherent sequence. Readers should rarely if ever question the connection between one idea and another. Transitions and/or headings effectively signal the relationships among the larger parts of the paper.

Middle: The paper has an order in which points are discussed, but the relationships are sometimes forced or unclear. The organizational scheme is recognizable, but some jumps in thought are difficult to follow. The writer has a sense of grouping ideas in paragraphs, but some transitions are awkward or unclear. The organizational scheme might be too formulaic and predictable to suit the situation.

Low: The paper is haphazardly or confusingly arranged. Readers struggle in connecting ideas, sentences, or paragraphs.

Criterion 4: Mechanics

High: The paper conforms to accepted conventions of grammar, punctuation, spelling, and capitalization in a variety of sentence lengths and types. A few minor errors may appear, but on the whole the paper follows accepted conventions. Readers will rarely if ever pay more attention to the paper's mechanics than to its ideas.

Middle: There are a few violations in grammar, punctuation, spelling, and capitalization. Although the paper has few errors, there are some in complicated sentences. The paper might contain some spelling errors. Readers will notice the errors but not so much that they discount the paper as a whole.

Low: Errors interfere with the credibility of the writer or with the meaning of the paper. There are grammar or punctuation errors even in simple sentences, and the meaning in a few sentences is not clear because of errors. Even some simple words might be misspelled. Readers will likely question (1) the writer's grasp of formal English or (2) the attention the writer gave the paper. (version 08/14/02)

Exit Reflection Essay

Purpose:
Similar to the midterm portfolio, you want to introduce yourself and your portfolio to your audience and to persuade your readers that you have accomplished the course goals and objectives (primarily the Assignment Outcomes) as outlined by your instructor.
Task:
Your reflection essay should reaffirm what you have learned about the writing process, discuss aspects of your writing on which you are still working, and indicate how you have improved as a writer through the use of the writing process this term. Your reflection essay should introduce the essay you have chosen for inclusion in the final portfolio. You should tell your readers why you selected this essay for them to read, what they should notice about it, what strengths and weaknesses are in the piece, and how you have improved the essay through multiple drafts. Finally, you might include what role you expect writing to take in the rest of your tenure as a student and throughout your life and career. (Like the midterm portfolio, your audience for this essay is essentially anyone who would read your portfolio, but remember that your portfolio will be reviewed by your own instructor.)
Format:
Your reflection essay should be around two pages in length. As with all of your papers this quarter, the reflection essay should be guided by a single purpose. It should flow logically and easily from point to point. The essay should be well developed with details and examples demonstrating how your papers met the Outcomes, and it should be mechanically correct.
Assessment:
As explained in class, the entire portfolio will be assessed holistically. The reflection essay is an important component of the overall portfolio, providing direction and a frank self-assessment of your work for your audience. However, the reflection essay is not assessed outside of the context of the portfolio. As with the rest of your portfolio entries, it should reflect the concerns described by your professor in class.

Appendix D: Exit Portfolio Experience

A Student Survey, May 31, 2002

Dear Student:

The Composition Program is gathering information regarding the pilot application of the English 101 Exit Portfolio. In order to make the implementation effective and tailored to students’ learning experience and Program assessment needs, we would appreciate timely feedback from student writers near the conclusion of this quarter. Please take a few minutes to answer the following questions by reflecting on your portfolio experience. As a part of program assessment, your responses will remain anonymous and will not affect your course grade.

On a scale of 1-5, with 1 being the lowest/most negative and 5 being the highest/most positive, rate the following items—

1. Your “Exit Portfolio” experience this quarter

Results
Response | Number | Percentage of Respondents
5 (most positive) | 23 | 13%
4 | 68 | 38%
3 | 64 | 35%
2 | 14 | 8%
1 | 6 | 3%
*Other | 6 | 3%
Total | 181 | 100%

2. Your overall writing challenges this quarter

Results
Response | Number | Percentage of Respondents
5 (most positive) | 22 | 12%
4 | 84 | 46%
3 | 50 | 28%
2 | 10 | 6%
1 | 2 | 1%
*Other | 13 | 7%
Total | 181 | 100%

3. As an Exit assignment, one that students must pass at the 101 level, which of the following designs would you prefer, if given a choice? Please circle one—

a. The current Exit Portfolio as a folder of essays, including a piece of your choice

Results: 154 (85%) selected this option

b. The Argumentative essay as the single required item, as done in previous terms

Results: 22 (12%) selected this option; 5 (3%) marked *Other.

4. Regarding the current Exit Portfolio design, should there be any changes in the future? If so, what would you suggest for learning’s sake?

Sample responses:

Thank you for your participation.

Appendix E: Relationship between the Exit Portfolio and Instruction

An Instructor Survey 3/15/02

Dear Instructor:

The Composition Program is gathering information during this first year of the English 101 Exit Portfolio. In order to gauge the connection between exit assessment and instruction, we would appreciate timely feedback from instructors at this point, the conclusion of the winter quarter. Please take a few minutes to answer the following questions by reflecting on your overall teaching experience within the context of the E101 Exit Portfolio as an end-of-term measure. As a part of program research data, your responses will remain anonymous. Results will be shared in local and national forums.

Joseph Eng

1. You as an Instructor:

Your staff status—circle only one:

Your E101 Exit Portfolio Experience at EWU—circle only one:

2. Your overall experience this quarter: The Exit Portfolio’s impacts on your method of instruction:

1 (lowest/non-significant) 2 3 4 5 (highest/most significant)

Why?

3. Where applicable, circle all areas of instruction impacted by the Exit Portfolio:

4. As an end-of-term exit assignment, one that students must pass at the 101 level in order to move on, which of the following designs would you prefer if you had to choose one of the two options? Circle one—

  1. The current Exit Portfolio as a folder of essays, including a piece of your choice
  2. The Argumentative essay as the single required item, as done in previous terms. Why?

5. Implementation: Please list any changes you would like to see in the future.

Appendix F: Sample responses from workshop participants following the Half-Day Workshop at the 2003 CCCC

#1 – What was your overall workshop experience?

#2 – What were the most effective or helpful sessions, and why?

#3 – What were the least effective or helpful sessions, and why?

#4 – Please rank your motivation for attending this particular workshop.

[mostly “very high,” with a few “average” responses.]

#5 – Any suggestions for future workshops?

Works Cited

Black, Laurel, Donald A. Daiker, Jeffrey Sommers, and Gail Stygall, eds. New Directions in Portfolio Assessment: Reflective Practice, Critical Theory, and Large-Scale Scoring. Portsmouth, NH: Boynton/Cook, 1994.

Broad, Bob. What We Really Value: Beyond Rubrics in Teaching and Assessing Writing. Logan, UT: Utah State UP, 2003.

Elbow, Peter. “Embracing Contraries in the Teaching Process.” College English 45 (1983): 327-39.

Huba, Mary E., and Jann E. Freed. Learner-Centered Assessment on College Campuses: Shifting the Focus from Teaching to Learning. Boston, MA: Allyn and Bacon, 2000.

Huot, Brian. (Re)Articulating Writing Assessment: Writing Assessment for Teaching and Learning. Logan, UT: Utah State UP, 2002.

Riordon, Daniel. “Composition Program.” A Review of Programs at Eastern Washington University: Technical Communications, TESL, and Composition. December 2, 2005: 6-8.

White, Edward M., William D. Lutz, and Sandra Kamusikiri, eds. Assessment of Writing: Politics, Policies, Practices. New York: Modern Language Association, 1996.

Yancey, Kathleen Blake, and Irwin Weiser, eds. Situating Portfolios: Four Perspectives. Logan, UT: Utah State UP, 1997.

