I wanted to share this new post I wrote as a guest writer on the Teacher’s Discovery World Language Blog. I had the chance to reflect on a question that I am often asked about Teaching with Comprehensible Input and the AP World Language and Culture exam or class. So check out my thoughts about “If I Teach with CI/TPRS/Acquisition-Driven Instruction, Will My Students Be Ready to Take the AP® World Language and Culture Exam?”
As in any school year, I have tried many new ideas, techniques, and changes in my classroom environment. In fact, I am currently attending the wonderful Central States Conference to continue to learn more. In this post, I want to highlight 3 changes that I have made For Good! I am still working through the kinks on some of them, but overall, I have seen positive results!
#1) I have finally gone “desk-less.”
So this year, I have tried the desk-less experience and I am very pleased with the results. There is so much to consider when making this change: acquiring the chairs, maintaining classroom management and dynamics, figuring out how to still have some flat surfaces in the room, and so much more. One aspect that I have been grappling with is whether going desk-less benefits all students. My conclusions so far are the following: Firstly, I love the desk-less classroom and feel that it has been a positive change for student engagement and focus. I have much more control over students' use of cell phones during class, and I think there is more engagement since students do not have the ability to put their heads down on a flat surface. Plus, I think the communicative value of easily being able to chat with a neighbor helps break many barriers.
Another issue I’ve been dealing with is how to place the chairs. Since August, I have tried many different seating arrangements with them. In fact, I had to start the year off with uncomfortable school folding chairs before my newly ordered chairs arrived. My chairs are replacing rather large desks that are still around the perimeter of the room (which has been both good and bad). I still want my students to have the desks, or more ideally tables, for longer essay-writing assignments. Since I have not yet been able to get tables, I still need the desks. And since they are there, many students want to sit in them. In the beginning, I allowed some to try it, and right away I saw a lack of engagement and some students falling asleep (which never happened with the chairs). So those students are no longer sitting in desks, but others are. I have found that many of the students with attention issues are faring better while sitting in the desks on the sides of the room. These students have requested to sit in the desks, and considering their overall success and the documented services some have that require preferential seating, I have made a modified seating arrangement with 5 desks in play in addition to the chairs. If I see these students not finding success in the desks or taking advantage of the flat surface, I will talk with them and then remove this option, but for now, I do believe the modification is helping a few students. Overall, though, I think the chairs are helping the majority of my students in class.
#2. My new bathroom pass procedure.
I have grappled with this one for the past five years in my current school setting. The “can I go to the bathroom” phenomenon is just a normal daily occurrence for us as teachers. Of course, we are all left to wonder whether a student really needs to use the bathroom or just wants to use the cellphone. When I am in the halls of my school, too often students are engrossed in walking around and texting. So, in order to respect my students’ need to use the restroom, I have a trade-your-phone policy for the hall pass. If a student wants to use the restroom, they put their cellphone in a basket at the front of the room and take the hall pass. I have been very pleased with the procedure; it has been implemented with very little pushback from students, and I believe fewer students are leaving my class because of it.
#3. Offering Extension Structures on Weekly Structure and Cumulative On-Going Structure Quizzes
I have a lot of internal debate about Structure Quizzes because students can easily just memorize a Quizlet set before class without really acquiring the structures. In order to meet the needs of my students, who bring such a vast spectrum of Spanish and learning abilities to the same classroom, I have developed this system. Many of my students have reacted very well to the Structure Quizzes as I have set them up this year in my Spanish II, III, and IV classes (please know this is not my only way of assessing my students). The scoring is a bit complicated, so I will try my best to explain it.
Working within the confines of our school schedule, I see students for 50 minutes 3 days a week and for one 90-minute block (so I see them 4 times a week). What I have determined is that most of my students can successfully learn, and many acquire, 4 target structures a week with this type of schedule. But for many of my students, focusing on only 4 target structures is too easy and does not challenge them. So this year I began offering Extension Structures (one being my weekly Password Expression). These are structures that offer linguistic variations and/or additional pieces of language to help them communicatively. During first semester, I only included my weekly Password Expression and 3 other structures, but for the second semester, I have doubled the extension structures to the Password plus 7 others.
My goal for all students is that they know the 4 target structures. If a student knows the “Weekly 4 target structures,” then they have met my expectations and earned a 13/15 on their structure quiz, on which they must write the English meaning of each structure. I then have 4 extension structures on the quiz, and they can earn 1 point per correct English meaning of an extension structure. So a student can exceed the class expectations and earn a 14, 15, 16, or 17 out of 15 on this quiz. But if a student misses 1 of the 4 target structures, then the extension structures are only worth 0.5 credit each, and since I will not work with halves in the grade book, they must get two extension structures correct for a full extra-credit point. I have devised the system so that the lowest score a student can earn is a 53%, or an 8/15 (for missing all 4 target structures and all of the extension structures). Mathematically, the point values help keep failures few, and more students are pushing themselves to learn the 8 to 12 structures instead of simply the 4 target structures. Even when 12 structures could be assessed, I still only test 8 structures so the quiz mirrors the 15-point format with the math explained above. This same system applies to the weekly “On-going Cumulative Quizzes” (8 questions using a variety of 4 former target structures and 4 former extension structures).
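To make the math concrete, here is a quick sketch of the scoring in Python. One assumption on my part: I give each correct target structure 1.25 points so that the endpoints work out (8/15 with no targets correct, 13/15 with all four); in the grade book itself I stick to whole numbers.

```python
def structure_quiz_score(correct_targets: int, correct_extensions: int) -> float:
    """Score a 15-point structure quiz with 4 target and 4 extension items."""
    assert 0 <= correct_targets <= 4 and 0 <= correct_extensions <= 4
    # Base of 8 sets the floor at 8/15 (53%); all 4 targets correct = 13/15.
    score = 8 + 1.25 * correct_targets
    if correct_targets == 4:
        # Met expectations: each extension is a full extra-credit point.
        score += correct_extensions
    else:
        # Missed a target: extensions drop to half credit, and since halves
        # do not go in the grade book, only whole points count (two correct
        # extensions = one point).
        score += (correct_extensions * 0.5) // 1
    return score

print(structure_quiz_score(4, 0))  # 13.0 – met expectations
print(structure_quiz_score(4, 4))  # 17.0 – maximum extra credit
print(structure_quiz_score(0, 0))  # 8.0  – lowest possible score (53%)
```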
What have I found?
I have found that providing the structures has helped focus many of my students and that most love and value the opportunity for extra credit and want to exceed the 13/15 – B score. When I surveyed students at the end of semester 1, almost all reported that the chance for extra credit was a motivating way to extend themselves and learn more. Again, we have to think about what helps the students with whom we are currently working.
With regard to the “4 target structures,” I work very hard to help students acquire these structures during class. It is not the same for the Extension Structures, although my weekly Password Expression does get a lot of exposure. I do not focus on targeting my Extension Structures; they are often pulled from the 3rd or 4th version of an embedded reading, a linguistic connection/pattern that many students can learn, or something from our textbook that frankly not all students need to know. In some classes, I do use Quizlet Live as an activity with all structures because I keep an accessible On-Going Cumulative list per quarter for them on my Quizlet class page.
I feel the On-going Cumulative Quizzes are essential to helping with long-term retention. As I have mentioned, I do not think all students are acquiring all structures, but I believe most are working to their full potential, and the extension structures are pushing the honors and gifted students in a way that I have not been doing as intentionally. I am still working through the right number of Extension Structures and how it helps or hinders overall long-term retention and motivation.
Here is an example of the new Quiz Format for Target and Extension Structures and an explanation of the scoring guidelines.
As I started the post, these are 3 changes I have made this year For Good (the WICKED musical reference was fully intentional). Thanks for reading.
Short and sweet is the goal of this post. If you have followed me in the past, you know that brevity is not my strongest suit but today let’s hope.
I wanted to share an easy and effective way to test listening comprehension using a simple Quizlet-derived “Test.” This means there is very little work for me.
To begin, I use Quizlet and love the many resources that become available to students simply by my creating a new structure/vocabulary set. For the students, there are games, flashcards, practice tests, Quizlet LIVE as an in-class game, and now it even speaks to you in pretty good Spanish. For me as a teacher, I love that I can have class pages, upload sets to Google Classroom, combine sets to easily create new ones and comprehensive lists for games like Quizlet LIVE or Around the World, and make tests in four formats: three that allow for easy grading (multiple choice, true/false, and matching) and one, fill-in, where students have to write.
This past semester I have taken advantage of the multiple choice format for listening comprehension quizzes. Here is how:
Using a Quizlet set (often my comprehensive “On-going” quarter list), I generate a few Quizlet multiple-choice tests with responses in English and print them out (FYI: although the online version has bubbles for the multiple-choice responses, they print with A, B, C, D). Making many versions of these tests happens with just the click of a button, so I then have tons of multiple-choice questions to choose from, like this:
Target Language Structure: “Quería ser”
A) he liked to
B) he wanted to be
C) he needed to be
D) he wants to be
Then, to make the listening assessment, I cut out the questions I want to use and stick them on a sheet of paper. Using a black marker (which is easier for me than white-out), I black out the “Target Language Structure” for each of the chosen multiple-choice questions. Then I provide my students with a bubble/scantron sheet – we use the wonderful www.gradecam.com, but I know there are so many other options available to teachers.
When I give the assessment, I just make up sentences on the spot in the target language and have the students choose the correct translation from the four choices. I do try to jot down each sentence so I can repeat it two to three times. So, following my example above, I would say in the target language, “In his past, he wanted to be an artist,” and students would bubble in “B” as the correct answer. To differentiate this, you could use more or less developed sentences.
What I have loved about this is that the assessment reflects the structures I have taught and provides a quick listening check that is very simple to grade using a scantron/bubble sheet. It is not my only form of listening assessment for the quarter, but it is one that captures what has been taught. It has been a win-win all around. Short and sweet!
I cannot believe that the semester is already complete, and of course my blog has not had much attention. I certainly know how I operate, and I need the personal challenge of setting goals with deadlines because without them too many other goals and priorities come first. This year I did not set the monthly posting goal, so during this time of New Year’s goals and resolutions, I will do some self-reflection with regard to all of my current projects and set some sort of goal for my 2019 blog posts. Now for my five-year reflection on semester assessments!
The end of the semester always brings a mad rush of grading and assessments. I know, I know, if I would not assign 4+ assessments per prep, I would not have to grade so much. And yet, I still continue to assess, assess, and assess at the end of the semester – and overall I still think it is a good thing for both my students and me as an educator.
Five years ago, my department agreed to develop semester exams using a modified IPA, or Integrated Performance Assessment. I cannot say that we are using IPAs in the original way that ACTFL describes in its publication “Implementing Integrated Performance Assessments,” but I will say that many of the elements are there, at least enough to assess the modes of communication in addition to structure, grammar, and vocabulary content. In fact, if I graded my students based only on a multiple-choice structure, grammar, and vocabulary exam, their grades would be quite low and would not reflect what they in fact know and can do.
So, with five years of tweaks and adjustments, I am pleased to see how my students’ midterm/final exam grades actually reflect just that – what they know and can do. With regard to quarter grades, I do not think they always reflect exactly what my students know and can do. Even though I categorize and weight my quarter grades as “input – listening & reading (31%), output – speaking & writing (31%), content/structures (31%), and homework (7%),” these grades tend to be higher than most of their midterm or final assessment grades. I do know that during the quarter, I always give my students opportunities to grow and improve (even by implementing a redo policy), so students show what they know and can do in an environment that celebrates showing what one knows rather than what one does not. That being said, I do want my midterm/final assessments to differ from the quarter grade. In my district, the midterm and final exam grades are each 10% of the official final grade for the course. As you will see, I feel the way I calculate this summative evaluation helps to more accurately report my students’ Spanish abilities.
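For anyone who wants to see the quarter-grade weighting spelled out, here is a tiny sketch in Python using the exact category weights above (the sample student scores are, of course, hypothetical):

```python
# Quarter-grade category weights as stated above:
# input 31%, output 31%, content/structures 31%, homework 7% (totals 100%).
WEIGHTS = {"input": 0.31, "output": 0.31, "content": 0.31, "homework": 0.07}

def quarter_grade(scores: dict) -> float:
    """Weighted average of category scores, each on a 0-100 scale."""
    return sum(WEIGHTS[category] * scores[category] for category in WEIGHTS)

# Hypothetical student: strong input/output, weaker on content quizzes.
print(quarter_grade({"input": 92, "output": 88, "content": 75, "homework": 100}))
```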
In most of our world language courses, our midterm/final exams contain four sections administered over four days (at least one of which is a 90-minute block). The sections and grading percentages are as follows: Interpretive Reading 25%, Interpersonal/Presentational Speaking 25%, Presentational Writing 25%, and multiple-choice structure, vocabulary, and grammar 25%. Although our students can prepare for this assessment by knowing how to communicate about a series of themes and by knowing the content structures, I feel the final calculated grade is a fair and valid representation of what a student knows and can do; rarely does a student’s grade not reflect this, and as I mentioned, it is a more accurate representation than a quarter grade.
In this current climate of constant student testing, I often think about how school, or even my classes, would and could be different without grades – even without the highly coveted A+ (97% – 100%+). Even though testing stresses out our students, they really have a hard time operating without the stress. It is similar to my earlier comments about needing to set goals for writing blog posts. Without deadlines and some accountability measures (quizzes and tests in most classroom settings), most high school students would not choose to study school subjects outside of school. Would today’s teenagers be different if they were trained from early on in their education to learn for the sake of learning and not for the sake of testing? I do not have this answer, but I do know that even I need deadlines and goals as a successful adult.
Ok, I have digressed enough from the topic of the actual assessment. Again, ours are not IPAs (Integrated Performance Assessments) in their truest form, but we do use the Interpretive Guide Template, which can be found on the Ohio Department of Education’s site. I really love this template and think it does a fantastic job of helping students engage with and delve into understanding a text. The set-up of the Interpretive Guide also assesses critical thinking skills that help place world language learning in a true academic context; and yet, when world language learning becomes too academically charged, not all students find success and attain a second language (more on that another day). Anyway, I think the Interpretive Guide does a great job of making texts that are beyond a student’s reach more comprehensible. Students are required to identify keywords, the main idea, and supporting details while also making cultural, content-based, and linguistic inferences, all based on an authentic text (an article, brochure, or website). Careful crafting of the Interpretive Guide must occur. For teachers, these exams do take a lot of time to prepare, but of course, like anything, once created, one has a wonderful assessment piece.
With regard to the output-based sections of our exams, the tasks depend on the language levels of our students. These summative Presentational/Interpersonal Speaking and Presentational Writing sections have been a great way to help our department focus and plan instruction based on the communicative tasks and expectations for our students. Following the Proficiency Guidelines, found at http://www.actfl.org, students at novice levels are able to reproduce more memorized and rehearsed language on these tasks. For both output tasks, our students are measured by their ability to provide evidence at the sentence level, and they can exceed our expectations by providing more developed sentences with accuracy for approximately 5 sentences per task. We use the rubric found on my resource page to assess the Interpersonal/Presentational Speaking and Presentational Writing sections. Each section is created using the agreed-upon themes for the semester. In our level 1 and 2 courses, students are given the tasks ahead of time to prepare – so yes, many students are using a mix of acquired and memorized, rehearsed language, which makes the speaking more Presentational than Interpersonal. I still believe that this is very appropriate for meeting the needs of all of my students and not just the strongest. In all levels, on assessment days, students are not able to use notes or anything they prepared – my verb word-walls are not up for these exams either. In levels 3 and 4, the speaking task is more of an Interpersonal task, where students must respond to unrehearsed questions based on the themes studied throughout the semester. Depending on the year and timing constraints, we have done one-on-one interviews with students as well as recordings. Live recording, with the teacher reading questions and students responding via Google Voice, FlipGrid, or Audacity, has been the easiest way for us to assess speaking with a large number of students.
Overall, because of this type of assessment, we have seen better quality output from our students over the past four years. It has been positive to have these common assessments given twice a year instead of a more traditional chapter approach, which in my opinion would cripple me as a creative educator trying to meet the daily needs of my students.
A point of internal struggle for me has been whether or not to assess with a multiple-choice content section using structures and vocabulary from the semester. Over the past five years, I have worked through this personal dilemma, and my conclusion is that I do in fact like an objective multiple-choice section based on content. We have a 90-minute block during which we must give an exam, and depending on which exam time slot we have and when grades are due, the multiple-choice format has been necessary – but never without the other three components. I feel that the content-based sections are positive for students but in no way should be given without the other communicative tasks of the exam. Many of my students would fail the exam if it were based only on the content piece. Without the other three components, the content piece is not a true reflection of what my students know, can do, and can process in Spanish. Yet having the multiple-choice section, even if simply generated by Quizlet’s test generator, helps my students organize, reflect, and, “gulp,” study for Spanish. It serves as a “deadline” and ongoing content component for the course (although this year I have been giving frequent ongoing structure quizzes, which students have reported as positively helping their growth). Having this comprehensive multiple-choice section, especially for the midterm, provides me with feedback on students’ recognition and retention. With these results, I am able to better plan the next quarter and reflect on the effectiveness of my instruction.
To sum this up, I could not go back to the way I tested for many years – a two-hour block of just writing and content: structure, grammar, and vocabulary. Using elements from the IPA model (the Interpretive Guide, for example) and always including speaking and writing components has not only helped the trajectory of our department in assessing for communication, but it also produces a more accurate grade of student performance and content knowledge.
As I mentioned throughout the post, I need goals and deadlines. In fact, my life is full of them and I do live from deadline to deadline. So I will make this public goal for posting for second semester: I will post once a month. I will also try to remind myself that a post does not need to be three pages long either. 🙂 Have a great semester and happy assessing!