Friday, March 09, 2007

Dan King

Link to Screencast

Dan King: This will kind of end up being the feed for this session.

This talk is not exactly what my abstract described.

Normally, when you submit an abstract, you expect the talk to end up pretty similar to it.

But as I was working through the information from the environmental chemistry course, which is the focus of the abstract, it didn't seem like it was telling enough of the story. I started to look at some of the results from using the same two technologies in a different course, and comparing the results from those two courses actually seemed more interesting.

So, what I'm going to focus on here is looking at technological use inside the classroom and outside the classroom and the relative effectiveness of each.

So, I'm going to focus on two different courses. The first one is Chemistry in the Environment, an environmental chemistry course; that's the one actually mentioned in the abstract, so I'll be talking about data from two different years. This course is taught in the fall. Enrolment this past fall was 40 students; the previous fall it was 25 students.

The student population for that course is upper-level undergraduates and first-year graduate students. The course is primarily designed for environmental science and environmental engineering students, and it's required for those two majors. It also draws a bunch of other majors, and that partly accounts for the fifteen-student jump from 2005 to 2006.

Suddenly we're starting to get a bunch of other students: chem majors, some bio majors, civil engineering, chemical engineering, some random students. [inaudible]

The other course I'm going to talk about is a very different course: physical chemistry, a junior-level class, enrolment about 34 students. I'm only going to give data from that class for one year, because it's actually taught in the winter term, so I'll use the data from last year. This year's class is meeting right now, and in fact I'm missing class right now.

The student population in that class is all undergraduates, chemistry and chemical engineering students. One more comment about environmental chemistry: the graduate and the undergraduate courses actually meet together. They have slightly different requirements and different course numbers, but they do meet in the same classroom, for financial reasons.

Most of the graduate students are working full time: they're students, but they're working at a company and coming back to get their master's. So their chemistry is somewhat suspect; some of them haven't had it for twenty years.

So, I'm discussing two primary technological tools. The first one is in-class: the personal response devices, or clickers. We'll come to those in just a second. The way I use them in class, each student gets assigned a specific clicker. You'll notice on the back there is a number, so when the students come in, there is a list; they find their name on the list and they know which clicker to take.

My clicker questions are integrated throughout the lecture, and as the students submit their responses, they are recorded on my computer, so I have those data to work with at different times. I use them the same way in both the physical chemistry and the environmental chemistry classes.

Outside of class, the technological tool I'll address here is the use of a discussion board on the course management system. We use WebCT, so the discussion board is accessed through WebCT, which is where the course home page resides.

I encourage them to post anything of interest on the discussion board, but it generally slips into just a discussion of the weekly homework problems. Every once in a while a student will get brave and post some random comment, and there will be a little bit of discussion about that. Realistically, for the most part it's "I'm having a problem, I don't understand how to do number four. Can someone help me out?"

So there are two primary clicker types, IR and RF: infrared versus radio frequency. One big difference is the cost; IR clickers are much cheaper, but the RF ones work much better for larger classes.

The IR clickers require a line of sight, so they work like your TV remote. Each receiver can only handle about thirty to eighty clickers, depending on which brand you purchase. The RF ones, radio frequency, work more like your garage door opener: you don't need line of sight, and most of those on the market can handle up to a thousand clickers per receiver. When the students submit their responses, those all come to a little receiver plugged into a USB port on my computer. So, very portable.

There are lots of different brands, and some publishers have specific preferences that they bundle with textbooks. My choice is Turning Technologies; I chose that for several different reasons.

So, first question, "Have you ever used clickers in the classroom?"

What you need to do here is just press the number associated with the answer, either one or two; you don't have to click "go" or log in. Just hit the number and it should record. What you see here is that I can track how many students are responding, and if I want to look at who specifically has responded, I can do that as well. I can pull up a little grid, so if anyone changes their answer, their box changes color.

So, if anyone wants to change their answer, you can. The first time I use the grid in class, the colors just go back and forth, as students love to play with the colors. Usually after the first few times they tire of that and go back to just answering the questions.

So, once I'm convinced that everyone, or at least most people, have submitted their answer, I usually have a little ball drop that tells them they've got ten seconds left to submit their response. Then what shows up is a graph, totally anonymous to the other students: none of you know which four or so of you have used clickers in the classroom, so it gives anonymity to the students.

I know who responded with what: the responses are recorded, and each clicker is assigned to a specific student, so if I want to go back and look at a specific student, I can get that information.

So, clicker logistics: where do you get the clickers? Two primary ways. One, you can have the students purchase them; a lot of textbook companies will bundle them with the textbook. Or you can purchase a set yourself. That's what I did, I purchased my own set. The physics department here has purchased the same ones, but they bundle them with their system.

I purchased my own set for a couple of different reasons. One, because I had the money to do it. Another was that I teach one lecture section of a multi-lecture course, all with common grading, and the other lecturers are not using the clickers. I do not feel comfortable asking my students to pay an extra fee just because they happened to sign up for my lecture section.

It's a common course, not a separate course: common grading, common exams, everything else. The only thing that would have been different was that they happened to have a free spot in their schedule at eleven o'clock, so I didn't feel right asking them to spend extra money. So what I do is distribute and collect the clickers every day in each class; obviously, if the students have their own, they can just bring them.

Some of the other things you'll need are a computer and a projector. You need a computer to collect the responses, and you need a projector if you want to display the results immediately. If you don't want to display them immediately, you could just have the computer collect the data, look at the screen yourself, and write the relevant responses on an overhead; then you wouldn't need a projector. For instance, I could have just written on a transparency: 79 percent no, 21 percent yes.

In the physical chemistry course I usually have about two questions per 50-minute lecture. In the environmental chemistry course, about four clicker questions per three-hour lecture, although it's really more of a three-hour class: I don't actually lecture straight through, they have group activities in the middle, but that's a whole different topic.

I don't just do these for random purposes; there are specific learning objectives associated with using the clickers. One of the learning objectives is to test prior knowledge, so I will ask a question at the beginning of class. For example, I may put up a question like this for my physical chemistry class. The correct answer here was actually number four. This is information the students should have had from their freshman chemistry class, so I'm really getting a feel for what they remember.

So, not only does it tell me that, OK, two-thirds of my students actually remember some of their general chemistry, but depending on what other options I give them, I can get some additional information. The fact that nobody chose number three means I don't have to review the fact that, to get this answer, you don't just add up the numbers in front of each chemical. So that's important information for me, and I can use it as well.

Another learning objective is to test student learning, student understanding of a concept I've just introduced. For example, I may ask a question like this; this one actually comes from the environmental chemistry class. It's a question I ask at the beginning, and the correct answer is number two. Only about half of the students knew the answer, with a wide range of responses. Then I present the lesson, and in theory, at the end of the lesson, if I've done an effective job, everyone should know the answer. I got lucky here: most of the students actually got the correct answer at the end.

In addition to giving me some information, you will often find that this is a very humbling experience. You think you've explained something so eloquently, and then you put the question back up and get a distribution where only 50 percent got it right, even though when you asked if there were any questions, they said no, we understand you perfectly.

But it also lets the ten percent of students who got it wrong know that they are not up to speed with the rest of their classmates. So the hope is that it's not only information for me, but information for the students. Sometimes when they all get it wrong, hopefully that's a wake-up call: oh, I don't understand that as well as I thought I did.

Man 1: Do they know that you've been tracking individual things?

Dan: I don't mention it explicitly, but if they ask, I tell them that I do collect that information, although I've never used it and it never gets displayed to anyone. So I don't make a specific announcement, but if they ask, I do tell them.

OK, so one of the ways that I've used the clickers is as an assessment tool: are the students retaining the information? Here are some exam results from the environmental chemistry class. These are five questions from the mid-term exam, and you can see the corresponding percentage of students who got each question right during class.

On the corresponding mid-term questions, they did significantly better on almost all of them. On number three they actually did a little bit worse, although I'm not sure that's statistically different. For number five, I didn't ask a corresponding question in class, and it turns out I didn't need to, because they pretty much all knew that information. If we look at the final, we see similar results. Again, not always 100 percent: on number two they didn't do well in class, and they did better on the exam, but still not perfectly. But it does give me a measure of student learning. So it's not only a tool for getting students engaged in the classroom; I can also use it as an assessment tool.

Man 2: Is that the same exact question?

Dan: Sometimes it's the same exact question, and sometimes it's just similar content. In the physical chemistry class, I think with this one it was the exact same question.

And here they really remembered those questions, but I'm still not sure what happened with number three. [laughing] So, they got it in class and lost it by the exam; it's an outlier. The clickers won't solve all your problems, but they give you a way to measure your effectiveness.

One thing I wanted to look at was whether I could find a correlation between students using the clickers in class and their final grade. What we see here: the y-axis shows the percentage of clicker questions students answered during the class. Not whether they got them right or wrong, just what fraction of the questions they answered over the course of the term. As you can see, there is a reasonable correlation for the physical chemistry class, but pretty much no correlation for the environmental chemistry class. So, whatever was affecting their final grade, it was not whether or not they used the clickers in the classroom.
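The talk doesn't say how the correlation was computed, but the analysis is easy to reproduce. A minimal sketch in Python, where every number is hypothetical since only the qualitative result is reported:

```python
# Sketch of the clicker-use vs. final-grade correlation described above.
# All data are hypothetical illustrations, not figures from the talk.
from scipy.stats import pearsonr

# fraction of clicker questions each student answered over the term
answered = [0.95, 0.80, 0.60, 0.90, 0.45, 0.75, 0.85, 0.30, 0.70, 0.55]
# the same students' final grades, in percent
grades = [92, 85, 70, 88, 65, 78, 90, 55, 74, 68]

r, p = pearsonr(answered, grades)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```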

Woman 1: I'm curious why you used just the response rate rather than the success of the answers.

Dan: Oh, it's just a matter of time to work through the data; that is an eventual goal. But from some preliminary results I've done with the physical chemistry class, I've seen a similar correlation, though not necessarily a huge one.

Woman 1: You'd actually have a confounded correlation, right? They may learn from the clickers, but otherwise it would be prior knowledge: they already knew it, so now they know it.

Dan: Right, well, I guess it depends on how you set up the correlation. You can plot it in many different ways: it's not always just questions right, it's how many students got it wrong versus how many got it right on the exam. You can set up little tables to look at that.

At the end of the term, I put some questions on our course evaluations asking students what they thought. These were all questions on the end-of-term evaluations: the clickers made me more likely to attend class, more likely to participate, more focused on the lecture, helped to improve my understanding, or made attending class more fun. Generally we saw that the physical chemistry students seemed to feel they got more out of using the clickers than the environmental chemistry students did, which matches the positive correlation we saw with students actually responding in physical chemistry versus environmental chemistry.

Man 3: On that "more likely to attend class" item, did you compare it with measured attendance? Because you have that data, right?

Dan: I have not gone back and looked specifically at that; well, there's no way for me... I can only do percentages, and that's a tough thing to measure: how do you actually measure their likeliness to attend class? The one thing I have done is, for both these classes, there's a small component of their grade tied to using the clickers. A small part of their participation grade comes from answering seventy-five percent of the questions. Whether they get them right or wrong, if they answer 75 percent of the questions, they get a small percentage of their grade.
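To make that grading rule concrete, here is a minimal sketch; the 75 percent threshold comes from the talk, and the example counts are hypothetical:

```python
# Participation credit as described: full credit for answering at least 75%
# of clicker questions, right or wrong. The counts below are hypothetical.
def participation_credit(answered: int, asked: int, threshold: float = 0.75) -> bool:
    """True if the student answered at least the threshold fraction of questions."""
    return asked > 0 and answered / asked >= threshold

print(participation_credit(18, 22))  # True: 18/22 is about 82%
print(participation_credit(12, 22))  # False: 12/22 is about 55%
```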

Woman 2: Did you find out from students how they felt about these new [inaudible]

Dan: In general, those responses are actually a pretty good measure of older students versus younger students, because the environmental chemistry class is two-thirds to three-quarters graduate students. OK, we're going to talk next about the discussion board, so please answer.

[silence]

Dan: OK, so one of the nice things about using the clickers is that, because I have this information, I can see that 94 percent of you have used some type of course management system. I can now skip these slides.

[laughter]

Dan: Because those were slides just to show you what my course website looks like, and since you all use those, you don't need to see that. So the only thing I'm going to show from those slides is the listing of one week's discussion topics. You can see that there are several threads with only a single posting, and several with multiple responses. What I'm going to do is look at the average number of postings, the average number of threads, and the postings per thread in each class.

Dan: So, for the physical chemistry class, the students are required to post three times over the course of the term, a very minimal posting requirement. The first year I did this, I did not require anything, and no one posted. It was a desert. So I figured, let me just do something to at least get them started, and hopefully it'll take off. It turned out that for physical chemistry it did not really take off at all, so you can see: average postings per week, 11, with 34 students in the class. Average threads per week, 5. Average postings per thread, 2. So pretty much, one person posted a question, another person posted an answer, and that was it; not much of an extended discussion. Average postings per student, 3. Course requirement, 3.

[laughter]

Dan: However, despite the fact that they spent very little time and did the minimum required posting, they read everybody's posts: 104 readings. Now, this isn't broken down, so some of those 104 are people re-reading the same post over and over again. I haven't gone through that; it would be extremely time-intensive for me to look at how many times each student read each post.

Man 1: What's the total number of postings?

Dan: That's not a number I have.

[silence]

Dan: For the environmental chemistry students, a similar requirement: basically three postings for the undergraduates, which mirrors the physical chemistry requirement, and five postings for the graduate students, part of the differentiation between the two courses. So, a lot more postings for the environmental chemistry students. The two numbers here are just 2005 versus 2006, and you see similar results between the two years. Keep in mind that the average number of postings per week is 34, and there were actually about 10 fewer students in 2005 than in the P. Chem. class. There are a lot more distinct threads, where a thread is each different topic, and the threads run to about 5 postings each: an initial posting with a question, and four responses. If you look at the results on a student level, average postings per student, we're up to 10. If you break that down, the undergraduates are still sticking to their required minimum, but the graduate students are generally much higher than their minimum, so the graduate students really seem to appreciate this technological tool.
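The talk doesn't describe how these averages were tallied, but they fall out of simple counting once the postings are in a structured form. A minimal sketch with hypothetical records (a real WebCT export would need parsing first):

```python
# Tallying the discussion-board averages quoted above.
# The post records are hypothetical illustrations.
from collections import Counter

# (week, thread_id, student) for each posting
posts = [
    (1, "hw1-q4", "alice"), (1, "hw1-q4", "bob"),
    (2, "hw2-q1", "carol"), (2, "hw2-q1", "alice"),
    (2, "hw2-q3", "dave"), (2, "hw2-q3", "erin"),
]

weeks = {week for week, _, _ in posts}
threads = Counter(thread for _, thread, _ in posts)
students = Counter(student for _, _, student in posts)

print("avg postings per week:", len(posts) / len(weeks))
print("avg threads per week:", len(threads) / len(weeks))
print("avg postings per thread:", len(posts) / len(threads))
print("avg postings per student:", len(posts) / len(students))
```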

Woman 1: Were your undergraduates [inaudible]

Dan: Oh yeah. It was a common discussion board, so they were all reading each other's posts. So again, students are reading a lot more than they're posting; they're not just going in, posting, and then leaving... Even though the undergraduates are only posting their required minimum, they're generally still reading and trying to get some information. They're just doing the minimal amount of initial intellectual work required. The requirement for the postings was that each had to be a complete thought; they couldn't just post, "I agree with so-and-so."

I looked at the correlation between the number of postings to the discussion board and the course grade. You see a somewhat weaker correlation for the physical chemistry class, which you would expect since they're not posting as much, but it's still somewhat of an indication of which students are actively engaged in the material. There's a little bit better correlation for the environmental chemistry students than we saw for the clicker responses, but still not a great correlation.

I also looked at the correlation between students clicking in class and posting to the board, that is, whether the same students are engaged in both. There's absolutely no correlation in the environmental chemistry class, and an in-between correlation in physical chemistry, as you might expect.

One way I assessed how the students felt about these two technological tools was an online survey known as the SALG, the Student Assessment of Learning Gains. It's a survey designed to ask students to rate the effectiveness of various course aspects on their learning. It's online, it's customizable, you can change the questions, and it's very accessible and easy to use. All the questions are set up on a Likert scale with five choices, from number one, "no help," to number five, "a lot of help."

So I had a list of six course components for the environmental chemistry class; these are the six. I really just want to focus on the two in red: the clickers and the discussion board. What you can see is that the environmental chemistry students thought the discussion board was really effective. The percentages shown are the percent of students who chose either four or five, meaning they thought it was helpful or very helpful. And the clickers, not as helpful. Not a big surprise; we kind of saw that in the data.
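For anyone reproducing this kind of summary, those percentages are a "top-two-box" tally on the five-point scale. A minimal sketch, with hypothetical ratings:

```python
# Top-two-box summary of five-point SALG-style ratings: the percent of
# students who rated a component 4 ("helpful") or 5 ("very helpful").
# The ratings below are hypothetical.
def percent_helpful(ratings: list[int]) -> float:
    """Percent of Likert ratings (1-5) that are 4 or 5."""
    return 100 * sum(r >= 4 for r in ratings) / len(ratings)

discussion_board = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]
clickers = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3]

print(f"discussion board: {percent_helpful(discussion_board):.0f}%")  # 80%
print(f"clickers: {percent_helpful(clickers):.0f}%")                  # 20%
```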

The physical chemistry students got the same list of six topics. But they really liked the clickers, as we saw from the course evaluations, and they did not think the discussion board helped them much at all. Despite the fact that they were reading a lot, they weren't posting, and they thought, "Eh, it helped a little, but not very much."

Student comments: I'll kind of let you scan through them. "Thought the clickers were beneficial." "Discussion board great." "Discussion board really helpful." This one I really like: "Clickers helped give confidence in class." That's one of the things we hoped to see. Surely that's not the case for all students, but it is one of the hopes of using the clickers: that students who ordinarily wouldn't raise their hand and participate in class, as they start to click and get questions right, maybe start to get a little more confident. And I do see that to some degree, anecdotally.

So in summary, both the clickers and the discussion board increased student engagement. The clickers were primarily used as an assessment tool for student learning; they were not necessarily a good predictor of student grades, although there was somewhat of a correlation in the physical chemistry class.

The clickers were more helpful for student learning in the physical chemistry class, as opposed to the discussion board. However, the discussion board seemed to be more effective for the environmental chemistry students. I think that was in large part because of the part-time students: since most of those students are off-campus, the only interaction they had with other students was through the discussion board.

The undergraduates can generally just meet up with each other at any time. They don't need that technological tool to interact.

And that's it.

[clapping]

Dan: Any questions? Yeah.

Woman 1: Just by the names, I would assume that the physical chemists might be more introverted than the environmental chemists. Do you think that might be true?

Dan: I would expect that, maybe, for the environmental science component of the environmental chemistry class, and some of the environmental policy students in there. Not necessarily the case for the environmental engineering students or the chemical engineering students in that class. Some of the chemistry and chemical engineering students were actually in both classes.

So maybe, on an average, slightly more so. But because the environmental chemistry class is so mixed, I don't know that it's specifically...

Woman 1: So, not enough to account for the difference between the discussion boards, which I would think extroverts would like more, and the clickers, which I would think introverts would like more.

Dan: Yeah. I would say probably not a big enough population difference to have spawned that.

Man 2: Are you going to change the way you teach based on the data?

Dan: To some degree, yes. I would say one thing I can take from this is that if I really want to use the clickers in the environmental chemistry class, then I need to do a better job of it, because obviously the students don't feel they're getting much out of it.

Certainly, on a day-to-day basis, I do a lot of real-time adjusting, just as I did here when I was able to skip those slides. One of the things I do is, if I ask a question at the beginning to assess their prior knowledge, I will sometimes prepare some review slides I might need. So sometimes I'll ask a question where I'm hoping they have some prior knowledge before I discuss the next topic, and if they don't have it, I know I need to review that topic before I go on. Otherwise they're going to be lost and I'm just talking to myself for an hour.

Woman 2: Where I work, some people use these for attendance in large classes. And it actually made me start wondering. Your physical chemistry students were all undergrads, right?

Dan: Mm-hm.

Woman 2: Where there was a higher correlation between using the clickers and getting a good grade. Could it possibly have represented students who were actually in class more?

Dan: Oh, absolutely.

Woman 2: OK. I was wondering if attendance was required, or not necessarily.

Dan: Well, the requirement is implicit, not explicit. So they get a small participation grade for answering 75 percent of the questions.

Woman 2: OK. So in fact, it might represent students who are better prepared there.

Dan: Right. Although let me give you one caveat to that.

Woman 2: OK.

Dan: I also used these in my general chemistry class. We have a regular general chemistry sequence, and then we have a trailer sequence for students who have failed a course, had to take a prep course first, or transferred in; they're in there for a variety of reasons. In the regular general chemistry, we have multiple sections, so I can't require attendance, because the other lecturers have no way of recording it. So there's no attendance requirement at all for the lectures; students attend if they want to.

In the trailer course, there's just one lecture. So last year we made five percent of their course grade attendance, and you get those points if you answer 75 percent of the questions. You don't have to get them right; you just have to be there and buzz in. The thought was, oh, I'm going to get all these students in class, they're going to be engaged, and they're going to do much better. It didn't quite work out that way.

One of the problems I had was students who were in the class only to get their five points, and those students actually became somewhat of a disruptive influence, because they didn't want to be there. Ordinarily they wouldn't have been there, and I would have just had the students who were paying attention and wanted to be there. So I ended up bringing in some students who shouldn't really have been in the class, and it created a more difficult lecture environment.

There's also a concern about the information I get, which is really valuable: do the students understand the material? To some degree that's skewed if you have a bunch of students who just randomly buzz in every time they see a question come up and then go back to their conversation. I could think that most of the class doesn't understand what I've just explained, when actually the students who are paying attention do understand, and it was just the students randomly pressing buttons, the ones who just pressed number five every time, that skewed the results.

Woman 2: That sounds like a horrible story.

Dan: Well, it's one of the realities of it. So I guess one of the take-home messages is that clickers will not solve your problems. They are a tool, like anything else, and they can help you do some things better. But just having the clickers is not going to turn your classroom into a wonderful, happy place.

Woman 2: Right. But it just sounds like the students who don't want to come to class and might just come to class for the points, won't really bother to understand the questions and might not even bother to read the exercises. These don't sound like very successful students.

Dan: No, they're not. And you can't make them successful just because you've put a piece of technology in their hands.

Man 3: Have you thought of using the data you collect mid-semester? Just to see if there are specific students who are always getting questions wrong?

Dan: In my dream world, yes, I'd have enough time to go through the data, identify the students who are struggling, and call them into my office and say, listen, I noticed that you're struggling, you're getting all the clicker questions wrong... Right, I wish there was some way I could do that. But this term I'm teaching two of the general chemistry lecture sections, with close to 400 students in my class. For me to wade through all of those numbers... unfortunately, I just don't have the time.

Also, with a 10-week term, it gets very difficult to get that done. But my dream is to eventually get to the point where I can do that.
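The triage Dan describes is straightforward once per-question results are in hand; the bottleneck he names is time, not computation. A minimal sketch with hypothetical records (real data would come from the clicker software's per-session exports, whose format varies by vendor):

```python
# Flag students whose in-class clicker accuracy falls below a cutoff,
# aggregated across sessions. All records below are hypothetical.
from collections import defaultdict

# (student, answered_correctly) for each clicker question attempted
records = [
    ("alice", True), ("alice", True), ("alice", False),
    ("bob", False), ("bob", False), ("bob", True),
    ("carol", True), ("carol", True), ("carol", True),
]

totals = defaultdict(lambda: [0, 0])  # student -> [correct, attempted]
for student, correct in records:
    totals[student][0] += correct
    totals[student][1] += 1

CUTOFF = 0.5
for student, (right, tried) in sorted(totals.items()):
    if right / tried < CUTOFF:
        print(f"{student}: {right}/{tried} correct; consider following up")
```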

Man 3: Do you think that's a shortcoming of the clicker software? Because I've noticed that the TurningPoint software generates really impressive statistics for a single class, but it doesn't really seem to have any way to aggregate [inaudible].

Dan: There actually is a way to generate report summaries, where I think you can combine them. I have not done it; when I go to generate my reports I see those options, but I just haven't had a chance to play with them. I think that's built in, though, where you can get some of that aggregate information.

Man 3: The newest generation of their software creates much better reports. [inaudible] I guess you can just go to Turning and download it.

Dan: Yeah, you can always get the software for free. One of the reasons I went with Turning Technologies is their software: in addition to integrating right into PowerPoint, as we saw when those data and graphs showed right up, when I save this file, the results are saved in that file. And in addition, the software can use other companies' clickers. So I thought it was a very powerful software tool.

[applause]

Man 4: Do you want these back?

Dan: Yes, I do want those back. [laughs]

Man 4: I thought that you might.
