Last Thursday, I had a really helpful meeting with my dissertation advisors. I had been floundering a bit before the meeting, trying to wrap my head around this dissertation project and what it might look like. In particular, I wasn’t sure how I was going to build this corpus of blogs. I had worked for a while (longer than necessary, probably) writing some code that would take a txt file of blog URLs, scrape the text from the posts that appeared on the first page of each blog, and then save each blog’s text in a separate txt file. For someone like me who’s still relatively new to programming, writing this code was a feat! But I had concerns. I wasn’t super confident about my method for selecting the blogs. I had basically settled on using Blogger’s Next Blog feature, but I learned that in 2009, the feature stopped taking users to random blogs and instead used some sort of algorithm to return blogs on topics that would more likely be interesting to the user. Not what I needed for this study. I also contacted WordPress.com to see if there was a way that I could systematically gather a list of personal blogs, but the response made it clear that doing so wasn’t possible in the way I wanted.
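For the curious, a stripped-down sketch of that pipeline might look something like this. This isn’t my actual code; it uses only Python’s standard library, and naive paragraph-tag scraping stands in for real post detection, which would need to handle each blogging platform’s markup:

```python
# Rough sketch: read blog URLs from a txt file, fetch each blog's front
# page, pull out the visible post text, and save one txt file per blog.
from html.parser import HTMLParser
from pathlib import Path
from urllib.parse import urlparse
from urllib.request import urlopen


class PostTextExtractor(HTMLParser):
    """Collects text found inside <p> tags -- a crude stand-in for
    identifying the actual posts on a blog's first page."""

    def __init__(self):
        super().__init__()
        self._depth = 0  # how many <p> tags we are currently inside
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self._depth += 1

    def handle_endtag(self, tag):
        if tag == "p" and self._depth > 0:
            self._depth -= 1

    def handle_data(self, data):
        if self._depth > 0 and data.strip():
            self.chunks.append(data.strip())


def extract_post_text(html: str) -> str:
    """Return the paragraph text from one page of HTML."""
    parser = PostTextExtractor()
    parser.feed(html)
    return "\n\n".join(parser.chunks)


def scrape_blogs(url_file: str, out_dir: str = "corpus") -> None:
    """Scrape each URL listed (one per line) in url_file and write each
    blog's first-page text to its own txt file in out_dir."""
    Path(out_dir).mkdir(exist_ok=True)
    for url in Path(url_file).read_text().splitlines():
        url = url.strip()
        if not url:
            continue  # skip blank lines in the URL list
        html = urlopen(url).read().decode("utf-8", errors="replace")
        # Name each output file after the blog's domain.
        name = urlparse(url).netloc.replace(".", "_") + ".txt"
        Path(out_dir, name).write_text(extract_post_text(html))
```

Even a toy version like this hints at why the selection method mattered more than the scraper itself: the code can only be as representative as the list of URLs you feed it.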
So I came to the meeting not totally sure how to go about collecting my data. After talking with Jo and Bethany, we decided on an approach to corpus collection that more closely mirrors Bethany’s method–that is, working with a smaller, more controlled corpus rather than trying to gather a sample of texts that is as large as possible. We also discussed the idea that I can develop a survey (basically a language attitudes survey) and administer it to the authors of the blogs I’ll include in my corpus. I love this idea because, assuming people actually respond to my survey, it will mean that I will rely much less on assumptions about what influences these authors’ linguistic choices and more on actual data that they provide.
Throughout this semester, we’ve discussed a lot of different ways we can use computers in language learning and other applied-linguistics tasks. One of the technologies that’s come up in this discussion is automated writing evaluation (AWE) tools. Through my assistantship this semester with Elena Cotos, I’ve had the chance to read some of the literature on AWE systems, and I’ve been able to think about some of the limitations and affordances that AWE tools offer. In this post, I want to synthesize some of my thoughts to see if I can get them somewhat organized.
First, the strengths of AWE tools. Anyone who has taught a writing class knows that grading dozens of papers is time consuming at best and painful at worst. Having a computer take care of that grading—or at least a sizable chunk of it—could allow teachers to focus their time on other activities that could benefit their students more than pointing out grammatical error after grammatical error. It could allow a teacher to spend more time giving more holistic assessment or planning lessons that address specific issues a certain class struggles with. From a writer’s perspective, I can see some advantages to these tools as well. It can be really valuable to get immediate feedback on something I’m writing, and AWE tools provide this immediate feedback.
However, despite these strengths, the weaknesses of AWE tools are still, in my opinion, pretty significant. As a writing teacher, I try to help my students understand an assignment, identify who it is they’re writing for, and write in a way that will be clear and convincing to that audience. If I know that a machine is going to evaluate these students’ writing assignments, I’m almost forced to teach them to write for this machine and not to an audience who might actually be interested in what they have to say. The rhetorician in me cringes at that thought. Writing is inherently communicative. We write to communicate ideas from one human to another, not just to get a grade or to satisfy the requirements encoded in some machine algorithm.
I’ve mentioned that I can see the value in getting immediate feedback, but that value only comes after the evaluation tool has earned a writer’s trust. We’ve all experienced MS Word giving us bad advice, telling us to avoid a passive or that a fragment is problematic when in fact it’s not. It takes skill to interpret a writing tool’s guidance—to know when to listen and when to ignore—and I’m not sure that most students would be able to do that very skillfully. With practice, they might, but I think the risks are pretty heavy.
I don’t think anyone is advocating for AWE tools to fully overtake human-evaluated writing. And that’s a good thing. But in my view, we’ve still got a long way to go before I’ll be comfortable trusting an AWE tool to evaluate my or my students’ writing.
I’ve been designing books for a few years–ever since I took an intro to InDesign course as an undergraduate student. I was surprised how much I enjoyed that kind of work. I still look for opportunities to put my book-making skills to the test whenever I can. So being assigned to create an iBook for our final project in our 510 class was something I was really looking forward to.
I’ve never used iBooks before now, but I’ve found it to be really user-friendly. Though, I should qualify that a bit: coming from a program like InDesign makes some of the limitations of iBooks more pronounced. For instance, I’m surprised that I can’t include a guideline on a master page in iBooks and have it show up on the working pages. In our group’s iBook, we want to be sure that all the content appears on the same part of the screen, and the only way I could find to do that effectively was to place a brightly colored square on the master page to mark where content should go, with the plan to delete the square once we’ve got all our content placed.
Even though iBooks has its limitations, I’ve been impressed with how easy it is to use. We’ve been able to include a lot of functionality through its built-in widgets (like the assessment widgets) and through third-party widgets as well. Using Bookry’s library of widgets has been really easy, and it’s given us the option to do things in our iBook that we wouldn’t have been able to do otherwise. For example, including areas where students can insert their own text is something that, as far as I can tell, is only an option in Bookry’s Form Builder widget.
We still have a bunch of work to do on the iBook, but it’s been a good experience so far. It’s been great to work with a group with so much expertise in language-learning pedagogy and design. I think we’ll end up with a book we’ll all be proud to show off.
This week, we investigated a few social media sites for language teaching and learning. In this post, I will quickly share one of the most important things I learned while doing this investigation: Give the user valuable content where they are now—directly on the social platform. Don’t make users click unnecessarily to find the content you’re trying to deliver.
This played out in the two accounts we investigated on Twitter. One (@slowgerman) did a fine job of providing useful content on Twitter, while the other (@radiolingua) did not. Admittedly, providing useful content on Twitter can be a challenge because the number of characters per post (tweet) is so limited. The strategy of @radiolingua seems to be to provide content and products for sale on a website while using its Twitter account to just link to that content. This is OK, but it’s not nearly as effective, in my opinion, as finding a way to share useful teaching and learning content with users directly on the platform where they already are.
The Twitter account for @slowgerman seemed to get around this problem nicely. This organization uses its Twitter account mostly for providing information that helps users learn German. Occasionally, they’ll tweet a link to some resource or information on their main website, but their followers are primarily getting tweets that teach German vocabulary directly.
So that’s the main takeaway: Do everything you can to provide useful content directly on the social media site your audience is already using. Don’t force them to navigate away from it to find the content they want.
The most recent assignment we completed for our English 510 class was a multimedia tutorial. Our task was to select an online language-learning tool, investigate its features, and create a narrated tutorial that briefly describes the tool and how it can be used in a language learning context.
I chose Quizlet for my tutorial. While I was familiar with the concept of digital flashcards, I had never used them before. So I was glad to have the chance to get familiar with Quizlet and think about how I might use it myself.
Here’s how my tutorial ended up:
I created my tutorial using Camtasia. This was my first time using Camtasia, and I was impressed with it overall. I found it easy to learn and feature-rich. When I’ve needed to create screencasts before, I’ve used QuickTime. QuickTime has the bare minimum in terms of capability, so it’s incredibly easy to use, but it doesn’t offer a lot of tools to help in postproduction. Not so with Camtasia. It was fun to play with different zooms, transitions, and call-outs as I edited my screencast footage, and I think it definitely made the final product seem more polished and user-friendly. Doing all that postproduction was time consuming, though, so I imagine if I were in a situation where I had to, say, provide video feedback for all students in a writing class, I’d likely stick with QuickTime and not worry about doing any postproduction.
As of now, my video has five whole views! It’s likely that all of those views are mine and Jim’s, but hopefully a few more people might stumble on it and find something in it that’s useful.
It’s been exactly a decade since the last foreign language class I took. I was a student on a study abroad in Berlin taking German courses at the Goethe Institut on Neue Schönhauser Strasse. My time in Berlin was unique. I ended up getting sick, being hospitalized, and missing most of my coursework. I had to make up one of my classes and withdraw from another. But I still got to live in a different place where people spoke a language different from my own. And that was a very valuable experience for me.
Because of my lack of class time—and because my time as a foreign language student was so long ago—I haven’t been exposed to many technologies foreign-language teachers use these days. When I was in my language courses, we were still largely relying on printed pieces of paper with cartoon dialogues. Sometimes we’d have to go to a computer lab and use one of the old colorful iMacs to work through a multimedia module. I’m sure these would seem pretty primitive by today’s standards. Most of the time, we’d just practice speaking to each other and to our teacher.
When I was in Berlin, I tried to use my German to get around, but I never felt confident in it. I wonder whether things would be any different if I were to study German now. I wonder whether using some of the technologies we’ve discussed in class would help me feel like I could communicate better or give me more authentic practice. I have to think it would.