Thursday, March 19, 2015

In Search of the Hypertext Gloss

Introduction
As a language teacher working in a fairly fast-paced prep school, I often found myself searching for tech tools in the interstitial moments between classes, after finishing a big stack of grading, or over my morning coffee before I had decided what I was going to do with my lesson plans for the day. In other words, my initial search wasn’t exactly thorough. During that time, I found and implemented a number of tech tools to great effect – more than one student remarked to me, unsolicited, that I was the teacher at the school who most effectively used tech. But of all the heavy lifting that I was able to delegate to technology, one type of task dogged me: creating hypertext glosses. I had read about the efficacy of hypertext glosses in a growing body of literature (covered below), but none of the software or web tools available seemed to get it right. My Microsoft Office suite was useless, the tech mentioned in the journal articles had remained on the university computers it was tested on, and even when I could get permission to use some of these experimental programs, they didn’t work correctly. I re-entered graduate school believing that someone was going to make a lot of money by developing a working hypertext gloss tool and sweeping the market.
Now that I’ve had time to dig deeper into this issue, broaden my perspective on Tech and SLA, and change a few of my search keywords, I realize that “sweeping the market” probably isn’t as easy as it once seemed to me. A number of tools do exist, and many of them are publicly available. To fill my knowledge gap, I tested ten good candidates for hypertext glossing, spending about 45 minutes to an hour with each. Using my own feature wishlist as a guide, I tried to get these tools to perform tasks that I knew I would want them to do in my own classroom, creating teacher and student accounts with different email addresses. I also solicited feedback from my classmates using a short open-ended survey about what they would like to see in hypertext glosses and added the novel responses to my feature wishlist. The end result was a chart of ten tools tested against eighteen important criteria. Where all the tools had the same feature or feature deficit, I removed the criterion from the list. My ultimate goal is to help teachers make an informed decision about which tool to use from among what is practically available to them, not to gripe about tech that doesn’t exist yet.
With this preliminary work complete, I have prepared a qualitative description of each of the ten tools I tested. The learning task for which I want to prepare is the cultivation of an active L2 reading group: about a week of preliminary setup and familiarization, followed by a full quarter of active L2 reading using hypertext glosses. The students I have in mind are intermediate to advanced L2 students (B2-C1) in either secondary school or higher education. The examination of existing tools is crucial not because it will allow us to choose the “best” tool – though some tools clearly outperform others on similar measures – but because our choice of gloss will have an impact on the kind of community of readers that we create.
Research on L2 reading in the field of computer-assisted language learning provides us with some important goals and signposts. According to Chun and Grace (1998), we should provide L2 glosses for intermediate and advanced learners. While students tend to prefer purely textual glosses, combined textual and visual glosses improve comprehension and recall (Lomicka, 1998), so we should favor tools that facilitate the use of multimedia. Including traditional activities alongside reading helps to activate prior knowledge and improve reading comprehension, which suggests that tools that include activities such as reading questions are also desirable.
The criteria against which I tested the tools can be grouped into the general categories of ease of use, accessibility, and gloss capabilities. You can download the full-sized version of the comparison chart here.



Tools

Ponder
The first tool on my list was Ponder, a social reading platform with a lot to offer teachers looking for quick feedback. It’s clear that this product was created with classroom use in mind, though probably not SLA. What Ponder lacks in depth it gains in speed and integration. Using the Ponder browser plug-in, you can annotate any text on the web and ask students to categorize parts of the text into predefined groups. Creating annotations is as simple as highlighting the text and clicking the purple “P” that pops up next to the highlighted portion. From there, you’ll be prompted to click a quick response (these are available in English, Gaelic, Arabic, and Spanish), type in a custom comment, or categorize the highlighted portion as an instance of a predefined group. For instance, I created a category called “Verbs” that prompts the student to categorize highlighted portions as instances of any number of tenses, modes, and aspects (e.g. “je marchais” – past, indicative, imperfect).
Ideally, I’d like to be able to turn off the reaction buttons or at least edit them into a form I like better – I’d like my students to get more out of the text than “Awesome!” and “Really?”. On the technical side, the minimalist settings menu might keep students from fiddling too much with the plug-in, but it also keeps me from figuring out why it isn’t working. Sometimes I highlight text and the “P” fails to appear, meaning that the glossing ability is suddenly just... gone. Ultimately, by handing more control over button text to instructors, this tool could become even more powerful.

Hypothes.is
Originally designed for purposes other than classroom use, Hypothes.is offers us the much-needed capability to mark up any webpage on the internet and engage in discussions with others who have also downloaded the Hypothes.is browser plug-in. During my initial investigations, I discovered that the Hypothes.is community seems primarily interested in science and politics rather than in glossing L2 texts. Aside from the superficial question of what everyone on Hypothes.is is talking about, there is the very real issue of who can see what my students are talking about – which, in the case of Hypothes.is, is everyone on the internet. To underscore this fact, Hypothes.is even placed a little warning underneath the comment button which reads “Annotations can be freely used by anyone for any purpose.” As someone who works with minors and frequently asks them to write their intimate thoughts on some of the most controversial issues of the day, the possibility of one of my students being quoted out of context makes me more than a little uneasy.
Beyond this concern for “reasonable” efforts to protect the privacy of my students, I find Hypothes.is very straightforward. Without the bells and whistles of some of the other products I tested, Hypothes.is has a fairly gentle learning curve: install the plug-in, highlight a word, and click the little pen that pops up. Once a feed has been started, it’s a snap to respond to someone else looking at the same page, making discussion the strong suit of Hypothes.is. For an advanced, discussion-based SLA class of students with their own laptops, Hypothes.is could be the way to go, especially if you aren’t very interested in keeping track of whether everyone reads or contributes.

eComma
In a word, tools like eComma break my heart. Here we have a wonderful and innovative gloss tool with almost everything most SLA teachers could want, and yet you’ll probably have to bake a cake for the entire IT department at your school or university to get them to help you set it up. eComma relies on the Drupal server system and on local tech support personnel who 1) know how to use it, and 2) can devote time to setting up and maintaining yet another server module specifically for you.
As if this weren’t enough, eComma is also a little behind the curve in terms of desirable features for SLA – features which have been shown to aid comprehension – since it lacks multimedia support. Besides Tiara, eComma is the only other tool I tested that was specifically designed for L2 learning, which is a shame considering its aloofness.

Kindle Notes
In an effort to be inclusive of what is perhaps the most widely available digital annotation tool in the world, I have included Kindle Notes on my list. Initially, there was quite a bit of excitement about Amazon’s decision to move Kindle annotations from the private sphere to their public website. Bloggers raved about the possibility of, say, reading George Bush’s memoirs annotated by Donald Rumsfeld. The actual implementation of the Public Notes feature was so under-publicized and sloppy, however, that today very few people know that they can publish notes for the whole world to see. Thus, much like Hypothes.is, Kindle Notes is an all-or-nothing annotation solution that doesn’t leave us much room to negotiate the privacy of students.
Unhappily for Amazon, the problems with using Public Notes in the classroom don’t stop there. As Ruth Franklin points out, “In order to view Public Notes, according to Amazon’s FAQ, you have to be using the latest Kindle model, running the most recent version of the software.” The tool itself remains clunky and resistant to improvement, even when Amazon doesn’t have to make the improvements itself. When another service, Findings, created a plug-in that turned off the conservative sharing settings on the Kindle to allow users to quickly make all their annotations public, Amazon promptly had them served with a don’t-make-us-sue-you cease-and-desist. No doubt frightened by the rumblings of the litigious giant, Findings ended up discontinuing its service and moved on to a new product, Instapaper, which has no Amazon sync. Ultimately, Kindle Notes fails as a classroom tool, but probably also as a tool in general. I predict that it won’t be around for very much longer.

Tiara
Like eComma, Tiara is the fruit of academic rather than private-sector edtech labor, and it is actually designed for L2 annotations. Unlike eComma, Tiara works in your browser with no server setup required. Tiara is also unique on my list in being totally teacher-centered: students cannot gloss texts or comment on them. While many of Tiara’s features, such as its multimedia support, put it above some of the other offerings I tested, the website in which it is embedded is, to put it politely… a time capsule. The first time I tried to gloss a text with Tiara, I quit halfway through out of frustration with the clunky interface. With a little love, and probably a lot more grant money, Tiara has the potential to become a usable classroom tool. In its present condition, however, it will positively frighten your millennial students.

Genius.com
Perhaps the tool on this list with the most unexpected origins, Genius.com caught me off guard with its ease of use and the obvious dedication of its user base, a force not to be underestimated in the world of edtech. Genius.com began as Rapgenius.com, a forum for rap fans to annotate their favorite lyrics. Quickly, the influx of contributors with first-hand knowledge of the meaning of the lyrics – including, in some cases, the artists themselves – created the need for a system that granted greater privileges to those with savoir-faire. Rapgenius answered with a comment-karma and upvote system not unlike Reddit’s, and soon the annotation system had evolved into something that teachers wanted to use: Genius.com was born. While still in the early phases of its development as an educational tool, Genius.com gets a lot of basic annotation features exactly right. And who knows – telling its origin story might even be enough to draw in a few students.

Subtext
An iPad-only solution, Subtext certainly knows its market. After many months of offering free service, the company has announced that it will be moving from freemium to “premium only” at the cost of $3 a student. If your school happens to have iPads and wants to increase the cost of instruction for some reason, perhaps Subtext is the right tool. Subtext looks very nice on the iPad, and provides an attractive reading experience that many of the browser-only alternatives do not. Perhaps the most interesting feature was the ability to browse websites in the app and then convert them automatically into a distraction-free book format. For all its expensive features, however, Subtext also misses the mark on embedded media and ease of use.

eMargin
As if seeking to prove the adage that “less is more,” eMargin is designed to please those students who want essential information – by which students typically mean just text – quickly and without any hiccups. Compared to the browser plug-ins Ponder and Hypothes.is, eMargin manages to hold together a more stable interface without requiring you to download anything onto your computer. Just as slower page-load times on e-commerce sites have been directly correlated with lost sales, I suspect that student use of a web-based hypertext gloss suffers when repetitive operations load slowly. If you think your students will agree, eMargin might convince you to trade multimedia capacity for stability and speed.

Google Docs
For the same reason that I addressed Kindle Notes, I think it’s important to talk about the feasibility of using Google Docs for hypertext glosses. Students and faculty are generally already familiar with the format, and the comment system allows for extensive, albeit crowded, annotations.
Many of what I believed to be limitations of Google Docs turned out, upon closer examination, to be non-existent. To start, there is indeed a way to add voice recordings to a Google Doc. Likewise, if you need a downloaded record of annotations, you can download a Google Doc as a Word document with the comments preserved in the markup layer; from there, you can print or save the document. Alternatively, you can download a Google Doc as an HTML file to get the gloss as footnotes. In both cases, however, the comments come out anonymized.
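For teachers comfortable with a little scripting, the same export workflow can be automated. Here is a minimal sketch using the Google Drive API (v3) via google-api-python-client; the document ID, the credentials file, and the output filenames are placeholders you would supply yourself, and this is only an illustration of the export calls, not a polished tool.

```python
# Hypothetical sketch: exporting a glossed Google Doc in two formats.
# DOC_ID and credentials.json are placeholders, not real values.
from googleapiclient.discovery import build
from google.oauth2.service_account import Credentials

SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]
DOC_ID = "your-google-doc-id-here"  # placeholder

creds = Credentials.from_service_account_file("credentials.json", scopes=SCOPES)
drive = build("drive", "v3", credentials=creds)

# Export as .docx: the comments travel with the file in the markup layer.
docx_bytes = drive.files().export(
    fileId=DOC_ID,
    mimeType="application/vnd.openxmlformats-officedocument.wordprocessingml.document",
).execute()
with open("glossed_reading.docx", "wb") as f:
    f.write(docx_bytes)

# Export as HTML: the glosses come out as footnotes at the end of the page.
html_bytes = drive.files().export(fileId=DOC_ID, mimeType="text/html").execute()
with open("glossed_reading.html", "wb") as f:
    f.write(html_bytes)
```

Either output gives you a printable or archivable record of the gloss, with the caveat noted above that the comments are anonymized along the way.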
The main drawbacks of Google Docs were not things I could find a concise way to note in my feature wishlist: first, when multimedia content is embedded in the document, it shifts the text around it, and second, it is possible for students to “resolve” and thus dismiss each other’s comments, whether on purpose or by accident. While the latter issue can be remedied by combing through the document revision history, Google Docs is unique in that it is the only tool I studied which could potentially allow one participant in a discussion to silence another.

Curriculet
The strength of Curriculet is its built-in reader accountability mechanisms. Put another way, Curriculet operates on a paradigm that is entirely different from the other tools on this list, in that it allows teachers to ask direct questions and get responses from students. Student annotations are always private, however, so teachers looking to get an idea of the thought processes of their students will have to do it the old-fashioned way – by asking. In keeping with what I believe to be sound pedagogy, I personally prefer Curriculet’s feedback-centered model to the more inert annotation model of most of the tools on this list. Whereas with many of the other gloss tools listed here I would simply assign a holistic grade to students who “sufficiently” glossed a text – and it would still be unclear what to do with the student who glossed last and had nothing left to say – with Curriculet I can make sure that I get a built-in homework assignment out of the reading. Further, Curriculet’s data analytics mean that I can transfer this information directly into grades or simply into a diagnostic to guide my lessons. While Curriculet is not as stable or as fast as eMargin, nor as easy to use as Google Docs, it is the clear winner among hypertext gloss tools, offering the vast majority of features that I want as a teacher.

Conclusion
Given the variety and complexity of some of these tools, a reasonable question remains: is this worth it? Do these tools really outperform paper-and-pencil approaches to annotation? Why shouldn’t I simply gloss a text in Microsoft Word using the comments utility and then print the marginal notes? The much-maligned office copier has its faults, but it has never deleted student work or forced me to call IT in the middle of an activity on Jules Verne. I believe that a quick look back at the tools I’ve just described gives us a direct answer: these tools are worth it not just because they still have that brand-new smell – paper straight from a copier smells pretty good too, if you ask some people. They are worth the trouble it takes to learn them because hypertext glosses are an evidence-based improvement to L2 reading. Let’s review how they accomplish this: first, they provide more and different layers of information than a paper gloss; second, they can be hidden in order to minimize distractions; and third, they can be easily edited by the instructor to reflect revisions or to adapt to the needs and proclivities of different groups of students. While some of the tools we looked at grant us more affordances than others, these three are shared by all hypertext glosses.
I hope that this final project is of practical help to instructors seeking to create a community of L2 readers, as well as of theoretical interest to those intrigued by the purposes and affordances of this exciting – albeit belated – technology.


Saturday, February 14, 2015

Teaching creative writing with Elegy for a Dead World

In my last post, I made reference to a number of games I consider pedagogical in nature, including Never Alone and Mass Effect. Today, I stumbled across my new favorite example of a pedagogical game: Elegy for a Dead World. The goal of Elegy is to serve as a scaffold for creative writing. Check out the trailer below to get an idea of this rich and beautiful concept piece.


Now, what are some of your favorite pedagogical games?

Friday, February 13, 2015

Gamifying the curriculum

The intersections between gaming and school as usual are a subject of continual fascination for me. In class this week, we talked about two possible intersections: bringing games into the classroom (my presentation) and using virtual worlds to bring the classroom into a game environment (Gabe's presentation). There is a third alternative as well: bringing the game paradigm into the classroom and allowing it to restructure the academic experience. 

Both parts of the preceding description are important to understanding what I mean. The first part, "bringing the game paradigm into the classroom," refers to the millennial practice of allowing cultural practices from the world of gaming to "leak" out into the non-virtual world. The second part implies that we are not simply making reference to these cultural practices but also enacting them, letting our conventional ideas of what learning "looks like" be deeply influenced by all that we have learned from the gamer community of practice.

To get a big-picture view, I've drawn a simple Venn diagram:

It is possible to conceive of virtual classrooms without game elements, video games without pedagogical elements, and gamified classrooms without virtual elements. Further, we can conceive of gamified virtual classrooms that combine all three elements. A project like Edorble (beta signup here), which is currently in the virtual classroom stage - there aren't any gamified elements in it yet - could move into the center of the diagram by adding some elements of the game paradigm.

Naturally, I would be remiss in my academic duties if I didn't complicate this picture a little bit. Because of the strong influence of gamer culture on virtual worlds, there should be quite a bit of purple bleeding into the red circle on the top. Most users of virtual worlds will expect a few of the affordances of video games: a menu may appear when I press "esc", a graphical overlay may give me information about the world, and some of the avatars with whom I interact may be non-player characters (NPCs). Likewise, video games often engage in pedagogy. Never Alone teaches us about Inupiak culture, Mass Effect is a veritable primer in ethics and sociology, and Bioshock Infinite is a parable about the confluence of religion, nationalism, and racism in American history. Such high-concept video games, while increasingly common, remain the exception rather than the rule. Finally, while gamified classrooms may offer alternatives to the traditional classroom paradigm, they are still structurally embedded in a system of social promotion, report cards, and killjoy bureaucracies.

In my own experience, gamification of the classroom can produce both positive and negative results, depending on a number of factors, including the rules of the game and the group with whom you play. I have seen team competitions propel performance, but I've also seen them degenerate into petty arguments. Posting public scoreboards seems to motivate everyone at first, before demotivating those who continually find themselves at the bottom of the list. For some tasks, such as rewarding speaking in a foreign-language class through ClassDojo, I found gamification almost universally helpful. More extensive attempts at gamification seem to necessitate a high level of teacher autonomy and organization in order for the class to really play. For example, take a look at this rule sheet:

More character skill trees can be found here.

While I applaud the design of this rule-sheet, I'd love to talk to the professor who designed it about how s/he keeps track of everyone's progression in the game and their accompanying position on the skill-tree (read: is this what I get to do after tenure?).

If gamifying your curriculum, even in some limited way, interests you, then I encourage you to check out this article from last year's Education Weekly, which lays out eight principles for successful gamification. I like this article because of its emphasis on the fact that on a conceptual level, gamification is just good pedagogy.

Tuesday, February 3, 2015

Coding as an L2

When people ask, as Carlye did this week, whether we should view coding as a second language, I tend to answer with a resounding yes. As a foreign language teacher, I always want to position myself as a pluralist on this issue, mostly because of guys like this who think we shouldn't even bother to teach languages. Economists like this aren't going anywhere either. A little learning is a dangerous thing, indeed. But, as ridiculous as these folks may sound, they have encouraged me to define what I think the role of languages should be in the curriculum. Having taught in a school that allowed students to take two elective language courses every semester and seen a number of formerly monolingual students graduate trilingual as a result, I know what's possible. So, how could we reorganize the curriculum to accommodate this fundamental similarity between constructed and natural languages while also responding to the urgent demand for coding skills?

Short answer: I would institute a training program to create positions for linguists in high schools and then train foreign language teachers in linguistics. The default course in which students would be enrolled would be Linguistics, not Spanish, French, German, etc. Within the linguistics track, students could choose a language (or languages) to study as a “specialization.” Constructed languages, such as Python and Esperanto, would be taught alongside natural languages, and all languages studied would be taught using the student’s native language/dialect (where possible; obviously I don’t speak Tamil or Tagalog) as a basis for comparison. Prescriptivist “grammar” would no longer be taught, and “English” classes would be reserved for ESL. “Literature” courses would teach presentational academic writing and would be language-neutral, i.e. native Spanish speakers would become proficient in academic reading/writing in Spanish as well.

If you think about it, this makes sense. Language, like math or biology, is a fundamental category of phenomena. Our K-12 curriculum should explore it in all its depth and at least some of its breadth. 

So, what do you think of this audacious plan? Can you improve it?

Sunday, February 1, 2015

Invisible culture

Reconsidering Lara Ducate and Nike Arnold's "Technology, CALL, and the Net Generation: Where Are We Headed from Here?" (2011) following our discussion about transcultural literacies with Grace Kim, I am struck by the extent to which we are dealing with more cultures than we typically acknowledge when discussing C2 learning. In fact, I think there are always at least three, and probably four, cultures in play: the C1, the C2, the IC1 (internet culture 1), and perhaps the IC2 (internet culture 2). As Ducate and Arnold point out, the culture of educational technology and its intended users is a "foundational but invisible culture" with its own practices, norms, and values. Assuming that internet-mediated communication is an interaction between C1 and C2 can erase the IC1 and the IC2 with which participants also interact.

Although it was not an explicit goal of her talk, one of Grace Kim's activities for our class demonstrated this point clearly. Showing screenshots of profile avatars and their accompanying captions, tags, and flair from her research on online discussion forums about Korean dramas, Kim asked us to say what we noticed about these online "artifacts." Quickly, it became apparent that what we were exploring was not just a C2 - in fact, Kim reported that there were probably no Koreans on the site - but a highly specific internet subculture operating in the brackish waters of Asian and Western contact, all for the glory of fandom. Hence, there are C's and IC's. In our Cultura-style exchange with our counterparts at the University of Kraków, two cultures are interacting indirectly, mediated by two internet cultures.

One of the key aspects of any discussion of C's and IC's will be the recognition that there is no such thing as a unified "internet culture." There are certainly mainstreams, and norms from internet subcultures sometimes trickle into them in surprising ways, but in general, IRL cultures tend to create their own ICs rather than borrowing whole cloth from somewhere else. Where there is not yet a strong internet culture (in Papua New Guinea, only 2.3 percent of the population has net access, for example), new users may tend to assimilate to the internet culture most commonly reflected by the language in which they navigate. But, in time, I imagine that PNG internet culture will acquire a distinctive identity as well, probably incubated in spaces devoted to discussions in Tok Pisin. I leave open the possibility of a C1 and a C2 mediated by common practices from an IC1 because of the increasing cosmopolitanism of the internet, which is capable of hosting semi-permanent intercultural communities as well as any international city can. Large communities like Reddit and Tumblr, which do not have separate domains for different geographical locations, will be key to the construction of these cosmopolitan spaces. I hope to expand further on this idea of L's, C's, and IC's during the coming weeks of our Tech & SLA course.



Monday, January 26, 2015

Re: Don't think, but look! - Constructing New C2 Knowledge

After participating in a Cultura-style online intercultural exchange like the one used in Chun and Wade's study, I feel that I've gained a little insight into one of the psychological factors that may have led to the students' tendency to miss out on important cultural differences. Reading the study, I first assumed that students were a) inattentive to the initial data presented to them, b) so nervous about talking to strangers from another culture that their higher reasoning functions were inhibited, or c) some combination thereof. While these factors may have played a role for some of the students, my own experience has me wondering about another possibility: that taking stock of one's own culture as a totalized object of inquiry is just hard, perhaps so hard that it inhibits cross-cultural comparison for many people.

In my group discussion, we spoke about Millennials and why they have such a negative reputation. It's a fun topic, and I love talking to my friends about it. However, when the Polish students with whom we were corresponding began to contribute an unfamiliar set of facts and assumptions to the discussion, a great many of the "facts" about my own culture started to seem more contingent and open to question. Like Captain Kirk monologuing haltingly at a hyper-advanced alien race with no understanding of how the existences of such tiny people with such short life-spans could possibly be meaningful, I struggled to formulate what exactly was so important about and to us. Distracted by the enormity of the task, I entirely failed to remember that one of the other members of the discussion had proposed that we discuss something less serious and that we had agreed to change the subject. As soon as someone else brought up Millennials again, I jumped right back into the topic we had all agreed to stop talking about. In other words, I committed just the kind of error that makes diplomacy such a headache - I correctly identified a social cue, then proceeded to ignore it at the first available opportunity.

My experience has left me wondering: could this be what happened to the students in the study? Could their inattention to detail have been the result of the cognitive overload caused by trying to consider their own cultures as an object of inquiry from an unfamiliar perspective? Or are we dealing with something much more mundane here, as I originally thought?

Thursday, January 22, 2015

Mobile apps my students actually use for SLA

Duolingo


Thanks to a lot of hard work and millionaire-inventor backing, Duolingo has graduated into a league of its own among language apps. There is little I could say about Duolingo that hasn't been said elsewhere. However, as I explained last week, Duolingo is in the process of rolling out some truly fantastic teacher tools. Just this morning, I got an email telling me how much XP each of my "enrolled" students has earned this week along with their total XP. It's the little features, though, like the ability to download several lessons in advance while on wifi, that make Duolingo so handy.

Flashcard apps - Brainscape, Memrise, Babbel, Quizlet


This might not be hip with the applied linguistics crowd, but I like flashcards. With the large number of features you'll find in these successful flashcard apps, I don't doubt that you'll quickly decide on one that works for you. Brainscape is my go-to for self-directed learners and my own learning, while Quizlet is what I use for my classes. Brainscape can get pricey if you go in for pre-made flashcard decks, but the content curation is excellent, so I have shelled out for a few, such as the Spanish Sentence Builder. On the other hand, Quizlet's web interface is fantastic for adding custom decks and sharing them, which some of my students had a hard time doing on Brainscape. For my classes, I make a Quizlet deck before I hand out new reading, and then provide students with a link to a sample randomized vocab quiz similar to the one they'll have to take in class. They can diagnose their level of familiarity with the vocab and get personalized feedback before they take the real thing, which I found is a big deal for high school freshmen and sophomores. In French 4, I tested 100 new vocabulary items a week for several weeks in a row without any major hiccups or mutiny. The other two apps, Memrise and Babbel, are good alternatives for people looking for a new way into flashcard study, and they include a variety of question types. Memrise tends to have better free vocabulary lists than Quizlet, although my students warn me that there are some errors, whereas Babbel might easily be described as Duolingo Jr. All four of the apps mentioned here have some degree of audio and photo integration and large content-producing user bases, so you really can't go wrong!

Dictionaries - WordRef, Larousse French-English, Antidote, WordMagic, WordMagic Slang


For online app dictionaries, there is no substitute for WordRef, primarily due to the site's active forums, intercultural spheres unlike most others on the web. For speed, I also use a variety of offline app dictionaries. While Larousse ($5) has won me over for French-English, Antidote is a model of what a unilingual dictionary app can be. Every time I open Antidote, I feel as though I'm taking a deep dive into the French language, through etymology timelines, usage examples, collocations, and even a stylistics guide. In my vain search for an equivalent Spanish dictionary app, I discovered WordMagic, which, unlike Larousse, allows you to type either English or Spanish into the search bar and displays all results with corresponding country flags, cutting down on search time by a full second. The "jump" button also brings you to the next part of speech within the entry, for example, from "care; verb" to "care; noun." WordMagic Slang is a delightful parallel app that functions the same way with a separate, very up-to-date lexicon, the likes of which I have yet to find for French.

News Apps - BBC Mundo, Le Point, News in Slow Spanish - Latino


While teaching AP French, my favorite zero-prep assignment was to ask students to journal about a news article discovered in the French media. Because mobile articles (such as those found on BBC Mundo and Le Point) are shorter and designed to be read quickly, they make ideal homework for language learners. But, if you have a medium-sized budget, you can take the study of the news one step further thanks to the News in Slow French and News in Slow Spanish (European or Latino available) weekly podcasts. The minimum subscription ($35/six months) gets you 20 minutes a week of high-quality audio about current events along with a vocabulary sheet that you can print out for your students or turn into a vocab quiz. My typical weekend assignment with News In Slow French asked students to 1) listen to the podcast, 2) choose a story that interested them, 3) write a summary of the story that answers the questions of who-what-where-when-why. After a few weeks, I saw a huge improvement in listening comprehension skills, and there was always something to talk about on Monday.


Monday, January 19, 2015

Duolingo Dashboard for Teachers

Last week, Luis von Ahn, creator of Duolingo, announced via the Duolingo subreddit that a new dashboard system had been launched in order to facilitate classroom applications of the free program. As a language teacher, I encouraged my students to sign up for Duolingo and rewarded them with class credit for their progress (the conversion from XP to "extra credit" in the grade book was a little hairy, but I muddled through). Needless to say, the news that the already excellent gamified learning platform is about to get even better for teachers is truly exciting.
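For the curious, here is roughly the kind of conversion I ended up doing by hand, written out as a minimal sketch. The weekly cap and the points-per-XP rate are invented for illustration, and the student names and totals are placeholders, not real data.

```python
# A minimal sketch of an XP-to-extra-credit conversion (hypothetical numbers).
WEEKLY_XP_CAP = 200     # assumed cap: XP beyond this earns no further credit
POINTS_PER_XP = 0.01    # assumed rate: 100 XP -> 1 extra-credit point

def extra_credit(weekly_xp: int) -> float:
    """Convert one student's weekly Duolingo XP into grade-book extra credit."""
    return round(min(weekly_xp, WEEKLY_XP_CAP) * POINTS_PER_XP, 2)

# Placeholder weekly totals, as reported on the teacher dashboard.
weekly_totals = {"Student A": 340, "Student B": 120, "Student C": 0}
for student, xp in weekly_totals.items():
    print(f"{student}: +{extra_credit(xp)} extra-credit points")
```

A cap of some kind is worth keeping: without it, a single binge weekend can swamp the rest of the grade book.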

Looking around at the new dashboard, I found that we're still looking at bare bones: at present, all I can see are my students' Duolingo levels and XP totals in table form. Previously, I took care of this by "following" my students on Duolingo as peers. In his Reddit PSA, von Ahn announced that:
In the near future, The Dashboard will help teachers understand each student’s learning needs at a level of detail previously impossible. By tracking patterns across incorrect answers and moments of hesitation, Duolingo can provide insight into each student’s areas of difficulty and provide immediate feedback in order to maximize in-class productivity. The goal is to provide a personalized learning experience to each student and free up teachers’ time to concentrate on difficult concepts, answer questions, and assist students falling behind.
While these state-of-the-art data collection features aren't part of the dashboard yet, one can readily see where all this is going: platforms like Duolingo will add a new layer of augmented reality for teachers. Von Ahn's latest statement makes it clear that he does not see an inherent conflict between autodidaxy and classroom learning, evoking the dearth of qualified language teachers in many regions and the impossibility of offering significant individual attention to every student as class sizes grow. In von Ahn's refreshingly pragmatic vision of pedagogy, computers enhance teaching rather than replace teachers. With any luck, the teacher dashboard model will help quell any remaining doubts about the benefits of turning at least some repetitive teaching tasks over to well-designed virtual platforms. As one can see from the teacher discussion board that comes as part of the dashboard, those important discussions are already getting underway.


Don't think, but look! - Constructing New C2 Knowledge

It is natural to assume that when presented with direct evidence, our students can be counted on to notice patterns and make simple judgements about what they have seen. But, as Chun and Wade's 2004 study of asynchronous computer-mediated communication suffices to show, this assumption may be unwarranted where collaborative cultural exchanges are concerned. In a classic "you can lead a horse to water but you can't make it drink" scenario, students participating in communication exercises designed to elicit cultural knowledge often proved unable to correctly identify the attitudes and beliefs of their conversation partners. As Chun and Wade explain, the issue at stake is whether students can construct new knowledge about another culture, and in some instances, this process seems to be blocked.

Though we could certainly spill a great deal of ink over the interesting psychological phenomenon of seeing ourselves in others - call it "categorical perception" or "projection" if you like - I'm inclined to take a more action-oriented approach. Chun and Wade point out that the intervention of teachers as cultural mediators might help students to correctly characterize the contents of the text before their eyes, to see what they could not see on their own. As a prerequisite to intercultural learning, students must be prepared to practice the art of observation. Wittgenstein’s imperative for the study of language is: “Don’t think, but look!” The ability to construct new knowledge in the face of information that does not fit with our established psychological schemas is at once intimately linked to the task of language learning and also much larger.

In the conclusion of the study, the authors consider the fact that the students had greater success in creating the conditions for a "sphere of interculturality" when the cultural differences being discussed were more dramatic. German and American responses to cheating are apparently very different, and students generally noticed this. It was in response to the more subtle cultural difference surrounding dating that students were found to be inattentive to C2 differences. Perhaps one way to scaffold the exercise in constructing new C2 knowledge would be for the teacher to prompt students to begin by discussing a topic on which the students diverged significantly along cultural lines, and only later to discuss more subtle C2 texts.

Wednesday, January 14, 2015

Internet-mediated identities

Reinhardt and Thorne’s account of Black’s research into internet-mediated learner identities describes a phenomenon with paradigmatic and ontological novelty. Black’s focal subjects were L2 students of English participating in online fanfiction authorship communities, where they found success as language learners in a way that they could not in traditional classroom settings. The paradigmatic novelty of this phenomenon lies in the shift between traditional modes of teacher-directed learning in the classroom and the self-directed learning and goal-setting (whether explicit or implicit) of the learners’ actual practice. The ontological novelty lies in the relatively recent emergence of communities of thousands of individuals sharing and critiquing writing on a highly focused fictional topic, where the community creates its own positive reinforcement (e.g. the possibility of becoming an “internet celebrity”).

Reading about the study, I am reminded of one of my own former students, Benoit, who, like the students in Black’s research, often fell short of being a “successful” language user in my classroom. Despite a keen and perceptive intellect, Benoit’s progress in French was hampered by an irrepressible urge to socialize in English, counter to my classroom French-only policy, and by poor study habits. In 2013 Benoit, like a number of my students, was drawn into the League of Legends gaming community, numbering some 67 million people. Unlike most, however, Benoit had had a great deal of championship success in the strategy game, which I have heard described as a form of “high-speed, team-based virtual chess with exponentially more possible game outcomes.” I recall the day when Benoit came into class and told me offhandedly that he had begun to serve as an amateur interpreter for new French players on the server, helping to acquaint them with the dynamics of team play and serving as a link between the Anglo and French members during matches. Seizing the opportunity, I told Benoit that I would be happy to accept some form of written work from him about what he had learned from these experiences, and we agreed on a vocabulary list of new French words and phrases discovered during these encounters, which he would update every week. We had a deal.

Ultimately, Benoit ended up squeaking by with a B- in my class, and dropped French the following year. However, I am still sure that I was able to improve his negative relationship with the traditional work involved in language learning. At the very least, Benoit and I stopped our tug-of-war around French, and he had at least some work that was almost pure pleasure. And, unlike most of my other students, he was putting his knowledge to use in a practical way for a real community, a lofty goal even for very advanced students.

Sunday, January 11, 2015

Multi-literacy means Authorship

In educational circles, one now regularly hears appeals to notions of digital literacy, cultural literacy, media literacy, and other literacies. Increasingly, the term "literacy" is being applied to skill areas other than proficiency at reading traditional texts, deliberately calling attention to the multiplicity of codes in use in daily life. Educators have begun to realize that these codes are not just another domain of knowledge: it is essential that students become "fluent" in these multiple literacies, not only to ensure their success but even to arrive at a basic understanding of the world around them.

Unity 3d - a free platform for the development of virtual environments
The potential for authorship is a condition of possibility of literacy. The availability of free tools for the creation of 3d environments and mobile apps is thus not merely key to the development of highly advanced learners, but a precondition to the development of any true digital literacy. Just as we would not consider someone literate in traditional academic subjects if they were unable to produce writing, we should not be content to allow another generation to pass through the school system without becoming proficient at coding for the very same digital environments of which they are consumers. For this reason, NCTE's requirement that literate students be able to "Create, critique, analyze, and evaluate multimedia texts" must be taken seriously and at face value.

But, if we take multi-literate authorship to mean exclusively "learning how to code," we have missed the point. Though the appeal will likely go unheeded, the message that we must teach students how to code is being shouted from the rooftops of Silicon Valley. Smaller voices point out that, in the words of Thorne and Reinhardt, "contemporary conceptions of literacy have expanded beyond the narrow, if also necessary, skill of decoding and producing graphical texts." New literacies are situated in a world that is more interconnected than it has ever been, and it is a question of when, not if, our students will be required to "build intentional cross-cultural connections and relationships with others so to pose and solve problems collaboratively and strengthen independent thought" - NCTE, once again. Consequently, our students will need to develop a sociocultural knack for the codes, digital and otherwise, with which they come into contact.

Again, the best and perhaps only way to help our students take this perspective is through authorship of socioculturally aware texts, which are reactive without being reactionary. Here, the ideologies of naive liberal multiculturalism couple with xenophobic ethnopluralism to block the construction of the critical discourses of difference that will form the basis of a new pedagogy of authentic sociocultural authorship. Accordingly, the construction of multi-literate authorship faces not only the challenge of disseminating new knowledge but of enacting ideological changes within the educational system.