DISTIL Wins Two Key Awards at DevLearn 2008!
I've been working on a very challenging project over the past year, building meaningful assessment into learning games. My hope is that success on this project will transform the experience of learning for some very lucky individuals.
My primary motivation for building this system comes from two chunks of my life: the first, in which I was a student, and the second, in which I was a professor.
I am an active learner, something I realised when I joined university. I learn best when I can actually build, or at least play around with, whatever I am trying to learn about. This was probably the main reason I shifted from the economics programme to computing during my undergraduate years. With economics, you can only come to conclusions after the fact, based on historical data (unless you have the ability to influence policy, and can play grand experiments with national economies). With computers, you have an idea and you can fashion threads of logic, using the scaffolding provided by hardware, software and interface devices, into useful information-processing 'machinery', and can see the results appear almost instantly. The stuff of ether, of thought, magically woven into something useful. It's poetic.
I should count myself fortunate that I did not go for chemistry, psychology, or criminology. The thought of the 'experiments' that are possible there is simply too scary to contemplate!
Getting back to the topic, I feel that interactive learning simulations would be great for individuals like myself to 'experience' abstract processes and concepts, carry out 'what-if' experiments, and truly understand what they are trying to learn.
The second relevant chunk of my life was when I was a lecturer at the National University of Sciences and Technology (NUST) in Pakistan. The big problem I had there concerned assessment. I was happy with the syllabi, the course material and the students I was working with. The only bit I had difficulty with was deciding the grades I would assign to students.
I tried to spread the grades out over as many data points as possible: projects, quizzes, two mid-term exams, and one final exam. I even experimented with class participation and peer evaluation, which got the underwear of the administration into a serious knot. Looking back, I was probably regarded as a bit of a trouble-maker, especially as I was teaching in a quasi-military institute with well-entrenched traditions of following established processes.
My attempts finally converged on a system that I was willing to accept and that the majority of the other stakeholders were satisfied with; however, I was still not happy. Grading based on objective criteria did not take into account the thought process behind the steps the students took. Grading based on subjective criteria was open to challenge because it was by definition relative: it involved comparing the efforts of individuals in the class against each other using abstract, quasi-objective criteria that I created. Each subjective evaluation project turned into a full research exercise in which I extracted 'best practices' from the completed projects and design problems I'd assigned. Comparing one person's attempt against another's was difficult, even with well-thought-out quasi-objective criteria.
Well, I took all that experience, mixed in my understanding of games and learning, put it all in a pot, and simmered slowly. After over a dozen failures, I believe I've worked out some of the kinks and have a credible method of assessing competencies in games.
This method will only succeed if the games are actually designed properly, with actions in the game paralleling those in the real-life environment. I'm not going into the details of how this was achieved in this post, but I would welcome an email from anyone who wants to learn more.
The buzz this generated was incredible. I could feel the love! When this technology was shown to the main thought leaders in eLearning at the premier conference (DevLearn 2008) in the US, it generated very high levels of excitement. Truly, beauty lies in the eye of the beholder! This beast, forged in a machine of iron, silicon and plastic, was seen as a game changer. It's very motivating to see glimmers of a better future taking shape in the work that you're doing.
A mention of the award is available here.
The team I work with at DISTIL is absolutely incredible. My assessment machinery is based on a game that the folks in the studio and engineering have put together, one which makes sense from a learning perspective and which simulates the process that a standards implementation representative would need to go through. The whole experience is packaged nicely as a visual 2D simulation that is actually fun and engaging to play. It's exciting to see the new products being worked on, so keep tuned to this channel to learn about the new flavours of serious-games-based assessment that will accompany the new games.
For now, I am happy that the pieces I contributed to this effort have made a difference.
1 comment:
Completely agree: digital games are useless for learning if they don't assess the player's actions. As more learning professionals see the richness of using computer games for assessment, they'll turn away from older grading methods like multiple-choice tests. DISTIL's been lucky to attract folks from various disciplines who, like you, were convinced that we could build technology to match our vision.