Manifesto for Teaching Online – Aphorism No. 13 – “A routine of plagiarism detection structures – in a relation of distrust”

With respect to plagiarism, I don’t know about you, but my mind is always plagiarising, and surely if I believe what I read or see on TV, stories of ‘selfish genes’, then I have genetically plagiarised, as I’ve taken parts or copied them in one form or another from others without citing them, or even giving them credit (although I do respect my parents and forefathers, mothers, etc.).

Let’s be candid, we only learn to speak by imitating the expressions of others, most notably parents, siblings, etc. We learn to think and decide by watching these others, listening to advice, receiving instruction from peers, teachers and others, copying dress sense, playing cooking games and so forth. Imitation is thus integral to human nature and the production and reproduction of culture and its behaviours and artifacts. Mimicry is everywhere. It comes as no surprise then that I have a similar way of expressing things as many others. I must adhere to convention to be understood and to be accepted. But I can consciously or unconsciously digress – at my social peril. It is the difference that makes a mark, makes for a change, and can be considered ‘innovative’, ‘alternative’, ‘different’.

Judicious use of difference gives rise to those slippery terms ‘flair’ and ‘ingenuity’, precisely those qualities that are popularly lauded as desirable outcomes of an education process, but so often curtailed by the practicalities, bureaucratic and administrative coshes of the standardised methods (cf. catering properly to ‘gifted’, ‘challenged’ or ‘individual’ students) that suit the crowd. Since the 18th Century, plagiarism in certain forms of writing, such as academic, literary and journalistic writing, and indeed in deed (pilfering someone’s invention), has been considered dishonest. The timing of this coincides with the rise of industrialisation, and the notion that there was a way of life characteristic of modern (or industrial) societies that was qualitatively different from the way of life found in pre-modern (or folk) societies goes back, at least, to Weber and his ‘iron cage’ of bureaucracy and rationalisation: everything mechanised, administered according to the dictates of scientific reason, categorised and placed in neat boxes and cabinets lest woolly chaos enter therein. This ethos drove us to consider what was plagiarised or ‘stolen’ and what is not.

I have touched upon this issue many times in the previous posts of this series. The issue of trust is very important, though, and relevant, especially when speaking of all those ‘faceless others’ using electronic means of communication, where people can just about be or act textually as anything they want (I think here of Dungeons and Dragons and other MUD games, the early chat rooms which have become messenger and Twitter; of course video chat and Skype have changed that). Sherry Turkle suggested that introverts can become extroverts online, that online presence may not relate to who or how you are. In an emerging culture of remix, mash-up and curation, identity and academic plagiarism not only keep centre stage in debates regarding the ethics of TLA, but take on new relevance due to the capacity of the Internet as an aide-mémoire or a depository of made-to-measure ready-mades:

“This vice (plagiarism) has ravaged the academic and research communities for years; it has reached epidemic proportions because of the Internet.” (Kinza, 2009, p. 7)

One of the reasons cited as to why it is a ravaging plague is given by Batane (2010):

“Plagiarism affects not only individual students but also the integrity of the institution as a whole and the quality of its products. Therefore, it is important that each university crack down on this problem for its own sake and for the sake of the students”.

Such depictions make it clear that those who engage in such activities are not just cads, but spreaders or carriers of behavioural pestilence, for which the internet acts as a well-fitted breeding ground. They are certainly not trustworthy souls, and they can even tarnish the reputations and identities of their institutions. And it is not only students who are affected; academics and researchers are also using the internet in an unethical way. Professorial and student heads have rolled because of it. The internet has of course been an accelerant of many disreputable and dubious practices and materials.

An interesting sideshow to these outings of plagiarists is the curious category of self-plagiarism, which occurs when an author re-uses portions of his or her previously published work in subsequent works. As I said earlier, I am always self-plagiarising as I dwell on things in the mind, but of course, in this case it is applied to the rules of the game and engagement in the academic culture of ‘publish or perish’, career development, and departmental and institutional profile and rankings. In all aspects it casts the vision of a battlefield, or at least a cat and mouse situation. On one side are the Jerrys, the ethical academics using wit and tools to uncover plagiarism; on the other are the Toms, students (and mundane academics seeking to develop their careers through deception, including self-deception and intellectual pretence), who are using other cunning apparatus, like those sites selling ready-mades, fresh off the press or too old to be discovered, not too used, and anything else in-between in the grey zone of piracy, cut and paste.

I have come upon this problem in its extreme working in overseas universities where tuition is in English, and where it is also a second language. Not only do many of these students lack the language skills, they also lack the core academic skills to research and construct essays, begging the question of why they are in the class at all. The most extreme case of this was in my early days in Phnom Penh, where one class of 30 students submitted three flavours of paper and three only. The brightest of the guys had discovered essays on the internet, which they had clearly copied in their entirety even though they only barely related to the topic that was given, but that didn’t stop them distributing them to all those around them, some of whom could barely speak, read or comprehend English at all.

These people had the unenviable and no doubt serious intellectual problem of choosing which one of the three they would subscribe to and use. The only thing differentiating these offerings was the covers. Some of the ladies in the class had adorned their plagiarised essay with pretty flowers, others had patterns, even tartans, on them, and the contents themselves showed this year’s fashion in crazy typefaces, even an unreadable, heavy, thick German Gothic TrueType; and lastly, of course, the names were different.

On submission of the essays I glanced through and immediately understood that they were useless if we were trying to play the game. I asked them to sympathise with my job as laid out in the rules of the game as I had learned them: “you have set me a great problem here, everyone… how am I to mark these? Do I give this essay more marks because I like the name, or do I play this off against the lovely flowers on this cover… I do not know how to differentiate the marks otherwise…” Was I being facetious, or was I getting some message across? Some of them smiled, suggesting conspiratorially that they did know the game, somewhat. It hit me as an existential shock that really we should have been doing something very simple and basic, rather than pretending to be doing an MBA course on organisational behaviour. In fact I wondered if I, or they, should be there at all. I continued to teach them as best as they, or I, could manage. I understand they all passed, but I would not sign off anything playing the ethical academic.

Now, my dilemma was this: should I judge this and frown, after seeing work fudged, fiddled and manufactured in reputable world-class institutions, with their own Machiavellian power games, the manufacture of publications and poor, inadequate teaching played out on a daily basis? I came around to thinking that maybe there wasn’t that much difference. Sure, the global standard North and West universities had some very smart people inside, those with fine powers of observation and reasoning, nerves of steel, and sometimes hard necks and gumption, but they also drew the first students who had similar traits from the right schools. This, along with the gloss of a long and distinguished history, some outstanding scholarly contributors, Nobel prizes, etc., and of course massive funding and wealth drawn from research grants and donations, creates a veneer of a legacy powerhouse, which on occasion masks guilds of complicity, complacency and back-biting and, in my opinion, criminal wastes of brainpower, critique, invention and ability.

But the institutions are not the only ones to blame; there is the whole extant mechanism which perpetuates their existence and behavioural traits, amongst these the league tables, reputation, public profile and publications. The latter often rest upon data and knowledge which will never be read, let alone re-tested [the original data in the file cabinet is rarely examined], and the topic is as open to fads and fickle fashions as teenage girls are drawn to new configurations of clothes, colours and boy bands. The fact that only proven hypotheses are published has long been recognised as skewing that which is actually disseminated into the society of minds, while obscuring the big picture of all that which has truly been discovered. What finally descends into the wider public domain can be pure pulp fiction, moving through the lens of the popular press. I remember one correspondent who gave us a crash course on writing for tabloids [write as if for 6-year-olds: “Today, a bunch of eggheads got together at Edinburgh University to speak about what will happen when crackpot devices speak back”], or writing for broadsheets [write as if for intelligent 12-year-olds: “A team of boffins will gather today to discuss the amazing future of smart or intelligent devices, including paint that changes colour at the press of a button, and magic mirrors that will tell you if you’re healthy or not. Can you just imagine…”].

Published results tend to support theories, with everything else relegated to the file cabinet, if it is relegated at all. And is this not meant to be picked up by the peer reviewers? Not necessarily, according to Smith (2008), writing in the Journal of the Royal Society of Medicine.

Peer review is at the heart of the processes of not just medical journals but of all of science. It is the method by which grants are allocated, papers published, academics promoted, and Nobel prizes won. Yet it is hard to define. It has until recently been unstudied. And its defects are easier to identify than its attributes. Yet it shows no sign of going away. Famously, it is compared with democracy: a system full of problems but the least worst we have…peer review is a flawed process, full of easily identified defects with little evidence that it works. Nevertheless, it is likely to remain central to science and journals because there is no obvious alternative, and scientists and editors have a continuing belief in peer review. How odd that science should be rooted in belief.

It is not a pretty picture when you place this alongside pleas such as those made by Bauerlein et al., who are not alone in their concern about the ‘avalanche’ of unused or unusable research coming out of the academy. Lehman and Loder (2012), in an editorial in the British Medical Journal, are also pensive, and point to a “culture of haphazard publication” – where inconvenient or unresolvable details are left out – which means that decisions are not made using the best evidence.

…Consider this tally from Science two decades ago: only 45 percent of the articles published in the 4,500 top scientific journals were cited within the first five years after publication. … A 2009 article in Online Information Review found that 40.6 percent of the articles published in the top science and social-science journals (the figures do not include the humanities) were cited in the period 2002 to 2006.

Although they do not clearly state what constitutes a ‘low-quality research’ paper, and of course I am conflating here critiques drawn from the views of many disciplines – psychology, medicine, engineering – which lay claim to being scientific, these are clear concerns regarding the apparatus, funding, and the political, cultural, bureaucratic and administrative structures and behaviours that perpetuate the modern university, and, more than that, in the case of medicine in particular, could affect people’s health. Each layer needs to be examined in the light of significant changes in global power and wealth and the distribution of resources. But I suspect that the momentum is too strong; the poor practices are too deeply ingrained, and at times the only way in which aspiring academics can make their name and keep their jobs. Hence the push to publish more in order to obtain more grant money for the Department from the public purse. The ‘publish more’ craze may increase the overall knowledge of the field, but it does not increase the utility of the endeavour. The practical use of much of this research wanes, and many reports lie dormant on disks and in filing cabinets. Format requirements, for instance, such as strict word limits for papers, mean cutting the details regarding previous work. And because of this the published results can sound not only surprising, but even novel and innovative. Marco Bertamini and Marcus R. Munafò note, ironically, that: “A bit of ignorance helps in discovering ‘new’ things.” This sounds so familiar in the wider culture, where reduced sets, abbreviated language in tweets and texts, and blog entries look to inexperienced minds like new revelations.

That grant money, like the best teachers and the best students, flash-crowds in the Cambridges and other brain-powered, boat-racing hot spots, long before any fairer system can come into place.

Finally, there is the domination of certain methods and operations adopted by disciplines. Holbrook (1995) drew attention to the fact that business schools in the 1950s, more or less scared for their very existence and largely under attack for being ‘non-scientific’, embarked on projects which included studies of a more positivist, quasi-experimental, statistical and managerially relevant nature, in order to appear more rigorous and win more academic respect. In a sense they followed the other social and behavioural sciences, which had already migrated to take on a more scientific allure in the earlier part of the century, and found greater favour in influencing public economic and health policy. But the turn to numbers in many of the behavioural sciences then opened a can of worms regarding issues such as confirmation bias.

…confirmation bias connotes a less explicit, less consciously one-sided case-building process. It refers usually to unwitting selectivity in the acquisition and use of evidence. The line between deliberate selectivity in the use of evidence and unwitting molding of facts to fit hypotheses or beliefs is a difficult one to draw in practice, but the distinction is meaningful conceptually, and confirmation bias has more to do with the latter than the former. The assumption that people can and do engage in case-building unwittingly, without intending to treat evidence in a biased way or even being aware of doing so, is fundamental to the concept. (Nickerson, 1998)

One of the conclusions that Scargle (2000) drew attention to is that apparently significant, but actually spurious, results can arise from publication bias, with only a modest number of unpublished studies. Meta-analysis of these studies would, by extension, also be compromised. Let’s face it, though: politics and corporate business are obsessed with numbers – they truly blind by lending scientific and authoritative gravitas to reports and presentations. You can dazzle, blind, impress or intimidate with numbers, graphs and sophisticated, arcane statistical techniques, much in the same way that Wall Street financial engineers have been doing with CMOs, or pairs trading, or VaRs, not forgetting NPV. [Nice of me to put in the links for ignoramuses like me – basically all these things mean the same: you lose your shirt, job, car and home.]

What is clear from this discussion is that technology, and particularly the web and its applications, is placing pressure upon all manner of institutions to change. This is not a naive technologically deterministic position, but rather one which recognises various pressure points across the entire social system, points exerting different pressures and prejudices.

But returning to the particular moral panic raised by Newson et al., it reminds me of other moral panics in the past. They largely take the shape of the argument:

“technology innovations enabling extension to the capacities of media to convey its messages accentuate socially defined human failings.”

That’s an aphorism. In some cases, like the history of innovation in military technology leading to nuclear, chemical and biological weapons of mass destruction, there is the argument that there has been an increase in humanity’s core capacity to maim, destroy and kill. Violence becomes ‘meta-violence’ or terrorism. In nuclear plant breakdowns or air disasters, could it be that many people express a sense of relief to discover it wasn’t operator or human error – “the pilot was drunk” – but rather a technical or systems failure or design weakness, and therefore more easily remedied or fixed, thus preventing it ever happening again [like that at least]? A technical or system problem is less of a wicked problem than human weakness or failure. In polite company we prefer reasonable and rational people, just as we prefer reasonable and rational children; we live in fear that a lack of it will meet with disaster.

A similar kind of logic has been apparent in public health studies of mass and new media for some time now. An infamous article by Elizabeth Newson, published in the Journal of Mental Health and in The Psychologist (1994), supported the view that violent videos can lead to violent actions.

Newson, E. (1994) Video violence and the protection of children. Journal of Mental Health, 3, 221–226.

Newson, E. (1994) Video violence and the protection of children. The Psychologist, 7, 272–274.

The paper was countersigned by numerous eminent child health professionals, and really played into the hands of a press eager for answers. Guy Cumberbatch quotes the Daily Mirror of April 1st, 1994:

“VIDIOTS! At last experts admit: Movie nasties DO kill.”

It terrorised everyone who had watched a horror movie. Was it that they could suddenly, even unconsciously, commit Jekyll-and-Hyde-style heinous crimes? In her paper she stated that as multimedia technologies emerge and become more sophisticated, so would their capacities to create more pervasive levels of negative influence, social problems and dangers; tortures and murders would be exacerbated as people interacted on a more immersive level with games and virtual realities. The knock-on effects of this report were pervasive. They influenced censorship in the UK, and even influenced the thinking of the company in which I was conducting my doctoral work. They were developing digital interactive television services, and were favouring the ‘walled garden’ approach for family viewing over the wild west that they saw in the internet, with its porn, bomb-making and other unsavoury family viewing [although there are clear economic benefits from successfully built walled gardens and successfully ring-fenced and captive audiences].

However, the Newson Report was not without critics. Guy Cumberbatch, who had been studying the psychological dimensions of media for many years, notes that:

Each new medium has provided the focus for concerns that there has been a recent and unprecedented rise in juvenile crime and that this has been caused by a new medium which is unprecedented in its glorification of crime and violence. Such concerns are the basis of Newson’s (1994) report but appear to be nothing more than speculation fuelled by the popular press.

And another commentator, Martin Barker, has also spoken of the development of media sparking concerns that their ‘effects’ are invariably overwhelmingly negative or positive, utopian or dystopian.

Newson’s report had far-reaching impacts, politically and in public consciousness. But her indexing of the sophistication of multimedia, as it moves towards full virtual reality, was misguided and based entirely upon conjecture, as was much of the rest of her views. The point which brings it back to topic is that there seems to be a juncture just now, where the internet is a vast resource such as we have never had at hand before; it lends us a corpus of information never before available, and certainly not easily available, to a wide body of people. This is not only helpful, but really brings into clear focus a vision of the computer as cognitive augmentation, greatly improving our capacity to work with ideas, to create ideas, to hybridise ideas. Online libraries and depositories have made what took a lot of time before much more immediate. Searches can also be much more thorough than trawling through books and index cards. Lest I seem to be verging on techno-evangelism here, as we have heard it all before, this is the reality of what is available.

Having so much access to so much material also throws up quirks from the past which stimulate and become the stuff of blogs, and of thoughts regarding where we are at in the archaeology of knowledge. You are only bound by your knowledge, or knowing how to look, what to look at and where to look, and how to understand, make sense and join the dots in a way which maintains the conventions of eloquence, style and authority in writing. Clearly, it is a predominately negative view of the internet which is presented here under the auspices of plagiarism. Bennett (2005), in his study of plagiarism, puts forward that one factor involved is “means and opportunity”: the fact that resources are ready to hand, available and easily accessible over the Internet makes it convenient for students to gain instant and easy access to large amounts of information from many sources. Wow, isn’t this what the internet offers, rather than this frame where it is a bad thing for the business of academia? Not according to other researchers, who more or less say the same thing (Mozgovoy et al., 2010, lists Bennett and a few others).

I think it is difficult in many ways to sustain the argument against plagiarism without appeal to a view that things were better in the ‘good old days’ of typewriters, paper books and journals and even handwritten essays (which would at least show that an essay was written by those submitting it – you would know mine, as it would have my wax seals on it: coffee cup stains and handwriting like a 6-year-old’s due to computer overuse during the last 30 years). But when you look at the bewildering number of books in a decent and massive library such as there is in The University of Edinburgh, let alone augmented by the public libraries, the libraries of the other great institutions and the National Library of Scotland, then surely plagiarism was very difficult to detect way back then, as it still is now. Sure, Ctrl-C Ctrl-V pundits and cultures make kidnapping a section or two time-efficient, but I am also at the receiving end, also with the same kit, inputting any suspiciously well-written pieces into the same search engine and seeing what crops up. I don’t need Turnitin or any of these devices to be a Tom poised outside the mouse hole waiting for unethical mice.

Let’s face it, while writing essays seems laborious to students, marking essays is also laborious. It is most tedious at the rudimentary level, where there are piles of them, and simply due to repetition, repetition, and repetition. I prefer the intellectual rapport and uniqueness of a crunchy MBA or doctoral thesis; even if it is chock-full of figures, you learn something from this work, and stretch your own thinking. Some professors have even used this level of material to fuel their own research outputs. I have certainly heard this is the case where western academics get jobs as ghost-writers for professors in eastern universities who also lack the language skills to publish in international journals. We will not delve into the raging debate on the hegemony of the English language when there is good evidence that Google and co, and others with global ambitions to ring-fence knowledge, utterance and communication, have devoted so much capital and work to localising their services (to command local advertising with local keywords in local scripts? How do you say Angry men in Khmer or Sesotho? Don’t spoil it).

If you are lucky, you may have an answer key for multiple choice, or even a machine reader, which is great for potentially doing hundreds of first-year scripts. But when it comes to essays, oops, then the serious work kicks in. It all slows down and becomes ponderous. Why can’t we have an intelligent way to do this? But we do, in systems such as Turnitin.

I’ve already given reference to my system of coping with essay marking earlier. Could it be automated? After all, it’s a kind of batch-processing model, with added intelligence in the interpretation of the contents. First off, you would have a look at spelling and grammar – language problems; even Microsoft Word does that, but this is an issue especially in an environment where there are a lot of international students and the ethos is a politically (or commercially) correct “don’t be too hard on them for not knowing the language.” But spelling shouldn’t be an issue with word-processed documents, with spelling and grammar checkers, and dictionaries online and offline. Sentence construction may be a problem, and it is a big problem for the reader trying to compensate, i.e. trying to grasp the ideation ‘behind’ or ‘between’ or ‘instead of’ the words. If it is, then surely extra English writing tutorials should cater for that. Sometimes it goes very deep. In China and many Asian countries, the favoured rote-learning approach carefully detailed in the previous post is often cited as the reason not only for a perceived lack of creativity and a perfected culture of IP theft and appropriation, but also for the proliferation of plagiarism in their education systems. It has been cited as capable of damaging their economic futures. To copy, emulate and mimic is basically the ‘hidden curriculum’ of the school.
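The batch-processing idea can be sketched as a pipeline of cheap, mechanical checks run before any human reading. This is only a toy illustration of that triage, not any real marking system; the function name and thresholds are invented for the example:

```python
import re

def mechanical_checks(essay, min_words=500, max_sentence_words=60):
    """Run the cheap, automatable checks first and flag anything
    that needs a human eye. Thresholds are arbitrary placeholders."""
    flags = []
    words = essay.split()
    if len(words) < min_words:
        flags.append(f"too short: {len(words)} words")
    # Crude sentence split on end punctuation; a real system would
    # use a proper tokeniser and a grammar checker at this stage.
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    for s in sentences:
        if len(s.split()) > max_sentence_words:
            flags.append("over-long sentence: " + s.strip()[:40] + "…")
    return flags

print(mechanical_checks("Short essay. " * 10))  # -> ['too short: 20 words']
```

Anything that passes the mechanical gate would then move on to the genuinely hard, interpretive stages – argument, citation, originality – which is exactly where the automation stalls.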

Back to our essay-writing class: is the essay shape and format right – length, beginning, middle and conclusion? How does it look? Are the sentences too long, the paragraphs too short? Next, arguments and citations; this is getting into the crux of the matter. All this makes an essay. It starts with the most mechanical, and ends with potentially the most creative and open-ended, moving from the lower echelons of Bloom’s taxonomy to the higher levels; it moves through thesis, antithesis, synthesis, or premises, inference, and conclusion. Interestingly enough, an early 1980s paper details an automated plagiarism detection system. It was felt that the 1% of submitted programs deemed plagiarised was far less than what the grading staff had “independent knowledge of”. This called for an automated response that would help them catch culprits [interestingly this contrasts with Batane’s 20.5% of students cheating at the University of Botswana]. Isn’t it right that automated plagiarism systems are born digital?

In a beautifully crafted and submitted essay, the ethical lecturer then moves towards the idea: “is plagiarism going on here?” What happens when a first-year psychology student hands you what amounts to a draft of a paper suitable for publication, with citations, formatting, spelling, grammar and arguments all good? Then, after chatting with them, you discover both their parents have Ph.D.s, one a clinical psychologist, the other an academic. Is it nature or nurture or just plain cheating?

It is going to be worse when the work has been checked by a peer who has command of the language and can mask it, make it seamless.

But to the problem at hand. When does a template become plagiarism? When does a configuration of letters, words and sections? When do ideas, or proposals, or reports? What about when ‘publish or perish’ demands of academics that they write something about anything, in the correct format, so that it gets published anyway? What about when we use methodologies? What about when entire disciplines copy techniques from one another, like the business studies that were scared in the 1950s and 1960s that they were not being scientific enough, and started using experimental-style statistics to make themselves look more academic and intellectual? The granularity of plagiarism has been addressed by Collberg et al.:

Cosmetic changes: Minor cosmetic changes to text, such as the addition or removal of punctuation or spacing should not affect the comparison.

Reordered text: Paragraphs or sentences from paper A can be copied but placed in a different order in paper A’. If the bulk of the content is the same, however, this should still be detected.

Reworded text: Rewording of text without significant change to the meaning, such as the substitution of a few words for synonyms or swapping clauses of a sentence, should be caught. While not nearly as severe as directly copying text, a significant amount of this should still register as plagiarism.
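The first two of these criteria fall out naturally from a standard fingerprinting approach: normalise the text so that cosmetic differences disappear, then compare sets of overlapping word n-grams, which largely survive the reordering of sentences or paragraphs. The sketch below is a minimal, hypothetical illustration (the function names and the three-word shingle size are my own choices, not Collberg et al.’s); the third criterion, reworded text, needs more than this, e.g. synonym normalisation or semantic methods:

```python
import re

def shingles(text, n=3):
    # Normalise away "cosmetic changes": case, punctuation, spacing.
    words = re.sub(r"[^\w\s]", " ", text.lower()).split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    # Jaccard overlap of word n-grams. Reordering sentences only
    # disturbs the n-grams at the seams, so bulk copying still scores high.
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "Plagiarism affects not only individual students but the institution"
cosmetic = "plagiarism, AFFECTS not only -- individual students but the institution!"
print(similarity(original, cosmetic))  # -> 1.0: cosmetic edits vanish
```

Tuning the shingle size trades sensitivity against false positives: short shingles flag common phrases, long ones miss lightly edited copying.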

In a later paper submitted to the ACM, Collberg and Kobourov (2005, pp. 9–10) update these criteria thus:

Textual re-use: Incorporating text/images/other forms of previously published works in a new work

Semantic re-use: Incorporating ideas from published works in a new work

Blatant re-use: Incorporating text/ideas of previously published works in a new work almost verbatim

Selective re-use: Borrowing bits and pieces from previously published works in a new work

Incidental re-use: Incorporating text/ideas not directly related to the new ideas in the current document

Re-use by cryptomnesia: Incorporating text/ideas from previously published works in a new work, unaware of their existence

Opaque re-use: Incorporating text/ideas from previously published works in a new work without acknowledging its existence

Advocacy re-use: Incorporating text/ideas from previously published works in a new work when writing for a different audience

There is a view which I am sure most of the references cited here would find contentious. And that is: let the plagiarism begin!

Like the common-sense idea of basing a university dealing with the business of hospitality in a tourist resort, one that deals with development studies in a developing country, one that concentrates on finance and banking in the City of London (actually City University is such a beast) or Wall Street, or one that focuses on military studies in the middle of Iraq or Afghanistan – or, for that matter, teaching the economics and practices of corruption in countries where it has been recognised that it is rife and the only way to succeed and get on – why not teach plagiarism in all its splendours and multifarious forms? Less grandiose a claim, and a kind of paradox, is that I did run a class in ‘creative plagiarism’ for a bunch of Cambodian students who had no idea how to do a good presentation or present a decent academic essay. Their plagiarism was rougher and coarser compared with, say, Malaysia, where it was smoother and more thoughtful in some cases, or as it would be back home in the U.K., with someone who understands how to construct a paper, or a rich presentation in PowerPoint or Excel, like a bright student or even a lecturer. They understand the mechanics of being a good cheat. The idea is that if you falsify or cheat that well, you will end up actually doing it. Just like professional proposal writers in development and in academia: they know the right language, the trendy language and even the right people they can enrol to beef it up. They know when to be polite, assertive, evasive, and basically get paid for very little sweat and contribution, as they have learned to mirror and provide, at every turn, precisely that which is expected of them and no more.

Instead of battering my head against a wall, I chose to run a class on creative plagiarism. I used it a lot with the students, who, after getting over the 'guilty pleasures' and taboo aspect of the word, started to grasp where I was coming from. I wanted to use reverse logic, or reverse psychology, on the art of composing work. Think about improvising musicians: one person starts with a riff, and the others then follow and work off that riff. That riff could be plagiarised, but once the original player drops it and starts to improvise based on the responses, it becomes something different. It uses as its departure point the known, the claimed, the recorded, but it now moves into its own territories. This is the essence of creative plagiarism. It may be considered a necessary evil with people who have not been exposed to adequate core skills like essay writing and research; expecting otherwise would be like throwing somebody who had never played an instrument into a jam session.

I was showing how easy it is to knock something up, and to consider format and function, rather as the heuristics of anti-cheat applications can teach us how to be better copiers or writers, just as viruses and bacteria mutate in order to survive (to use a biological instead of a musical metaphor). In class we put proposals for a persuasive speech in a hat, and I picked one out at random, with the title 'women make better managers than men in the corporation'. Armed with Google, I used its magic to conjure up facts and figures and some pictures, all of which I referenced properly at the end, to put together a talk on a subject I had never really thought about before, and may not even subscribe to. I then delivered the talk with conviction, and had the students do the same. We looked at videos of notables giving speeches, modelled the attitudes and tried to copy those behaviours.

In another class, for journalism students, I asked them to take a blueprint of a football report and use its form and jargon to describe a local game which they would attend. They mastered the discourse of football reporting, and got a feel for its stylistic consequences by stepping into the shoes of the writer through copying. After a second and third attempt they began to improvise, and to discuss the tensions arising between the events in the blueprint and the event that they themselves had experienced. They were still using the discourse, but now writing more original and accurate depictions, which would better resonate with those who had also been there. They were moving on, learning by doing. There is nothing unethical about this process of deconstruction and the setting of new contexts, using a vocabulary borrowed from elsewhere and learned originally through plagiarism, rather than trying to play cat and mouse.

This is the essence of an inductive method which I used with other students on an entire course on event management. Two groups were given the task of pulling apart an ongoing tournament being held in a neighbouring country whose demography and socio-economic situation were very similar (we were in Lesotho and the event was held in Swaziland). Each week we explored a particular constituent aspect of what makes up such an event and how it could be grafted (plagiarised) onto the local situation. The students, having previously been exposed only to book- and lecture-based subjects, were taken aback by the reality of having to explore the local, regional and global resources that would be required to make it happen. They had to benchmark the qualities of products and services against the costs of sourcing them internationally; they had to consider how this other event was promoting itself and how to generate revenues from advertising and ticket sales; and they had to broker deals with local hotels and design shops in order to build their business plans.

In all, their model had many aspects and facets, and the outcomes were mixed, but they now had comprehensive experience of doing what they were purporting to learn on the course. Theory and practice melded. Vicarious and social learning is natural to human beings, and included in this are copying, re-using, re-appropriating, incorporation, assimilation and amalgamation. Refiguring and reconfiguring are key to innovation, and digital means are merely the latest in a long list of technologies which have allowed for this.

Now I know that some will say this is too broad to serve as a critique of academic plagiarism, drawing us back to the idea that some people are just inherently lazy or incapable of doing proper work and proper citations, claiming others' hard-gotten gains for themselves, and so on [75% of Batwane's 20.5% of plagiarisers cited laziness as their reason]. But I have not touched upon digital rights management (DRM) and illegal file sharing; how corporate media protect their machine is a matter for another discussion, and it is still going on with ACTA in Europe.

This is a subject I have also researched and worked with at Imperial College (in particular MPEG-21 and mUffiNS). But I believe that any idea worth its salt will be used and reused in an open-source manner, and the creative commons, and those who have the leisure, time and funding to engage in producing for it, show us that it works. Those that take from it and make money from it should use citations. As for academic and scientific ideas funded from the public purse: shouldn't these ideas be freely circulated and used, instead of sold? If I come up with one good song and never have another hit, should I live my life around it?
