Let us hope that "ICT" is dead and buried


Peter,

Re. ICT is dead - long live ICT

I agree with the importance of defining terms (I was arguing this back in January 2012), but not at all with the taxonomy that you propose. I also disagree with your last swipe at Gove, regarding unintended consequences. If you are talking about the clarification of definitions, this was very explicitly the point of the whole exercise that was started by “Shut down or restart?” — stuff about “damaged brand” is just PR speak for those who do not understand the details of the argument about definitions. If you are talking about adopting your definitions, I should wait a while before assuming that this is going to happen at all.

Digital literacy

“Digital Literacy” seems to have been replaced by “Digital skills”, precisely to avoid any implication that anyone is talking about the stuff you are promoting: “21st century skills”, “21st century citizenship” and all of that stuff, which I have argued elsewhere is bogus. It is going to be about the feet-on-the-ground (or “very narrow” as you and NAACE have it) definitions proposed by the Royal Society.

Technology Enhanced Learning

“TEL” should be avoided for reasons I have already described. If you do not want to read the whole article, my central point is illustrated by the definition you propose. You talk only of the use of digital technology, even though the subject-specific technologies that we need to transform education do not generally exist. This has been the central reason why the whole Becta experiment failed and why the sort of training programme that you were delivering through Vital was always premature (as I suggested to you when we first met, at the time that it was first being established). We should use “education technology” instead, because it covers both development and use of such technologies.

Embedded technology

I disagree with your use of this term for three reasons.

1. As already argued, we “embed” things when they have a different purpose to the surrounding environment (a reporter in an army unit, for example) but we do not embed things which serve the main purpose of the surrounding environment. We do not “embed” tables and chairs in classrooms and neither should we “embed” weather stations in Geography lessons or MIDI instruments in music lessons. I think you understand this really: the problem is not that you use the term “embed” incorrectly but that your justification hides your true purpose. You really want to introduce different aims into other subjects under the pretence that the subject has changed in ways that you understand better than native subject experts. Your use of this term therefore signals your desire to act the cuckoo, laying your agenda in everyone else’s nests.

2. In my view, you have grossly over-stated the claim that technology is changing the nature of the rest of the curriculum. When you look at the substance of the argument that you have made, it boils down to the fact that the English curriculum should refer to “texts” rather than “books”. I completely agree with that point. But the fact that literature is delivered through a new medium does not change the essence of the subject one iota. Even Computer Science itself, according to its chief advocate the Royal Society “has longevity (most of the ideas and concepts that were current 50 or more years ago are still applicable today), and every core principle can be taught or illustrated without relying on the use of a specific technology”. Computers have not even changed the discipline of Computing and I see no evidence that they have changed the fundamentals of any other subject either.

3. The term is redundant. We have never required an abstract noun to refer to IWBs, Bunsen burners and poly-gyms – why do we need such a term now, other than to smuggle in your very contentious theory that all these subjects have been changed by technology?

ICT

If we do not need “embedded technology”, then neither do we need “ICT” as a new umbrella term. Even laying aside all the (so far unanswered) arguments that the Royal Society, the DfE and I have put against this term, its use in the context that you are suggesting would be highly misleading, considering that it has always been used up to now (by yourself and everyone else) to refer to the National Curriculum subject.

Conclusion

So the taxonomy that I think we are heading towards is somewhat different to that which you suggest. I suggest the following:

  • Computing
    • Computer Science
    • IT
    • Digital skills
  • Education technology

As for the hardware and software — what you call “Digital technology” — I suspect that we will end up with a mish-mash of terms like “subject-specific technology”, “IT infrastructure”, “the network” etc. The danger, which I am not sure how to avoid, is that we use “IT” to refer both to the hardware and software being used, and on the other hand to the subject. For that reason, I think the use of “digital technology” is not a bad idea. But I am not convinced it will ever catch on. Nor do I think that the lack of a formal taxonomy in this respect will do much harm, mainly because “IT” as a curriculum subject is itself an umbrella term, which I suspect will be little used. In practice, people will be studying “digital art”, “digital games creation”, “network administration”, “database architecture”, “web design” etc.

Crispin.

Crispin Weston 01:30, 5 May 2013

Crispin, thank you for your comprehensive response to my suggested definitions. To make it feel less daunting (and easier for folk to pick up particular aspects of the discussion) I am going to respond in separate messages to each of your key points.

Crispin said:

I agree with the importance of defining terms (I was arguing this back in January 2012), but not at all with the taxonomy that you propose. I also disagree with your last swipe at Gove, regarding unintended consequences. If you are talking about the clarification of definitions, this was very explicitly the point of the whole exercise that was started by “Shut down or restart?” — stuff about “damaged brand” is just PR speak for those who do not understand the details of the argument about definitions. If you are talking about adopting your definitions, I should wait a while before assuming that this is going to happen at all.

We are both starting with the same aim - to clarify the terminology used in our area. My PhD, which I started in 1990, is fundamentally underpinned by a desire to deal with the confusion about what we mean when we talk about 'educational technology'.

I agree that the Royal Society report attempted to help clarify the situation by providing definitions of key terms. However, I don't believe that Gove's motivation for changing the name of the subject in the National Curriculum is driven by that same ambition. The damaged brand argument is one that has been made by those who want the name of the subject changed from ICT (such as folk in BCS and CAS), many of whom do understand the details of the argument about definitions (even if you and I might not agree with their views on them).

I agree that only time will tell which (if any) definitions relating to the key aspects of 'educational technology' (which I am taking to include everything from the technology itself through to the specialist subjects) will end up being adopted. However, part of my goal is to help ensure that some clear definitions are 'out there' and to promote their use, because (as I argue in my PhD and the dICTatEd Project) many of the problems in the field relate to a lack of understanding about what we are talking about (people talking at cross purposes), which arises because they don't define their terms clearly or use them consistently.

To be honest I am not that bothered about the particular labels that are used - so long as they are clearly defined and used consistently. So getting to agreement about terminology and definitions is the challenge ...

PeterT 06:59, 5 May 2013

Hi Peter,

I have heard it credited to Kevin Riley of IMS GLC (though it may be wider practice) that you should first agree the definitions and then choose the terms. Instead of having a metaphysical discussion along the lines of "what does 'x' really mean", we should say "we want to discuss such-and-such, so what name shall we give it?"

So I agree with you that the terms matter only at the level of good marketing (itself not a trivial matter) - but the definitions are profoundly important in the distinctions and assumptions that they make. I think we are no more than a gnat's whisker apart on this.

On the comparative merits of "TEL" vs "education technology", I pick up on this discussion below, along with your PhD.

Crispin.

Crispin Weston 01:45, 6 May 2013

I too think we are pretty much in agreement ... Agreeing the best labels to use is the main challenge ... as is the substantive point re the extent to which digital technology has changed disciplines. All interesting stuff which it is good to unpack/probe.

I will respond in 'manageable' (from point of view of my available time) chunks over the next few days ...

PeterT 23:33, 6 May 2013

Understood. We are a sad pair to write such a lot over a sunny bank holiday weekend - and probably the first sunny weekend of any kind in about 18 months.

Crispin Weston 05:09, 7 May 2013

Re Digital Literacy

Crispin said:

“Digital Literacy” seems to have been replaced by “Digital skills”, precisely to avoid any implication that anyone is talking about the stuff you are promoting: “21st century skills”, “21st century citizenship” and all of that stuff, which I have argued elsewhere is bogus. It is going to be about the feet-on-the-ground (or “very narrow” as you and NAACE have it) definitions proposed by the Royal Society.

Firstly, let's be clear that when I say Digital Literacy I am NOT talking about 21st Century Skills. Indeed I nearly included a definition of 21st Century Skills in order to avoid any confusion. I have often heard politicians, business people, and educators talking in one sentence about Computer Science and in the next sentence about communication, collaboration, team work, leadership, learning to learn, real problem solving or the like, as if these things were closely related.

I have provided my definition of Digital Literacy here, and nowhere in it does it talk about 21st Century Skills (as I would define them). Of course I agree, and am indeed arguing, that Digital Literacy is critically important in the 21st Century, but that does not mean that it equates with what most people seem to mean when they talk about 21st Century Skills (or at best it is a small subset of them).

Secondly, I agree that the Royal Society definition of Digital Literacy is narrowly defined as being about the ability to operate digital technology (what I would call 'button pushing'). I would also agree that calling 'button pushing' Digital Skills would more accurately reflect the Royal Society definition. However, I think that this is a narrower definition than the one that the group drafting the revised PoS for Computing (or ICT as it was at that time) were using (I say that with some confidence as I was part of the original drafting group).

PeterT 07:28, 5 May 2013

OK. I would be interested to see your definition of 21st century skills. It seems to me that most of the key skills that we are trying to teach are exactly the same as ever they were (see my Conclusion to the article at http://edtechnow.net/2013/01/26/industrial-revolution/).

On your 4-point definition of "digital literacy"...

1. Understanding the impact of new technologies on society, including the ways in which new technologies change disciplines (e.g. history, chemistry, English, etc)

I do not accept that new technologies are substantively changing disciplines, nor do I accept that you need to understand sociological change in order to be able to operate in society. On the contrary, there has always been a view that there is an inverse relationship between being a philosopher and being a man of the world. Finally, I think the impact of new technologies on society is highly contentious and likely, particularly when being presented to school children, to become a matter of unproven dogma. But I would be interested to see what you would expect to be taught under this item.

2. Understanding the nature of digital identities and being able to manage your digital identities appropriately

3. Being able to interact safely in a digital world (encompassing e-safety, cyber-bullying, data security, etc)

I agree with these points, which strike me as two sides of the same coin - an important matter to be covered, though relatively minor when seen from the perspective of curriculum objectives. It is broadly covered in the current curriculum by "communicate safely and respectfully online, keeping personal information private", though I would like to see this brought into KS2 in some form, instead of or as well as being in KS1.

4. Being able to locate, organize, understand, evaluate, analyze and (re)present information using digital technology (including using dynamic and procedural representations) - what you might think of as 'the creative' making and doing aspects of using digital technology (though of course many other aspects of the subject are creative too).

This is where my substantive objection lies - the concept of "Digital Creativity". You need to read the piece at http://edtechnow.net/2013/03/23/good_lord/.

I know that the drafting group came up with a definition of "digital literacy" that it thinks is broader - but I think it is a bogus definition. The Royal Society definition is IMO superior because it recognises that the requirement for digital literacy / digital skills is for the prerequisite skills ("button pushing" is a tad unfair) to enable the teaching of creativity / evaluation / analysis etc in other subjects. It is the concept of "digital creativity" that is bogus, other than in the sense of coding, which members of the drafting group often seem to characterise as an activity that is only fit for some sweat shop in Bangalore.

Crispin Weston 02:15, 6 May 2013

Crispin said:

OK. I would be interested to see your definition of 21st century skills. It seems to me that most of the key skills that we are trying to teach are exactly the same as ever they were (see my Conclusion to the article at http://edtechnow.net/2013/01/26/industrial-revolution/).

My definition of 21st Century Skills would encompass the 'soft skills' such as communication, collaboration, leadership, team work, real problem solving, learning to learn, etc..

I would totally agree that these are not new - they were all present in the Plowden Report back in 1967 before digital technology had made any impact on schools (or even much on society more generally).

PeterT 10:59, 7 May 2013
 

Crispin said:
I do not accept that new technologies are substantively changing disciplines, nor do I accept that you need to understand sociological change in order to be able to operate in society. On the contrary, there has always been a view that there is an inverse relationship between being a philosopher and being a man of the world. Finally, I think the impact of new technologies on society is highly contentious and likely, particularly when being presented to school children, to become a matter of unproven dogma. But I would be interested to see what you would expect to be taught under this item.

I don't think I was saying you need to be a philosopher - just that it is important to understand that digital technology (in a similar but different way to the internal combustion engine) has an impact on the way society operates. It is helpful to be aware of this - being aware, for example, of the ways in which the media might try to manipulate our thinking is empowering.

I think we disagree about the extent to which digital technology has impacted on and will continue to impact on society. Offshoring is one concrete example that has real impacts on people (who either become less employable or more employable depending upon whether they are in the 'home' or 'outsource' country). I suspect that as automation becomes even more prevalent we will see radical changes in society - and will need to rethink how our economy works - see my overview of The Lights in the Tunnel for more on this.

PeterT 11:06, 7 May 2013

Peter, I do not disagree that IT is having massive impacts on society - it is just that we are so much in the thick of the revolution that it is hard to make sense of it at the moment. People have long been predicting mass unemployment as a result of automation, without it ever having happened to any significant extent. As some functions are automated, people have always discovered other things that need to be done and appetites for increased consumption have proved virtually limitless. Your reference seems to me to make my point - the first line of the article is "This book explores the possibility that automation will lead to mass unemployment". It is speculative theory which I am not sure schoolchildren are well equipped to evaluate. It seems like a good topic for a sixth-form debate, not a KS3 sociology lesson.

As for the media manipulating our thinking, I do not entirely disagree but I think that there is at the same time a danger of peddling negative and what might easily become conspiratorial theories. As a History teacher, it has always struck me that we are quick to teach children about other people's propaganda, but almost completely blind to our own.

The right approach, I believe, is to try and positively equip children to think for themselves and approach all assertions with an open and skeptical mind - but this is a very tough objective and you will fail most of the time. I am with John Stuart Mill that the real threat to independent thought is not authority but the pressure to conform to social orthodoxy - a pressure that Twitter and Facebook have very substantially increased. There is perhaps another sixth-form debate along the lines of whether the Press is motivated primarily by proprietorial agendas or the popular appetites of its consumers.

My main concern is that all this is very controversial and in my experience even teachers are often not very good at thinking for themselves. This means that there is a significant danger of classrooms being used to peddle teachers' pet theories.

Crispin Weston 11:49, 7 May 2013

Again we agree about much here:

  • digital technology is having massive impacts on society - but the future is uncertain (who knows what those impacts might be in 5, 10, 20 etc years time)
  • New media provide new ways for people to communicate and influence others (you didn't say this but it's implicit in your partial agreement re media manipulation I think)
  • we want to equip children to think for themselves and approach life with an open mind
  • this is all complex and controversial stuff

Where we seem to disagree is that I think that despite the complexity and messiness we ought to be helping children understand it (as best we can and at a level that is appropriate to the maturity of the children), whereas you seem to be saying that because it's so uncertain, complex and messy we shouldn't try to teach about it (until the children are much older).

I can see where you are coming from - I just don't agree with you on this occasion.

PeterT 10:22, 8 May 2013

Peter, I am not sure we are so far apart. It is a question of the level at which you address the issues - and in a way it echoes the Gove vs. Rosen set-to on language. Rosen, it seems to me, is talking about the study of linguistics, while Gove is talking about the ability to study a version of English that will prove most useful to children. I am with Gove, not because the study of linguistics is not valid, but because I do not think it is what will be most useful to KS3 children. There is also a danger that, in teaching university level stuff at KS3, you end up with a platitudinous mush.

On the impact of computers on society, I think dealing with peer pressure on Facebook and protecting confidential information are critical issues which at the moment are only dealt with at KS1 and ought to be promoted further up the curriculum (at least to KS2, which is where - though I have not taught at this level - I would have thought children would be adopting an online persona for real).

As for "thinking for yourself", this is a much more fundamental issue, part of the core academic curriculum, led by analytical subjects such as History. The way I would see this working is that Computing would give children the prerequisite digital skills (as defined by the Royal Society) to navigate the web and manipulate web resources, and let the History teacher (perhaps with cross-curriculum support from the Computing teacher) run a "contemporary History" project, showing the similarities between the skills required to evaluate Nazi propaganda with the ability to evaluate some of the stuff that you are likely to find on the web. A "contemporary issues" discussion or debate might take this forward by questioning whether the internet revolution has brought benefit or harm to the way information is circulated and accessed.

So on the particular case of the effect of IT on society, I am all for stimulating debate and intellectual curiosity and pointing out the relevance of abstract learning to the here and now; and I am also all for helping students lead their own lives at a PSHE level. What I am against is making the computer's effect on society a hard curriculum topic, any more than we should get KS3 children writing about euthanasia, abortion and homosexual marriage. It's all good stuff, maybe for an A level ethics or sociology course - but I just think that at KS3 it will hit the platitudinous waffle trap.

On the general question of cross-curriculum activity, I am all for finding synergies. But I am against Computing hijacking creative / evaluative / analytical learning which is properly the domain of other subjects. It should rather deliver the prerequisite digital skills to enable those subjects to address their traditional subject matter, working through digital media.

Crispin Weston 12:40, 12 May 2013

In an ideal world I would agree that we should be expecting 'the <insert subject> teacher' to address the big issues in their subject, whilst the Computing teacher has ensured that the children have the necessary level of digital literacy to be able to use digital technology to help them do so. However, I think that the competences required to use digital technology go beyond the ability to operate the technology, and incorporate what in the past we might have called ICT Competence (a broad set of knowledge (i.e. the ability to apply information), understanding and skills related to digital technology). Furthermore, whilst learning digital literacy 'in situ' (e.g. across the curriculum) would be better than having it as a discrete subset of Computing, in practice at present if it isn't explicitly specified in the PoS then there is a serious danger that it will not be addressed (or at best will be addressed inconsistently by different teachers).

PeterT 02:18, 13 May 2013

Repeating what I have just written on another thread, I think the difference is between curriculum aims and pedagogy.

One of the great benefits of the new Computing curriculum, which I have been arguing for for a long time, is the disentangling of the teaching of technology and the use of technology to improve teaching. I think what we are discussing here is part of the second of these points. I don't think that technology changes the fundamentals of other subjects - but it does change the way that they can be taught and the fundamental learning contextualised.

Why do teachers not read the academic literature or research evidence? How do we stimulate a more vigorous debate (1) on pedagogy as a "design science" and (2) on ways in which technology can help that process? How do we stimulate a pull dynamic, rather than always relying on a government funded push? For me, those are the key questions.

Crispin Weston 04:00, 13 May 2013

We clearly disagree about the extent to which digital technology impacts on disciplines and should therefore impact on school subjects, whilst we agree that what I've been calling TEL shouldn't be the focus of Computing (the subject).

However I think that digital literacy (my broad definition rather than the narrow technical skills definition) is different to both of the above things and does need to be taught as a discrete subject for reasons already set out in other posts in this discussion.

It seems to me that digital literacy is more important than computer science on the basis that everybody needs to be digitally literate whilst only a minority of folk need to be computer scientists.

PeterT 02:27, 14 May 2013

Re Technology Enhanced Learning

Crispin said:

“TEL” should be avoided for reasons I have already described. If you do not want to read the whole article, my central point is illustrated by the definition you propose. You talk only of the use of digital technology, even though the subject-specific technologies that we need to transform education do not generally exist. This has been the central reason why the whole Becta experiment failed and why the sort of training programme that you were delivering through Vital was always premature (as I suggested to you when we first met, at the time that it was first being established). We should use “education technology” instead, because it covers both development and use of such technologies.

I have to admit that I don't like the term TEL. I have gone with it because it is the term that most people in the field use - though they tend to use it to refer to all aspects of the cross-curricular use of digital technology (what I have called ICT and suggested should be sub-divided into Embedded Technology and TEL).

Ignoring the label for a minute - if we agree that digital technologies afford us new pedagogical strategies/techniques then I think it is useful at the moment to have a term to refer to that. I agree that this is about pedagogy (and thus teaching) and thus TEL's focus on learning is an issue. I disagree with the point Crispin makes about subject-specific technologies (in relation to pedagogy) because I want a term that encompasses the impact of all digital technology on pedagogy, not just ones specifically designed for education. So, for example, TEL (or whatever better term emerges) encompasses the impact of the use of Tablets (a consumer device) on pedagogy. At least one school that we have collected data in is specifically concerned with using Tablets because they are a consumer device and not part of 'school technology' - and the impact that is having on pedagogy is important.

PeterT 07:37, 5 May 2013

I am interested that you agree about the need to focus on teaching and not just on learning. It is good to have some company in an unfashionable position.

Peter wrote: "I want a term that encompasses the impact of all digital technology on pedagogy, not just ones specifically designed for education".

My point is that it is not so much the effect of technology on pedagogy, but the effect of pedagogy on technology that should be concerning us very much more than it has in the past. Our neglect of this point is why we have got ourselves into a position where, as Diana Laurillard puts it, "what education has done has been to appropriate everybody else’s technologies for all the different facets that we need in the teaching and learning transaction" - see my post at http://edtechnow.net/2012/01/25/aristotles-saddle-maker/.

It also reflects a misunderstanding of what technology *is*. It is not a commodity, a large jar of peanut butter to be bought at the supermarket and spread over everything. It is an opportunity to innovate - and one that we have not taken. I have only had time to skim your PhD - but I think that I would make the same criticism of what I see there. e.g.

<<Twining’s (1999) critique of Laurillard’s (1996) Media Mix Model illustrates that it, like all software frameworks, suffers from the problem of technological determinism. Software and other technologies have what Laurillard, Stratfold, Luckin, Plowman and Taylor (1999) describe as affordances, that is they lend themselves to being used in certain ways. However, that does not preclude them from being used in other ways, which were not intended or anticipated by their designers. (p 356)>>

This reminds me of the Microsoft advert of a technician using a knife as a screwdriver, captioned something like "use the right tool for the job". It is not a virtue to have to bodge in this way, but a symptom of a failure to innovate.

And the whole purpose of the PhD is to create a framework along the lines of another which you describe as "consisting of seven dimensions, which they state can be used to describe progress in implementing/embedding ICT within schools (p356)". This falls into the trap of regarding technology as a "given" that it is for teachers to apply to education, rather than fostering a supply chain in which the adaptation of technology to education can be led by industry, responding to teacher demand.

<<Cloke (2000) argued that even though there has been “extensive research into the use of ICT in schools, relatively little research has focused on the key pedagogical issues.” (p.1). This suggests a continued focus on technological issues, despite calls “to expand our concerns to include pedagogical, as well as equipment problems (p354)>>

Again, reinforcing the unhelpful dichotomy, "never mind the technology, what about the learning". The point is not that we should ignore the technology but that we should make sure that the technology *serves* the pedagogy. And that means developing education-specific technologies (normally software) on top of more generic infrastructures - see my http://edtechnow.net/2012/01/25/aristotles-saddle-maker/ on this point.

Crispin.

Crispin Weston 02:34, 6 May 2013

Again I think we are pretty much in agreement - I totally buy the idea that we should not be content with simply making do with technology developed for other audiences/purposes - we should have some technology designed specifically for education, and it would make total sense for that to be called 'educational technology'.

I also agree that the technology should not be the driver of pedagogy (or learning).

However, I don't see that these arguments change the fact that digital technology (whether or not it was designed with education in mind) can and does change our pedagogical possibilities. I think we need a term to refer to that, because having terms helps people focus on things which they might otherwise not notice - and, simply because the term is well established, I have gone with using TEL.

PeterT 10:30, 8 May 2013

Peter, I agree on the importance of terminology and I can also see the need for a term to represent the effect of technology (education-specific or not) on pedagogy. I just think TEL is not a satisfactory term.

I guess that what this boils down to is that I do not think that non-education-specific technology *has* had much of an effect on learning in formal educational environments.

The internet and other generic software is great for:

  • disseminating information;
  • social networking;
  • games and simulations;
  • creative tools.

The problems have been that formal education is not about accumulating information (bullet 1); and that children are not motivated to use generic social networking tools for learning (it can be done, I agree, but I have yet to see an example that has struck me as being worth the effort) - bullet 2. There is, I believe, great potential for education-specific social networking environments - but they have not been developed yet. Similarly with games and simulations - generic commercial games are (pace Graham Brown-Martin and Ian Livingstone) next to useless and the education-specific, serious games have not yet been developed - bullet 3. As for creative tools, word processing software is probably the most useful contribution that education technology has made to formal education to date. Again, there is great potential for education-specific tools but so far they have not been developed (my own subject is History: why has no-one developed a timeline editor or causal-mapping software?) - bullet 4.

I would go further than this and say that the internet has broadly had a negative effect on education. Its most common use is for "internet research" which in reality involves cut-and-paste plagiarism, with lazy teachers failing to understand that research is not about the accumulation but the processing of information. Desktop publishing and other presentational software has also, under the guise of making students feel good about themselves, led to a massive waste of time in school. Yet "TEL" implies that technology always represents an enhancement.

So my analysis is that the whole project has failed due to the lack of education-specific technology that is supportive of good pedagogy - yet most of the TEL community do not even recognise the absence of education-specific technology - it is the elephant in the room - and the TEL acronym allows them to continue not to see it.

Another problem is the poor liaison between academic educationalists and teachers: the contrast with doctors, who are held responsible for keeping up to date with research journals, could not be starker. Again, by focusing on "learning" rather than "teaching", TEL helps perpetuate the view that the teacher has an incidental role in education, merely as facilitator.

Instead of leading teachers to expect that you just stir in two teaspoons of technology and out comes enhanced learning, we need to focus on the need for good teaching.

So while I recognised at the top of this comment the need for another acronym with the definition that you propose, I would suggest "digital pedagogy" rather than "TEL", to cover the practitioner's contribution to the party, as a companion to "education technology" which focuses on suppliers' contribution. My post today at http://edtechnow.net/2013/05/12/pedagogy/ explains what I understand "pedagogy" to mean in practice.

As a final aside, I was always amused by the foreword to Becta's first Harnessing Technology report, in which Charles Clarke encouraged us all to "embrace the new pedagogues". I imagined a wave of government-approved sexual harassment breaking out in schools up and down the country. But notwithstanding the malapropism, I thought the basic intention was a good one.

Crispin Weston 13:49, 12 May 2013

I would be happy with having a different acronym to TEL, but don't think digital pedagogy works because it foregrounds the digital (the pedagogy is digital). This foregrounding of the technology within TEL is less problematic, because the technology is only enhancing the learning. However, I'd be very happy if we came up with a better term ...

I don't buy the argument that because technology hasn't been used well by teachers, technology (unless designed specifically for education) can't impact on pedagogy. I have seen many instances of digital technology which were not designed for education being used very effectively in ways that change the pedagogy (though I wouldn't disagree that much or even most of the use of technology we see in schools isn't making much difference to the pedagogy - or is making it worse). However, I'm happy to disagree on this one. :O)

PeterT 02:25, 13 May 2013

Just to reiterate my argument, I think the failure to foreground the technology is part of the problem because technology is not a given and we need mechanisms which ensure that teachers get their hands on the right technology.

But I guess we've probably run our course on this one, at least at an abstract level, and to go any further our remaining disagreements probably need to be discussed in relation to particular concrete issues. Perhaps we may have a chance to do that some time soon.

Thanks for the discussion.

Crispin Weston 03:45, 13 May 2013

One thing I would add ...

I once led a TLTP3 project called SoURCE (Software use, re-use and customisation in education) which aimed to explore the extent to which you could embed good pedagogical design within software. One of the things we found out was that even where technology (software in our case) had been specifically designed for pedagogical purposes it could be (and was) used in ways which totally undermined the embedded pedagogical model.

The point being that it is not the technology that is the issue, it is how it is used.

So, whilst I agree that it would be good if there were more educationally focussed digital technology (ie technology designed specifically for education), this will not solve the problem of pedagogically sound use of technology ...

PeterT 01:42, 14 May 2013

I agree that software might be used in ways not originally intended - but I do not see that it follows that educationally-focused software "will not solve the problem of pedagogically sound use of technology". Just because you can have a tin which says "just add pedagogy", does not mean that it might not be better to have a tin which says "comes with pedagogy inside". And the fact that you *can* bodge stuff together does not mean that you *should*. And if you find the need to undermine the embedded pedagogy of the original software, does this mean that the original software was not well designed or was inappropriate to the intended purpose?

It partly depends on the sort of software you are talking about - a point that I addressed in my post "Aristotle's saddle maker", at http://edtechnow.net/2012/01/25/aristotles-saddle-maker/. Some software is very generic, some is very application-specific. You could not, for example, use a system designed to handle point-of-sale transactions for doing anything much other than handling point-of-sale transactions. Whereas a word processor or web browser can be used for all sorts of things. In an educational context, it is the application-specific software that is lacking: at a systems level, assignment managers, common markbooks, e-portfolios, learning analytics; at the instructional level, subject-specific creative tools, serious games etc.

Part of the problem with ed-tech is (a) most teachers are not technologically confident, and (b) most teachers are not even very pedagogically confident - they do not read the research literature and I doubt whether there is even general agreement about what pedagogy *means* (see my most recent post - more of a draft than a finished piece at present - on "Five principles of pedagogy" at http://edtechnow.net/2013/05/12/pedagogy/). So what has happened is that ed-tech has proceeded as a kind of local boy scout modelling club - lots of string and sellotape and enthusiasm which *hasn't* really been very infectious. What I am arguing is that we need the education-specific software that works out of the box, puts good pedagogy in the classroom and does not depend on the local teacher to reinvent another bodged-up wheel.

Can you point to a write-up of the SoURCE project which you are referring to?

Crispin Weston 02:57, 14 May 2013

Both the SoURCE website (www.source.ac.uk) and the OU's Knowledge Network (where many of the outputs from SoURCE are located) are currently unavailable. Check out [[1]] and search for SoURCE ...

Our experience was that even with software specifically designed to embed pedagogy, and where we explained how to use the software to teachers (actually lecturers in HE), they often undermined the pedagogical model.

PeterT 07:43, 25 May 2013

Hi Peter, Sorry for slow reply - only just found this comment.

It is a little hard to respond in detail, (a) because of the scarcity of project outputs and (b) because of the amount of time it would take. However, working from the SoURCE overview at http://kn.open.ac.uk/public/getfile.cfm?documentfileid=2220, I would make the following comments.

1. Customising content was the objective of the project - so it is not very surprising that it found that it *could* do what it set out to do. It does not sound as if the project made much attempt to assess the extent to which such customisation was found to be necessary or desirable by teachers and lecturers.

2. The extent to which software needs to be customised will depend to some extent on the quality of the software - so an assessment of the significance of the project outcomes will need to start with an assessment of the quality of the software being used, and how well it met its original requirements.

3. That said, I think the principle of adaptability is a very important one. Let me propose a difference between "customisation" and "adaptability" on the basis that the first incorporates a subversion of the original intention of the software and the second does not, but represents the application of the software to a variety of different contexts. So your paper quoted above says:

"Thus, for example, you can customise the Elicitation Engine by changing the artefacts that it is manipulating and/or by using it as a reflective tool for students, or as an assessment tool for staff to identify students’ misconceptions".

I haven't worked out what the "Elicitation Engine" is yet - but it seems clear to me that the two ways in which you are changing the application of the software represent my "adaptability" and not a subversive "customisation". "Changing the artefacts that it is manipulating" represents the application of the *same* encapsulated pedagogy to a different subject area - this strikes me as an essential feature of any pedagogy-encapsulating software. Second, the use of the Elicitation Engine as either a "reflective tool for students or as an assessment tool for staff" boils down to different ways of using the tool and its outcome data in a wider ecosystem. This too is an essential characteristic of a digital ecosystem built on open interoperability standards, that the student and teacher can play lego with their software components, modelling different pedagogical processes at the macro scale, by different combinations of pedagogy delivered by different software applications at the micro scale.

In short, neither of these examples seems to me to represent the customisation of encapsulated pedagogy in a way that is subversive of the original intention of the software.

One further point. Customisation by tweaking program code or using software for a purpose that was not intended is likely to be difficult and cause confusion - particularly when deployed in a class of 30 who are bound to find any flaws in the software that exist. The examples that I quote from your paper illustrate the two principles of *adaptability* which I think are essential:

1. adaptability by parameterised launch (in this case, providing a different list of resources) with parameters being specified in user-friendly interfaces;

2. adaptability by different selection and combination, with these "sequences" and other aggregations of content being created in easy to use, drag-and-drop authoring tools.

Neither of these principles undermines - but rather enhances - the value of software that encapsulates pedagogy.
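
To make the distinction concrete, here is a minimal, hypothetical sketch in Python (it is not based on the actual SoURCE or Elicitation Engine code, and all the names in it are invented) of what "adaptability by parameterised launch" might look like: the artefacts and the mode of use are supplied as launch parameters, while the pedagogical rule - that the categories must come from the students - stays encapsulated in the software.

    # Hypothetical illustration only: a card-sorting activity whose pedagogy
    # (students propose their own categories) is fixed in code, while the
    # content and the mode of use are launch parameters.
    from dataclasses import dataclass, field

    @dataclass
    class SortingActivity:
        artefacts: list[str]                    # parameter: the content to be sorted
        mode: str = "student_reflection"        # parameter: "student_reflection" or "staff_assessment"
        categories: list[str] = field(default_factory=list)

        def propose_category(self, name: str) -> None:
            # Encapsulated pedagogy: categories are elicited from the students,
            # not handed down by the teacher.
            self.categories.append(name)

        def sort(self, artefact: str, category: str) -> tuple[str, str]:
            if category not in self.categories:
                raise ValueError("categories must be proposed by the students first")
            return (artefact, category)

    # Adapting the activity to a different subject is just a different
    # parameterised launch; the embedded pedagogical model is untouched.
    history_sort = SortingActivity(artefacts=["Treaty of Versailles", "Wall Street Crash"])
    music_sort = SortingActivity(artefacts=["sonata", "fugue"], mode="staff_assessment")

Customisation in the subversive sense would mean bypassing propose_category so that the teacher supplies the categories up front - which is precisely the kind of change that undermines the encapsulated pedagogy.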

Crispin.

Crispin Weston 02:03, 4 June 2013

You are quite correct Crispin that customisation within SoURCE did NOT involve changing the pedagogy embedded within the software. The ability to customise the software for use in different contexts (e.g. with different artefacts for 'sorting') was part of the software design. This was not where the problems occurred.

The problems were when someone came to implement the use of an instantiation of software that had already been customised. For example: Elicitation Engine (EE) Shell - customised by adding some artefacts => An instantiation of the EE. This then gets used in some teaching context. It is at this point that the pedagogy is undermined - for example by the 'teacher' telling the students what categories to use to sort the objects rather than expecting them to come up with their own categories.

This sort of undermining of the pedagogy that was designed into the software occurred frequently in our experience, even when the teachers had engaged in professional development that was intended to help them understand how the software was designed to be used. They used it in ways that fitted with their existing pedagogical practice by and large - rather than the pedagogical practices designed into the software.

PeterT 00:19, 17 June 2013

Re Embedded technology

Crispin said:

I disagree with your use of this term for three reasons.
1. As already argued, we “embed” things when they have a different purpose to the surrounding environment (a reporter in an army unit, for example) but we do not embed things which serve the main purpose of the surrounding environment. We do not “embed” tables and chairs in classrooms and neither should we “embed” weather stations in Geography lessons or MIDI instruments in music lessons. I think you understand this really: the problem is not that you use the term “embed” incorrectly but that your justification hides your true purpose. You really want to introduce different aims into other subjects under the pretence that the subject has changed in ways that you understand better than native subject experts. Your use of this term therefore signals your desire to act the cuckoo, laying your agenda in everyone else’s nests.
2. In my view, you have grossly over-stated the claim that technology is changing the nature of the rest of the curriculum. When you look at the substance of the argument that you have made, it boils down to the fact that the English curriculum should refer to “texts” rather than “books”. I completely agree with that point. But the fact that literature is delivered through a new medium does not change the essence of the subject one iota. Even Computer Science itself, according to its chief advocate the Royal Society “has longevity (most of the ideas and concepts that were current 50 or more years ago are still applicable today), and every core principle can be taught or illustrated without relying on the use of a specific technology”. Computers have not even changed the discipline of Computing and I see no evidence that they have changed the fundamentals of any other subject either.
3. The term is redundant. We have never required an abstract noun to refer to IWBs, Bunsen burners and poly-gyms – why do we need such a term now, other than to smuggle in your very contentious theory that all these subjects have been changed by technology?

I basically agree with the first point - Embedded Technology (as I define it) is about something that is integral to the discipline and thus becomes invisible, it's not an add-on. I think Crispin misreads my motivation - but we will come to that in a moment.

I agree that the subjects taught in schools have not been fundamentally changed by digital technology (Crispin's item 2). However, I believe that the disciplines outside schools have been transformed by digital technology - the things that people in the 'real world' (meaning the world outside school) do are different because of digital technology: the sorts of questions they can ask and the ways in which they can try to answer them and represent their answers are radically different. I think that this is really problematic - the subjects taught in schools should bear some relationship to the real world disciplines. Thus my motivation, and the reason why I think we need the term Embedded Technology, is to flag up the ways in which digital technologies have changed the nature of disciplines and should therefore impact on the school curriculum.

The specific point about putting 'text' rather than 'book' in the English PoS reflects a desire to operate in the real world - if one suggested more radical changes to the draft PoS then one would be totally ignored - so it's about trying to make changes which appear trivial but at least open up possibilities and might have some chance of being implemented.

I find the claim that 'computers have not even changed the discipline of Computing' difficult to grasp - firstly, I think the discipline is called Computer Science (at least that is what the BCS, CAS and RAEng have been calling it). Secondly (and notwithstanding the first point), if it weren't for computers there wouldn't be a discipline called Computing ...

In response to comment 3 that the term is redundant - if it were the case that school curricula reflected their related disciplines then I would agree with you. However, until that is the case there is a need for a term in order to highlight the issue and give us a way to talk about it meaningfully.

PeterT 07:50, 5 May 2013

Peter,

Maybe I was sloppy in talking about "Computing" rather than "Computer Science" in reference to the thing that hasn't changed. The latter is an essentially logical process and is prior to computers themselves. You could study it, for example, using a hypothetical Turing Machine. It is basically Maths and logic and that does not change. It is not me who says that, it is the Royal Society and they should know.

While I support you on "texts", I do not see how this affects the essential subject, which is either about literature (e.g. D H Lawrence's approach to industrialisation) or language (the use of devices such as metaphor and assonance, different rhetorical registers etc) - none of this is affected by whether you are reading from paper or a tablet.

I accept that the nature of *work* may have changed but this does not mean that subjects have changed - except when you move towards vocational training. But perhaps you could give some examples of how you think technology has changed subject disciplines.

I will accept (what may be different sides of the same coin) that:

(a) computers have changed the ways in which abstract knowledge is applied, creating new ways of working; (b) computers have created new disciplines like computer modelling; (c) computers may have changed what we think is worth learning; (d) computers may have enabled us to discover new things - they have provided new sources of evidence...

...but I maintain that all of these changes are in fact fairly peripheral. The core knowledge of Maths and English, History and Science, problem solving and teamwork, analytical thought and creative endeavour, have hardly changed at all.

Perhaps you could give some examples of how you think computers *have* fundamentally changed subject disciplines?

Crispin Weston 12:11, 7 May 2013

Hmm - I struggle with the notion that the discipline of Computer Science exists separate from computers today. I recognise that the logical processes that underpin Computer Science and some of the early (mechanical) computational devices pre-dated computers, but at some point the discipline presumably has expanded beyond the underlying maths and logic as a result of digital technology. Isn't that so? Could you teach Computer Science without reference to computers today?

PeterT 10:36, 8 May 2013
 

I am intrigued by your argument that the changes that digital technology have made are "peripheral" to the core of disciplines.

It seems to me to parallel the debate about whether digital technology has changed how we think and/or learn. In that debate I tend towards saying that digital technology hasn't fundamentally changed how we think or learn - it may be that some of our 'cognitive muscles' have become flabby from lack of use, and other 'cognitive muscles' have got stronger. However, there may come a point, and this is where I struggle, at which the unused 'cognitive muscles' atrophy, and when that happens then something has changed (cos those 'cognitive muscles' no longer exist, they can't be revitalised by future use).

I do wonder what the point of a discipline is - and I guess I would argue that it is about a way of seeing and engaging with the world. If that is the case, and if, as you have agreed, digital technology has changed the ways in which we apply knowledge and work, what we think is worth learning, and the sources of evidence available to us (and has allowed us to ask and answer new questions), then hasn't the discipline changed (even if some of the fundamental 'knowledge' has remained constant)?

I'd love to hear from other people who are experts in other disciplines on their views about whether or not digital technology has changed their discipline. My strong feeling is that it has, whilst agreeing that some fundamental principles (e.g. 1+1=2) have not changed.

PeterT 10:49, 8 May 2013

I think we are very close on this one - and I agree that the general discussion is an interesting one. I think it also parallels an educational discussion about the point at which you move to vocational courses. If it is the application of subject knowledge rather than its fundamental principles that have changed, then too much emphasis on technological application may represent a premature move to vocational training.

At the same time, I think that abstract principle needs to be contextualised in a variety of challenging and compelling ways. E.g. what maths was used to work out the mass of the Siberian meteorite? My only quibble is that this is a function of pedagogy and not curriculum (i.e. top level learning objectives).

I echo your call for more contributions to the discussion!

Crispin Weston 03:52, 13 May 2013

Re ICT

Crispin said:

If we do not need “embedded technology”, then neither do we need “ICT” as a new umbrella term. Even laying aside all the (so far unanswered) arguments that the Royal Society, the DfE and I have put against this term, its use in the context that you are suggesting would be highly misleading, considering that it has always been used up to now (by yourself and everyone else) to refer to the National Curriculum subject.

As argued previously, I think we do need to be able to talk about the ways in which digital technology is used across the curriculum, and to distinguish between the use that is integral to the subject content (or should be if the subject relates to the real world discipline) and the use that is about teaching strategies/techniques. So having three terms is useful: one relating to the impact of digital technology on the subject/curriculum (what I have called Embedded Technology), one relating to the impact on pedagogy (what I have called TEL) and a collective term to cover both (which I have called ICT).

The fact that ICT has historically been used to cover the cross-curricular use of digital technology, and is still used in most of Europe and places like Australia, in part explains why I think it is foolish to jettison it completely.

PeterT 08:00, 5 May 2013

I do not think you have dealt with my challenge to your assumption that technology has changed core subject disciplines. A music teacher may wish to use MIDI software as a means of teaching music - but the aim of the subject is not about learning about MIDI software, but rather about understanding the nature of music.

The call for subjects to "reflect real world disciplines" (a little like the call for "authentic" learning) strikes me as risking bringing ephemeral, vocational training into the classroom, and failing to address the abstract transferable understandings that will equip children to navigate the rapidly evolving technological landscape. These are good teaching techniques but they do not impact curriculum aims.

Crispin Weston 02:41, 6 May 2013

I'd refer you back to one of my other responses.

PeterT 10:50, 8 May 2013

Re Conclusion

Crispin said:

So the taxonomy that I think we are heading towards is somewhat different to that which you suggest. I suggest the following:
  • Computing
    • Computer Science
    • IT
    • Digital skills
  • Education technology

We really aren't that far apart ... I think that Digital Skills implies a narrower definition than I think we need - and whilst it more accurately reflects how the Royal Society defined Digital Literacy, it is not the term that they used.

We seem to be agreed that there needs to be a term to refer to the cross-curricular use of digital technologies - I am reluctant to introduce yet another new term (such as Educational Technology) though if you have been following my bliki you will have noticed that I did use that term previously when ICT was being used to mean the subject. Now that ICT doesn't mean the subject I think it is better to stick with it meaning the full range of cross-curricular use (because that is what many people have meant by it in the past). I also think it is useful to be able to distinguish between different facets of cross-curricular use - in terms of the impact of digital technology on the curriculum/subject content and on pedagogy.

PeterT 08:25, 5 May 2013

I agree that the Royal Society did not use "digital skills" - but as described in my blog post, the DfE *is* now showing signs that it will use this phrase to refer to what the Royal Society called "digital literacy". It is a way, I suggest, of drawing some clear blue definitional water with what you call "digital literacy".

The choosing of terms is, as I said at the top, a matter largely of marketing - and continuity of usage is very important. If you decree that "pig" should be used to refer to what everyone up to now has called "cow", then you are going to cause a lot of confusion. "TEL" is very little used in schools, being mainly used in HE. "Education technology" is widely used in the US, where "TEL" is virtually unheard of - and "education technology", if you take my definition of it, is going to be an international market, dominated (if current anti-market, anti-innovation attitudes in the UK persist) by US suppliers. So we might as well get used to it.

And the main reason why we have got rid of "ICT" is that its meaning is poorly defined, a problem that would be compounded by continuing to use it for a new meaning. I think the DfE's footwork in this respect gives us an object lesson - if a term is tainted or confusing, drop it and use another, do not fight for it - it is a waste of time.

I do not define "education technology" as having anything to do with the "cross-curricular use of technology". This makes the same mistake as the term "embedded technology" - it suggests that education technology is something to do with the curriculum - and this lies at the heart of the confusion. It is nothing to do with the curriculum. You do not talk about "embedded tables and chairs" and you do not talk about "cross curriculum tables and chairs".

Crispin.

Crispin Weston 02:53, 6 May 2013

Much of this has been responded to in other posts so not reiterating those points here.

Re ICT - what I am proposing is not giving ICT a new meaning. ICT has in the past been used to mean three things (the subject, the cross-curricular use of digital technology, and the digital technology itself). What I am proposing is that we focus the definition of ICT so that it only means one of these things - namely the cross-curricular use of digital technology. So far from creating confusion, it should help to create clarity.

PeterT 10:55, 8 May 2013

But I think the conflation is deliberate - it is part of the "ICT" brand that you improve learning by teaching a set of digital learning skills which supports independent and peer-mediated learning, de-prioritising the "knowledge-based curriculum". And the evidence is that that is not true or helpful.

Crispin Weston 05:51, 13 May 2013

In which case you presumably are in favour of de-conflating ...

I think we also need to move away from a dichotomy between 'knowledge' and 'skills' - neither can operate in the absence of the other. Knowledge, as I understand it, is the application of information. Skills have to be applied to something.

PeterT 07:39, 25 May 2013

Re Digital technology

Crispin said:

As for the hardware and software — what you call “Digital technology” — I suspect that we will end up with a mish-mash of terms like “subject-specific technology”, “IT infrastructure”, “the network” etc. The danger, which I am not sure how to avoid, is that we use “IT” to refer both to the hardware and software being used, and on the other hand to the subject. For that reason, I think the use of “digital technology” is not a bad idea. But I am not convinced it will ever catch on. Nor do I think that the lack of a formal taxonomy in this respect will do much harm, mainly because “IT” as a curriculum subject is itself an umbrella term, which I suspect will be little used. In practice, people will be studying “digital art”, “digital games creation”, “network administration”, “database architecture”, “web design” etc.

I fear you are right - but would like to avoid that, which is exactly why I have suggested the term. Whether or not it catches on depends upon whether enough of us use it consistently ...

PeterT 08:27, 5 May 2013