On a number of occasions recently, I have been struck by the basic ignorance of many professional medievalists concerning developments in computing that affect their daily lives.
By this I don't mean the latest revisions of TEI P5 and how these will affect the markup of unclear textual variants. Rather I mean much more basic issues, such as what XML is and how it differs from HTML or Word, or the relationship between fonts and characters.
This has been striking to me because it is affecting basic aspects of these medievalists' work. You don't have to be a 'computer person' to benefit from Unicode, for example, and I have been surprised how many often quite complicated projects seem to be completely ignorant of fundamental aspects of humanities computing that would greatly simplify their work.
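To make the Unicode point concrete, here is a minimal sketch (in Python, purely for illustration; no project on this list is implied) of the kind of conceptual issue that trips up even sophisticated projects: the same accented letter can be encoded as one code point or as two, and naive searching or collation fails until the text is normalized.

```python
import unicodedata

# The same letter 'é' can be encoded in two ways:
precomposed = "\u00E9"   # one code point: LATIN SMALL LETTER E WITH ACUTE
decomposed = "e\u0301"   # 'e' followed by COMBINING ACUTE ACCENT

# They display identically, but naive string comparison treats them as
# different -- so a search for one form will silently miss the other.
print(precomposed == decomposed)  # False

# Unicode normalization (here to NFC, the composed form) resolves this.
print(unicodedata.normalize("NFC", decomposed) == precomposed)  # True
```

Nothing here requires being a 'computer person': the point is simply that a conceptual grasp of code points, combining characters, and normalization forms matters even to scholars who never write a line of code.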
So my question is this: should there be a basic humanities computing course required of beginning graduate students, similar to the research methods course so many universities require or recommend? What should such a course contain? Looking around on the internet, I tend to see two types: very practical courses that concentrate on current software and techniques (e.g. Macromedia, Excel, Word, etc.), and quite advanced courses that focus on the minutiae of building standards-compliant editions. I'm wondering whether there is not a need for a more conceptual approach: what is Unicode? what is XML? structural vs. display markup? how does the web work? database design?
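For what it's worth, the structural vs. display distinction can be illustrated in a few lines of markup. The fragment below is invented for illustration (the element names are loosely modelled on TEI conventions, not taken from any real edition):

```xml
<!-- Display markup: records how the text should look -->
<p>The poem known as <i>Beowulf</i> survives in a single manuscript.</p>

<!-- Structural markup: records what the text *is*; how titles are
     rendered (italics, quotation marks, etc.) is decided separately,
     for example by a stylesheet -->
<p>The poem known as <title>Beowulf</title> survives in a single
manuscript.</p>
```

The second form is what makes searching, indexing, and re-publication possible: a machine can find every work title in the corpus, something italics alone can never guarantee.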
I realise these are not topics that speak to many graduate students in our field--who are, after all, presumably interested in their disciplinary subject. But given how many projects now include an electronic component, it seems to me that we may be rapidly approaching the time when a medievalist should be expected to have a basic conceptual grasp of contemporary humanities computing.
What do others think? I'm thinking of asking the same question on Medtext and, phrased more generally, on Humanist. But before I do, I thought I'd ask here. I suspect the issue is more important for medievalists and classicists than it is for scholars in more modern periods: more of us seem to work more closely with things that can be digitised. Or is that discipline prejudice?
-d
-- Daniel Paul O'Donnell Director, Digital Medievalist Project (http://www.digitalmedievalist.org/) Associate Professor of English University of Lethbridge Lethbridge AB T1K 3M4 Tel. +1 (403) 329-2377 Fax. +1 (403) 382-7191
Brief reactions...
Quoting Dan O'Donnell daniel.odonnell@uleth.ca:
What do others think? I'm thinking of asking the same question on Medtext and, phrased more generally, on Humanist. But before I do, I thought I'd ask here. I suspect the issue is more important for medievalists and classicists than it is for scholars in more modern periods: more of us seem to work more closely with things that can be digitised. Or is that discipline prejudice?
No, I think that's true. We ancient and medieval types have probably also been involved with technology for longer than, say, modern historians, literary scholars, or philosophers, for the very same reason. Paradoxically, we also have more than our fair share of Luddites (meant in the most respectful way possible--in the sense of traditional philologists who believe that using the Thesaurus Linguae Graecae database to look up a word, or the Perseus digitised Lewis and Short dictionary to gloss a meaning, would weaken them, cause them to lose their essential mental powers, and leave them no better than the students they scorn for doing so). These Luddites are not necessarily older-generation scholars, by any means: I know plenty of young scholars from Oxford or Harvard who cast themselves in precisely that mould, partly to emulate their mentors, no doubt, and partly to show off that they _do_ have these traditional skills. Much like the peacock's impractical tail, I suppose, which shows that the male bird is strong enough to survive with an inconvenient physical burden.
Ahem--random weekend digressions aside--I agree that it is very important that graduate students be exposed to the technologies that help to shape the development of their disciplines. Should this teaching be delivered by, and as part of, medieval/classics courses? Surely. Do medieval and classics departments have the resources and experience to teach such courses? I'm not sure. If not, are there enough good Humanities Computing centres to take on the responsibility? It is essential that we have this discussion, and I thank Dan for opening it and for taking it to other fora.
(Not so brief after all...)
Best,
On Sat, 20 Aug 2005, Dan O'Donnell wrote:
I'm wondering if there perhaps is not a need for a more conceptual approach: what is Unicode? what is XML? structural vs. display markup? how the web works. database design.
As a graduate student who is both a medievalist and a technorhetorician (I'm even on the CCCC Committee on Computers in Composition and Communication), I would say yes: all humanities graduate students need an introduction to humanities computing, and I've long thought that the conceptual approach you suggest is a good way to start.
As we all know, digital tools are both changing the way we work in the humanities and creating new methods, theories, practices, and opportunities. I'd argue--indeed, I have increasingly been arguing, both locally and in other forums--that for a graduate program to ignore humanities computing and the growing role of digital technologies and digital culture is quickly shifting from a matter of not being on the cutting edge to a matter of negligence. In other words, it is rapidly ceasing to be a question of humanities computing versus the disciplinary subject; humanities computing is becoming one of the various methods and practices of engaging the disciplinary subject.
But I'd go farther too (this is the technorhetorician and the Ongian in me). It shouldn't just be about digitizing material, but also about the production and consumption of natively digital texts, an understanding of digital culture, digital noetics and practices, and the logic of new media. In other words, not just how to digitize primary sources, but how digital technologies can change the way we do scholarship.
For instance, how can the logic of new media--the cut-up, the mix and remix, juxtaposition, association and linkage, to name a few--change the ways we make arguments, explore our subjects, and share and preserve information? In what ways might the mediated experience of a virtual recreation of an archeological dig change the way archeology is done? (For one, would the added financial and physical constraints of creating a real-time virtual reality model of a dig be outweighed by the possibility of future archeologists, or the original ones, re-exploring it in much the same way architects "walk" through virtual reality models of buildings?) Or how does our understanding of digital culture help us rethink our understanding of past cultural processes, a la orality and literacy studies and book history? Or, for that matter, how can our understanding of earlier cultural processes help us understand digital ones (see, for instance, John Miles Foley's Pathways Project, or the work being done in textual and bibliographic studies)?
John
John Walter | walterj@slu.edu Ph.D. Candidate, Department of English Walter J. Ong Collection Archivist, Pius XII Memorial Library Saint Louis University
Dear Dan and other Digital Medievalists, The Medieval Academy's Committee on Electronic Resources is concerned with this very issue. We feel that, at a minimum, graduate students need an introduction to humanities computing. We would like to promote study at an even deeper level through a course concentration at the Master's level. We recognize the importance of computing skills as a basic research method. We are currently working with CARA (Committee on Centers and Regional Associations) to survey their membership and determine the current state of the humanities computing curriculum in medieval studies programs. We hope to create a directory highlighting those programs that have some computing component, and describing the nature of those programs. We also hope to make recommendations based upon this survey to effect its inclusion in the medieval studies curriculum. thanks, Patti
Patricia Kosco Cossard, M.A., M.L.S. until Sept. 1, 2005 Resident Fellow at the Maryland Institute for Technology in the Humanities 405-8506 Subject Librarian for Architecture and Historic Preservation University of Maryland Libraries College Park, MD 20742 (301) 405-6316 office (301) 314-9583 fax pcossard@umd.edu
Digital Medievalist Project Homepage: http://www.digitalmedievalist.org Journal (Spring 2005-): http://www.digitalmedievalist.org/journal.cfm RSS (announcements) server: http://www.digitalmedievalist.org/rss/rss2.cfm Wiki: http://sql.uleth.ca/dmorgwiki/index.php Change membership options: http://listserv.uleth.ca/mailman/listinfo/dm-l Submit RSS announcement: http://www.digitalmedievalist.org/newitem.cfm Contact editorial Board: digitalmedievalist@uleth.ca dm-l mailing list dm-l@uleth.ca http://listserv.uleth.ca/mailman/listinfo/dm-l
There is a gap in the humanities, it seems, because not enough students understand technology in general. I remember from my undergraduate years that humanities computing was just something the 'techies' in the department knew. There wasn't a formal course, apart from an advanced rhetoric and communication course, which I am grateful I took--though it existed largely because my department (the Department of English at Washington State University) had just received new Macs and wanted to show them off.
Anyway, now that I am at the University of Manchester I see that there is an enormous gap that needs to be filled. Some of the students here cannot even copy a file correctly or do simple tasks in Word. We do have some courses that enable some students to understand some basic concepts, but if you ask them what HTML is or how to write a web page there will be question marks written all over their faces.
Therefore, we need to differentiate what we mean by humanities computing for those in the US and Canada and those in the UK; believe me, it's like night and day. In graduate programmes in the UK there needs to be more emphasis on 'required', rather than 'optional', courses in general computer usage before students move on to other, more advanced skills and technologies.
The emphasis in most humanities faculties in the UK seems to be on interdisciplinary work--for us, that means working across history, English, archaeology, and so on. However, I do not see faculties in the near future spending much money on requiring students to become more computer literate; they will leave it to the students to find a way. On the other hand, I am glad to see that institutions in the US and Canada are at least opening the eyes of faculties and students to a necessary field.
Sorry for all the babble!
Abdullah Alger
Abdullah Alger wrote:
Anyway, now that I am at the University of Manchester I see that there is an enormous gap that needs to be filled. Some of the students here cannot even copy a file correctly or do simple tasks in Word. We do have some courses that enable some students to understand some basic concepts, but if you ask them what HTML is or how to write a web page there will be question marks written all over their faces.
Is it really the job of humanities departments to provide this education? These tasks are basic computer literacy, which has little to do with 'Humanities Computing' per se. Almost every university in the UK has some provision for IT training should students require it. Even at the University of Manchester, where you are, students can take courses via IT Services: http://www.itservices.manchester.ac.uk/trainingcourses/
Surely what you should be doing on any course which requires basic IT skills (or heaven forfend more advanced skills!) is stating this as a prerequisite and directing students to the IT Services to receive training in these skills.
If teaching a course (say on use of XML for producing critical editions or general text encoding) I don't think that we should feel any compunction in listing basic familiarity with HTML as a prerequisite, as long as students have access to training in that area. (I'm not suggesting you say they must have taken specific course X, simply that they are aware that it is a prerequisite and so they should have learnt it somewhere.)
It just seems an inefficient use of resources for us to duplicate training that is already available to most students. By all means offer topics that use a higher level of technology to drive the learning process, but where the technology isn't the end in itself (teaching palaeography on computer is a good example of this). If all the interesting courses increasingly list 'at least basic computer literacy' as a prerequisite, then students will start to make sure they have those skills.
But of course, I was using mainframes at the age of 8, so my perspective may vary from yours. ;-)
-James