in short, yes. MSIE is the regular target of complaints from the web standards community.
I second that.
perhaps the response among this group should be the converse of many
web developers: develop standards-based texts and applications and require your users to use standards-based browsers.
The success of a web tool or application is often decided by whether it is usable in the most popular browser (which, unfortunately, is still IE). If it's not, users will blame the application, not the browser. So I wouldn't recommend leaving your application incompatible with IE. One should do both: implement the standards and provide an IE-compatible version as well.
even if IE is parsing XSL better right now, other browsers, particularly mozilla-based browsers, are sure to catch up.
Although I doubt whether IE parses better, I wonder why one wouldn't just parse server side and send HTML. For those who want the raw XML, a download button, FTP, or an OAI solution could be provided. HTML is implemented fairly reliably across all browsers, isn't it? Or am I failing to see an important reason for not parsing XML server side?
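For concreteness, server-side parsing here just means running the XML through an XSLT processor (Xalan, Saxon, libxslt, etc.) once per request and sending the resulting HTML to the browser. A minimal sketch of such a stylesheet; the `poem`/`line` element names are hypothetical, purely for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- A minimal server-side transform sketch: turn a (hypothetical)
     <poem>/<line> document into plain HTML before it leaves the server.
     Any XSLT 1.0 processor will run it unchanged. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>
  <xsl:template match="/poem">
    <html>
      <head><title><xsl:value-of select="@title"/></title></head>
      <body>
        <xsl:for-each select="line">
          <p><xsl:value-of select="."/></p>
        </xsl:for-each>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```

Because the output is plain HTML, it renders the same whatever XSLT support (or lack of it) the visitor's browser has.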
grtz, Joris
Hi all,
Joris van.Zundert wrote:
Although I doubt whether IE parses better, I wonder why one wouldn't just parse server side and send HTML. For those who want the raw XML, a download button, FTP, or an OAI solution could be provided. HTML is implemented fairly reliably across all browsers, isn't it? Or am I failing to see an important reason for not parsing XML server side?
There are three good reasons:
- You have no XSL transformation tool on the server (and no control over the server, so you can't install one).
- You need to deliver materials on CD-ROM or other standalone media for people with no access to the Internet.
- Your server is struggling with its load, and you'd like to offload some of the work to clients.
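In those client-side cases, the hookup is simple: an xml-stylesheet processing instruction at the top of the XML file tells the browser which stylesheet to apply. A sketch (the filenames and element are illustrative, not from any particular project):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- The browser fetches the stylesheet named here and renders the
     transformed result instead of the raw XML. Works from a CD-ROM
     just as well as from a web server. -->
<?xml-stylesheet type="text/xsl" href="totohtml.xsl"?>
<poem title="Example">
  <line>First line of text</line>
</poem>
```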
Cheers, Martin
On Monday, June 28, 2004, at 05:54 AM, Joris van.Zundert wrote:
perhaps the response among this group should be the converse of many web developers: develop standards-based texts and applications and require your users to use standards-based browsers.
The success of a web tool or application is often decided by whether it is usable in the most popular browser (which, unfortunately, is still IE). If it's not, users will blame the application, not the browser. So I wouldn't recommend leaving your application incompatible with IE. One should do both: implement the standards and provide an IE-compatible version as well.
in principle, i agree entirely. cross-browser compatibility is the preferred option. i suppose my response here is partly an expression of frustration that many developers develop in such a way as to entirely shut out non-IE browsers (and this is usually tied precisely to the use of microsoft server software, IIS, and its functionality).
I do think, however, that there are a couple of more substantive points. first, you have to consider how much extra work it is to develop a whole bunch of work-around, non-standard material in addition to your standards-compliant version. some of this could probably be mitigated by, for example, server-side parsing, but not all of it. worse, with non-standards-compliant browsers, you never know where they're going next, and what work-around might not work with the next release -- *especially* if their objective is to pull people away from standards-compliant browsers, as microsoft's seems to be.
you also need to consider your specific audience. it may well be that you have a lot more opera/mozilla/safari users than the wider population. it may also be that with a more relatively specialist audience, you can expect them to take the small extra step of downloading a free standards-compliant browser, if they don't have one already. they may even find they like it better than IE. since the applications we're talking about are not mass market applications or web sites, you can probably afford to set the bar of entry a teeny bit higher than if you're running people off to another online bookstore or credit card company or service provider because your site doesn't work in IE.
even if IE is parsing XSL better right now, other browsers, particularly mozilla-based browsers, are sure to catch up.
Although I doubt whether IE parses better, I wonder why one wouldn't just parse server side and send HTML. For those who want the raw XML, a download button, FTP, or an OAI solution could be provided. HTML is implemented fairly reliably across all browsers, isn't it? Or am I failing to see an important reason for not parsing XML server side?
well, i would say here that you avoid adding burden to the server as a general rule, much like cross-browser compatibility is a general rule. that doesn't mean that server-side parsing is a bad solution to every problem, just that it can't be the solution to *every* problem, particularly when you ought to be able to expect the browser to do the work. that said, when you *can* do server-side parsing, it's the kind of "work-around" that at least won't mean lots of recoding (at least compared to what we're talking about above). when browsers come up to snuff, you just migrate the parsing work to the browser. that's actually making less work down the road, rather than more. (i see martin has snuck in a nice response to this, just now.)
cheers, j
I suspect that my own experiences have made me a bit jaundiced about client-side programming generally. I spent years making and maintaining a Java applet that would work both in Netscape and IE. It was an unspeakable misery: every time I got it stable Sun would change the spec in response to some new security concern, or Microsoft would break some new thing in their Java "implementation," and I'd have to figure out what was wrong and fix it, often while my students or someone else's were complaining loudly about lack of access.
So now I'm in the process of migrating as much as possible to the server side, and my new rules are:
- never to do anything on the client side that can be done on the server.
- Use only standards-compliant features of browsers (in my case CSS and a bit of [mostly] ECMA-compliant JavaScript).
So far this has made for a (relatively) carefree computing experience. No complaints from users about how this or that won't work; no need to serve different pages or programs for different browsers, etc. I won't say that it's nuts to use the XSL built into Microsoft's browser, but I absolutely would not do it, since I've got a perfectly good XSLT processor available on the server side; and I just have to worry about how that one works, as opposed to three or four different ones.
Peter
Jeffrey Fisher wrote:
On Monday, June 28, 2004, at 05:54 AM, Joris van.Zundert wrote:
perhaps the response among this group should be the converse of many web developers: develop standards-based texts and applications and require your users to use standards-based browsers.
The success of a web tool or application is often decided by whether it is usable in the most popular browser (which, unfortunately, is still IE). If it's not, users will blame the application, not the browser. So I wouldn't recommend leaving your application incompatible with IE. One should do both: implement the standards and provide an IE-compatible version as well.
in principle, i agree entirely. cross-browser compatibility is the preferred option. i suppose my response here is partly an expression of frustration that many developers develop in such a way as to entirely shut out non-IE browsers (and this is usually tied precisely to the use of microsoft server software, IIS, and its functionality).
I do think, however, that there are a couple of more substantive points. first, you have to consider how much extra work it is to develop a whole bunch of work-around, non-standard material in addition to your standards-compliant version. some of this could probably be mitigated by, for example, server-side parsing, but not all of it. worse, with non-standards-compliant browsers, you never know where they're going next, and what work-around might not work with the next release -- *especially* if their objective is to pull people away from standards-compliant browsers, as microsoft's seems to be.
you also need to consider your specific audience. it may well be that you have a lot more opera/mozilla/safari users than the wider population. it may also be that with a more relatively specialist audience, you can expect them to take the small extra step of downloading a free standards-compliant browser, if they don't have one already. they may even find they like it better than IE. since the applications we're talking about are not mass market applications or web sites, you can probably afford to set the bar of entry a teeny bit higher than if you're running people off to another online bookstore or credit card company or service provider because your site doesn't work in IE.
even if IE is parsing XSL better right now, other browsers, particularly mozilla-based browsers, are sure to catch up.
Although I doubt whether IE parses better, I wonder why one wouldn't just parse server side and send HTML. For those who want the raw XML, a download button, FTP, or an OAI solution could be provided. HTML is implemented fairly reliably across all browsers, isn't it? Or am I failing to see an important reason for not parsing XML server side?
well, i would say here that you avoid adding burden to the server as a general rule, much like cross-browser compatibility is a general rule. that doesn't mean that server-side parsing is a bad solution to every problem, just that it can't be the solution to *every* problem, particularly when you ought to be able to expect the browser to do the work. that said, when you *can* do server-side parsing, it's the kind of "work-around" that at least won't mean lots of recoding (at least compared to what we're talking about above). when browsers come up to snuff, you just migrate the parsing work to the browser. that's actually making less work down the road, rather than more. (i see martin has snuck in a nice response to this, just now.)
cheers, j
dm-l mailing list dm-l@uleth.ca http://listserv.uleth.ca/mailman/listinfo/dm-l
On Monday, June 28, 2004, at 08:16 AM, Peter Baker wrote:
I suspect that my own experiences have made me a bit jaundiced about client-side programming generally. I spent years making and maintaining a Java applet that would work both in Netscape and IE. It was an unspeakable misery: every time I got it stable Sun would change the spec in response to some new security concern, or Microsoft would break some new thing in their Java "implementation," and I'd have to figure out what was wrong and fix it, often while my students or someone else's were complaining loudly about lack of access.
java is a particularly egregious example of incompatible implementations of what was meant to be an entirely platform-independent "standard". very very frustrating, in principle and in practice. "java" never really meant "java".
So now I'm in the process of migrating as much as possible to the server side, and my new rules are:
- never to do anything on the client side that can be done on the
server.
i don't think i would disagree with this. i would only say that the key word is "can". it includes the server resources to handle the load, and in such a way that it doesn't affect end-user experience (ie, slow loading or server overload errors). to be honest, my argument about a smaller niche audience with respect to browsers may apply here, as long as your institution has the server resources and bandwidth to give you.
- Use only standards-compliant features of browsers (in my case CSS
and a bit of [mostly] ECMA-compliant JavaScript).
exactly. and so this is *still* an argument for staying with standards and not relying on browser-dependent work-arounds.
cheers, j
This is an interesting thread inasmuch as it touches on best practice. Leaving aside the server- vs. client-side serving of XML for a minute (do most people still not convert XML to (X)HTML and serve static pages?), surely it isn't that hard to produce pages that work on both Internet Explorer and proper browsers? I've always followed what I've thought of as Peter Baker's philosophy (perhaps he can correct me if I'm wrong) of adopting "works well on standards-based browsers, is neither broken nor egregiously ugly on Internet Explorer" in my design. The place where I am most aware of this is with floating menus (as on the dm site <www.digitalmedievalist.org>). These stay put in standards-based browsers, but scroll in MSIE. If and when IE does these properly, there are no workarounds to contend with: the menu will float for IE users as well.
I am a little more concerned about the need for some workarounds, mostly because they produce bad coding that won't age gracefully when IE gets with the program. One example is the min- and max-width instructions in CSS. I use a workaround that takes advantage of the fact that MSIE doesn't understand child selectors (expressed using a "greater than" sign) to turn off MSIE-specific styles. I suppose this is harmless, though it is annoying.
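A sketch of one common form of this kind of hack (the selector and widths are illustrative, not the actual site's rules): MSIE 6 and earlier ignore any rule written with the child combinator, so only standards-based browsers apply the constrained widths, while MSIE falls back to the plain rule.

```css
/* All browsers, including MSIE, get the basic rule: */
div.content { width: 90%; }

/* MSIE doesn't understand the child combinator ">", so only
   standards-based browsers apply the min/max constraints: */
body > div.content { min-width: 20em; max-width: 60em; }
```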
I have been thinking lately of offering users multiple stylesheets to allow them to customise pages to suit their equipment, interests, and/or disabilities. Thus users with larger screens might like to see more options on a menu; or visually impaired users might prefer their links to be underlined. Of course, once again, most standards-based browsers have a stylesheet-switcher option built in, which means MSIE doesn't. There are some simple solutions, however. If you'd like to see one in action, CSS Zen Garden is a great site: http://www.csszengarden.com/. They have a link that explains how they switch the stylesheets.
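The switcher technique itself is small: alternate stylesheets are titled <link> elements, and switching means toggling their disabled flags. A minimal sketch, written over a plain array so the toggle logic is visible on its own; in a real page you would pass document.getElementsByTagName('link'):

```javascript
// Enable the titled stylesheet matching `title`; disable the other
// titled (alternate) sheets. Untitled sheets are persistent and are
// deliberately left alone.
function setActiveStyleSheet(links, title) {
  for (var i = 0; i < links.length; i++) {
    var link = links[i];
    if (link.rel.indexOf('stylesheet') !== -1 && link.title) {
      link.disabled = (link.title !== title);
    }
  }
}
```

A cookie usually remembers the choice between visits; that part is omitted here.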
If you are new to CSS, standards-based design, and this MSIE implementation problem, CSS Zen Garden is also a good site to visit. It shows you the power of predictable coding.
-dan
Jeffrey Fisher wrote:
On Monday, June 28, 2004, at 08:16 AM, Peter Baker wrote:
I suspect that my own experiences have made me a bit jaundiced about client-side programming generally. I spent years making and maintaining a Java applet that would work both in Netscape and IE. It was an unspeakable misery: every time I got it stable Sun would change the spec in response to some new security concern, or Microsoft would break some new thing in their Java "implementation," and I'd have to figure out what was wrong and fix it, often while my students or someone else's were complaining loudly about lack of access.
java is a particularly egregious example of incompatible implementations of what was meant to be an entirely platform-independent "standard". very very frustrating, in principle and in practice. "java" never really meant "java".
So now I'm in the process of migrating as much as possible to the server side, and my new rules are:
- never to do anything on the client side that can be done on the
server.
i don't think i would disagree with this. i would only say that the key word is "can". it includes the server resources to handle the load, and in such a way that it doesn't affect end-user experience (ie, slow loading or server overload errors). to be honest, my argument about a smaller niche audience with respect to browsers may apply here, as long as your institution has the server resources and bandwidth to give you.
- Use only standards-compliant features of browsers (in my case CSS
and a bit of [mostly] ECMA-compliant JavaScript).
exactly. and so this is *still* an argument for staying with standards and not relying on browser-dependent work-arounds.
cheers, j
Hi there,
At 06:48 AM 28/06/2004, you wrote:
Leaving aside the server vs. client side serving of xml for a minute (do most people still not convert xml to (x)html and serve static pages?), surely it isn't that hard to produce pages that work on both Internet Explorer and proper browsers?
For the last year or two, it's been very practical to do this. The output from the current generation of our Hot Potatoes authoring tools is standards-compliant XHTML 1.1, CSS and JavaScript, and these highly complex and interactive pages work on IE6, Mozilla/Firefox, Opera 7.5+ and Safari. The ONLY browser-sniffing code we use these days concerns areas where the standards are not clear (such as how to calculate the size of the browser window viewport or document area), and this is abstracted into a couple of common JavaScript functions. In previous versions of our software, up to one third of the code in the pages was concerned with browser-sniffing and branching to handle relatively simple functionality such as drag-and-drop. We no longer need to waste all that work.
There's no doubt in my mind that the current generation of browsers is the most broadly standards-compliant set we have ever been able to write for, and life for Web developers has consequently never been better; we have a pretty rich feature set, a working DOM implementation with good APIs, and stable long-term standards to write to. I'd love to have everything CSS3 promises (or even the last few bits of CSS2 that still aren't supported), but what we have is enough to create rich, interactive documents and Websites which fully validate and will not break in the future. There's no need for the misery of Java applets either, unless you need to save and load files locally or something like that.
One shameful admission -- in our JavaScript DOM code we do use "innerHTML", a Microsoft-spawned property which is not part of the W3C DOM API, but which is supported by all browsers, and which, to be frank, is a pretty good idea that saves a lot of code. If it didn't exist, we'd have to write a similar function ourselves (not hard, but time-consuming and surely slower than the native browser implementations).
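As a toy sketch of the sort of helper one would otherwise have to write, here plain objects stand in for DOM nodes so the logic is self-contained; real code would use document.createElement and appendChild, and would be noticeably more verbose than a single innerHTML assignment:

```javascript
// Build a (plain-object) element tree and serialize it to markup --
// roughly the job innerHTML assignment does for you natively.
function el(tag, children) {
  return { tag: tag, children: children || [] };
}

function toHTML(node) {
  if (typeof node === 'string') {
    // Text must be escaped by hand -- something the innerHTML
    // property never asks the caller to think about.
    return node.replace(/&/g, '&amp;')
               .replace(/</g, '&lt;')
               .replace(/>/g, '&gt;');
  }
  var inner = '';
  for (var i = 0; i < node.children.length; i++) {
    inner += toHTML(node.children[i]);
  }
  return '<' + node.tag + '>' + inner + '</' + node.tag + '>';
}
```

Even this toy version skips attributes; a complete substitute, as Martin says, would be time-consuming and slower than the browser's native implementation.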
My 2 cents, Martin
______________________________________ Martin Holmes University of Victoria Humanities Computing and Media Centre mholmes@uvic.ca martin@mholmes.com mholmes@halfbakedsoftware.com http://www.mholmes.com http://web.uvic.ca/hcmc/ http://www.halfbakedsoftware.com
On Monday, June 28, 2004, at 10:51 AM, Martin Holmes wrote:
For the last year or two, it's been very practical do to this. The output from the current generation of our Hot Potatoes authoring tools is standards-compliant XHTML 1.1, CSS and JavaScript, and these highly complex and interactive pages work on IE6, Mozilla/Firefox, Opera 7.5+ and Safari. The ONLY browser-sniffing code we use these days concerns areas where the standards are not clear (such as how to calculate the size of the browser window viewport or document area), and this is abstracted into a couple of common JavaScript functions. In previous versions of our software, up to one third of the code in the pages was concerned with browser-sniffing and branching to handle relatively simple functionality such as drag-and-drop. We no longer need to waste all that work.
<snipping more interesting stuff>
martin,
this is very interesting to me, in general and in terms of work i'm doing right now. i admit i've always been surprised to hear developers (i'm really a designer) complain so much about cross-platform coding, especially as it appears to have gotten easier, recently. i wonder if you have one or two examples of hot potatoes in action that we might be able to take a look at?
j
Hi there,
At 09:13 AM 28/06/2004, you wrote:
i wonder if you have one or two examples of hot potatoes in action that we might be able to take a look at?
There's a set of sample exercises from the tutorial for the programs here:
http://web.uvic.ca/hrd/hotpot/wintutor6/index.htm
These include multiple-choice, short-answer, gapfill, matching, crosswords and jumbled sentences, and there's a fair amount of interactivity in terms of showing feedback on user input, offering hints, scoring and so on. Another of our apps, Quandary, produces action mazes (branching decision trees) in a similar format:
http://www.halfbakedsoftware.com/quandary/version_2/examples/
You can View Source to see any of the JavaScript code; you'll see large blocks of JavaScript that's basically used to manipulate the page DOM to hide and show questions and answers, show responses, modify parts of the page, and so on.
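The core hide/show pattern is small. A sketch, with plain objects standing in for the elements; in the actual pages one would fetch them with document.getElementById and set style.display on real nodes:

```javascript
// Show exactly one section (question, answer, feedback panel...) and
// hide the rest by toggling the CSS display property.
function showOnly(sections, id) {
  for (var i = 0; i < sections.length; i++) {
    sections[i].style.display = (sections[i].id === id) ? 'block' : 'none';
  }
}
```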
Cheers, Martin
On Monday, June 28, 2004, at 12:01 PM, Martin Holmes wrote:
Hi there,
At 09:13 AM 28/06/2004, you wrote:
i wonder if you have one or two examples of hot potatoes in action that we might be able to take a look at?
There's a set of sample exercises from the tutorial for the programs here:
http://web.uvic.ca/hrd/hotpot/wintutor6/index.htm
These include multiple-choice, short-answer, gapfill, matching, crosswords and jumbled sentences, and there's a fair amount of interactivity in terms of showing feedback on user input, offering hints, scoring and so on. Another of our apps, Quandary, produces action mazes (branching decision trees) in a similar format:
http://www.halfbakedsoftware.com/quandary/version_2/examples/
thanks, martin.
incidentally, the problems i was complaining about several posts back in terms of having to use IE in my current side-work have been fixed with mozilla 1.7. this had to do with NT-server-based login. mozilla 1.7 has apparently resolved these issues, and i can now log in successfully using mozilla and firefox or camino (both of which rely on mozilla 1.7), as well as IE. i don't know about the recent safari, since i'm still on X 10.2, not panther. my colleague found it interesting, as well. just a reminder that some of this is browser and some is platform . . .
cheers,
j
Hi there,
At 11:07 AM 28/06/2004, you wrote:
incidentally, the problems i was complaining about several posts back in terms of having to use IE in my current side-work have been fixed with mozilla 1.7. this had to do with NT-server-based login. mozilla 1.7 has apparently resolved these issues, and i can now log in successfully using mozilla and firefox or camino (both of which rely on mozilla 1.7), as well as IE. i don't know about the recent safari, since i'm still on X 10.2, not panther. my colleague found it interesting, as well. just a reminder that some of this is browser and some is platform . . .
It is nice that NTLM authentication is now built into Mozilla (although it isn't in Safari yet). This authentication is used for access to Windows servers on the network which are using Windows authentication, so it's really a case of the Mozilla guys being pragmatic and offering this convenience to users who regularly have to operate in Windows environments. Presumably Apple will eventually do the same for Safari, given the effort they've already put into making OSX work well in Windows-dominated environments.
Cheers, Martin
On Monday, June 28, 2004, at 01:28 PM, Martin Holmes wrote:
Hi there,
At 11:07 AM 28/06/2004, you wrote:
incidentally, the problems i was complaining about several posts back in terms of having to use IE in my current side-work have been fixed with mozilla 1.7. this had to do with NT-server-based login. mozilla 1.7 has apparently resolved these issues, and i can now log in successfully using mozilla and firefox or camino (both of which rely on mozilla 1.7), as well as IE. i don't know about the recent safari, since i'm still on X 10.2, not panther. my colleague found it interesting, as well. just a reminder that some of this is browser and some is platform . . .
It is nice that NTLM authentication is now built into Mozilla (although it isn't in Safari yet). This authentication is used for access to Windows servers on the network which are using Windows authentication, so it's really a case of the Mozilla guys being pragmatic and offering this convenience to users who regularly have to operate in Windows environments. Presumably Apple will eventually do the same for Safari, given the effort they've already put into making OSX work well in Windows-dominated environments.
despite my antipathy for microsoft, its general attitude to the universe and especially to users, i have to agree with you entirely, here. i've had no trouble at all on NT networks since installing X. i could even take my ibook from network to network with no reboots and no crashes. of course, apple has always been better at playing nice with windows than the other way around, but that, as you say, is the nature of the beast.
j
That's nice, Martin --
reminds me of a Cloze engine I made once in the olden PLATO days -- you typed in your passage and then could ask for any distance between blanks, at what point in the text the Cloze stuff began, and whether or not hints were given, and in what form; it scored the thing and did some statistics with the class responses. And the whole thing interacted with stats and scores from a "guess the next letter" exercise that used Shannon-Weaver stuff to plot the chances of the next letter being guessed, and with another exercise that used Osgood semantic differentials, and I plotted the whole thing's scores in polar and in regular graphs... Sigh, them were the very old days. (And no one gave me any funds to do it -- in fact, the administration was annoyed to see a Humanities faculty member messing about with computers.)
Hi there,
At 11:23 AM 28/06/2004, you wrote:
That's nice, Martin --
reminds me of a Cloze engine I made once in the olden PLATO days -- you typed in your passage and then could ask for any distance between blanks, at what point in the text the Cloze stuff began, and whether or not hints were given, and in what form; it scored the thing and did some statistics with the class responses. And the whole thing interacted with stats and scores from a "guess the next letter" exercise that used Shannon-Weaver stuff to plot the chances of the next letter being guessed, and with another exercise that used Osgood semantic differentials, and I plotted the whole thing's scores in polar and in regular graphs... Sigh, them were the very old days. (And no one gave me any funds to do it -- in fact, the administration was annoyed to see a Humanities faculty member messing about with computers.)
It's all in the timing, eh? We're something called the Humanities Computing and Media Centre, and we actually spun off a company that makes money out of this stuff and feeds some of it back into our unit to support other projects. Times change!
Our cloze stuff is of course nothing like as sophisticated as what you describe; it's aimed at regular instructors who want to make simple exercises easily and post them on a Web server without having to worry about anything being installed on the server, so they work client-side.
Cheers, Martin
Well, I used to try to do at least one project every summer, since I didn't like teaching summer school (does anyone?), and I could work at my own pace, being unfunded. But I really like the looks of yours, and the serviceability.
Our cloze stuff is of course nothing like as sophisticated as what you
describe; it's aimed at regular instructors who want to make simple exercises easily and post them on a Web server without having to worry about anything being installed on the server, so they work client-side.
All --
On 28 Jun 2004, at 8:48 AM, Daniel O'Donnell wrote:
This is an interesting thread inasmuch as it touches on best practice... ...I have been thinking lately of offering users multiple stylesheets to allow them to customise pages to suit their equipment, interests, and/or disabilities. Thus users with larger screens might like to see more options on a menu; or visually impaired users might prefer their links to be underlined.
This touches on a key point -- when we develop applications that are open to the world, we don't know who will be using them, in what context, and at what time. Web browsers extend far beyond the mainstream MSIE/Mozilla/Opera world, users can be from anywhere and from any background (look at me, for example), and projects may live on for years after the development stops.
I know this is a hard thing for a few IT people (and some major corporations) to swallow, but we can't expect the world to conform to what we've got on our desks anymore (I wish we could, it would make my job much easier). I've had a number of faculty and students ask about accessing Web-based materials and applications on mobile phones, PDAs, and embedded systems in television set-top boxes, and through screen readers for the visually impaired. Browsers seem to be embedded into nearly everything these days, and who knows what will be next? With that in mind, should we develop academic tools from a restricted perspective of only one browser on one operating system? Before there were robust standards, it was difficult to argue otherwise. Today, it is difficult to support an argument for browser-specific development when there are standards to address much of what required platform- or browser-specific functionality (and I know funding agencies and governments are starting to pay attention to that fact).
There are trade-offs and I admit that standards, especially new ones, can get in the way of functionality. The important thing to remember, however, is that by converging toward standards, materials and applications will stand a greater chance of reaching a broader audience over a longer period of time with, hopefully, lower support costs and a better chance of being supported in the future.
Personally, I want to see as many scholarly projects survive over time as possible. By adopting mature standards when appropriate, the initial overhead in implementing them will pay dividends in the long-term. In other words, someone in my role can find resources to keep them going long after the funding stops and the development ends.
Later, Chad -------------- Chad J. Kainz cjkainz@uchicago.edu | 773-702-9945 | FAX 773-834-2983
Senior Director, NSIT Academic Technologies, The University of Chicago http://intech.uchicago.edu Vice-Chair, IEEE Learning Technology Standards Committee http://ltsc.ieee.org
I've been following this discussion on browser standards with interest, but one thought keeps nagging at me.
One of the mentioned benefits is this or that browser's XSLT parser and its standards compliance. I think we all agree that standards compliance and the promotion of open standards are good things. The little nagging thought comes with this discussion of XML transformation in the browser. Who actually requires this of their users? Many users have extremely out-of-date browsers which will display HTML with (sometimes, if you are lucky) a bit of CSS. Many don't have browsers with XSLT parsers built in. So it really isn't an issue to me, because any site I design will always try to serve (X)HTML as a bare minimum.[1] The transformations of XML to HTML via XSLT may be done on the fly, but they certainly won't be done in the user's browser. I don't trust users to have any particular browser. So if the transformations (my preference being Apache's Cocoon) are done on the server, then the XSLT parser of the browser doesn't really matter that much to me. Now, it is a shame that all browsers don't follow the same standard in the same way (esp. with regard to CSS), but I don't see that there is much I can do about that, except follow the spec and hope the browsers finally sort things out.[2]
-James
[1] I hasten to add that I didn't design the Oxford Text Archive's website, and that we will be redesigning it to be entirely open standards compliant, once we have finished totally reorganising the back end and our matching workflow. [2] Ok, I use Firefox, and yes I could donate time and energy to the mozilla project to help it, but I don't really have any spare reserves of either.
--- Dr James Cummings, Oxford Text Archive, University of Oxford James dot Cummings at ota dot ahds dot ac dot uk
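For what it's worth, the server-side approach James describes doesn't require anything as heavyweight as Cocoon; any JAXP-capable environment can do it. Here is a minimal sketch using the `javax.xml.transform` API bundled with the JDK. The document and stylesheet are invented placeholders, not anything from the OTA:

```java
import java.io.StringReader;
import java.io.StringWriter;

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class ServerSideXslt {

    // Transform an XML string with an XSLT stylesheet string and return the result.
    static String transform(String xml, String xsl) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xsl)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    // A tiny end-to-end demo with placeholder data.
    static String demo() throws Exception {
        String xml = "<?xml version=\"1.0\"?><doc><title>Hello</title></doc>";
        String xsl = "<?xml version=\"1.0\"?>"
                + "<xsl:stylesheet version=\"1.0\""
                + " xmlns:xsl=\"http://www.w3.org/1999/XSL/Transform\">"
                + "<xsl:output method=\"html\"/>"
                + "<xsl:template match=\"/doc\">"
                + "<html><body><h1><xsl:value-of select=\"title\"/></h1></body></html>"
                + "</xsl:template>"
                + "</xsl:stylesheet>";
        return transform(xml, xsl);
    }

    public static void main(String[] args) throws Exception {
        // The client receives finished HTML; its own XSLT support never matters.
        System.out.println(demo());
    }
}
```

The same pattern works whether the transform runs on every request or as a batch step at publication time; either way, the browser only ever sees HTML.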
On Monday, June 28, 2004, at 09:06 AM, James Cummings wrote:
Many users have extremely out of date browsers which will display HTML with (sometimes if you are lucky) a bit of CSS. Many don't have browsers that have XSLT parsers built in.
fwiw, here are the latest browser use numbers.
http://www.w3schools.com/browsers/browsers_stats.asp
what i find most interesting is that mozilla seems to be gaining as users switch from IE 5. that is, they *appear* to be going from IE5 not to IE6 but to mozilla.
all this said, i would still guess that people on this list will have a higher percentage of IE5 and non-IE browsers among your users, in part because you probably have a larger mac audience than the general population. you probably also have more users at lower screen resolutions. these are obviously just guesses, though.
cheers, j
On Mon, 28 Jun 2004, Jeffrey Fisher wrote:
all this said, i would still guess that people on this list will have a higher percentage of IE5 and non-IE browsers among your users, in part because you probably have a larger mac audience than the general population. you probably also have more users at lower screen resolutions. these are obviously just guesses, though.
Probably. Should we be building sites to the browsers/users, or building them to the standards and waiting for browsers/users to catch up? While I'd like to do the latter, in practice I do the former. I do try to avoid the browser-specific hacks that were mentioned earlier, however (for the already-mentioned fear that they will need migrating later). It is common in CSS to use hacks that make pages display the same in MSIE and in other browsers. Better, in my opinion, to use only CSS that most modern browsers understand.
-James
--- Dr James Cummings, Oxford Text Archive, University of Oxford James dot Cummings at ota dot ahds dot ac dot uk
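For readers who haven't run into them, the browser-specific CSS hacks being discouraged here looked, circa 2004, something like the well-known "star HTML" hack, which only IE 5/6 matched (the selector and widths below are invented for illustration):

```css
/* Standard rule, understood by all standards-compliant browsers */
.sidebar { width: 200px; }

/* "Star HTML" hack: only IE 5/6 believe <html> has a parent element,
   so only they apply this override */
* html .sidebar { width: 220px; }
```

The migration worry is exactly that such rules silently stop (or start) applying when a browser fixes its parser, which is why sticking to widely-implemented CSS is the safer long-term bet.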
On Monday, June 28, 2004, at 09:32 AM, James Cummings wrote:
On Mon, 28 Jun 2004, Jeffrey Fisher wrote:
all this said, i would still guess that people on this list will have a higher percentage of IE5 and non-IE browsers among your users, in part because you probably have a larger mac audience than the general population. you probably also have more users at lower screen resolutions. these are obviously just guesses, though.
Probably. Should we be building sites to the browsers/users, or building them to the standards and waiting for browsers/users to catch up? While I'd like to do the latter, in practice I do the former. I do try to avoid the browser-specific hacks that were mentioned earlier, however (for the already-mentioned fear that they will need migrating later). It is common in CSS to use hacks that make pages display the same in MSIE and in other browsers. Better, in my opinion, to use only CSS that most modern browsers understand.
agreed on all counts. my argument against designing for IE (and designing instead to standards) is really an argument against browser-specific design that is or amounts to a hack. i would include using IE-specific functionality or ignoring standards to make something work for IE in the latter category. the problem is that IE is the dominant browser, so you have to make decisions about what's more important in the long run.
the thing is that with mozilla and opera you at least have browsers that are committed to implementing the standards eventually, even if they don't do so (or do so very well), yet. microsoft is another kettle of fish.
anyhow, i don't want to continue to push this, since i think there's probably very little practical disagreement. i do think it's the kind of thing we all need to keep in mind as we make design decisions now that we will have to live with (or fix) down the road. that's really what it amounts to.
cheers,
j
Is there also not another very major point? If we go too far down any proprietary line, we risk contravening accessibility guidelines/legislation. Standards-compliance is surely a central aspect of accessible design.
Tom Chadwin (list newcomer: hello everyone!)
Hi there,
At 07:06 AM 28/06/2004, you wrote:
I've been following this discussion on browser standards with interest, but one thought keeps nagging at me.
One of the mentioned benefits is this browser or that browser's XSLT parser and its standards compliance. I think we all agree that standards compliance and the promotion of open standards are good things. The little nagging thought comes with this discussion of XML transformation in the browser. Who actually requires this of their users?
We have a number of sites created over the past few years that require client-side XSLT processing, because our main UVic Web servers have no XSLT processor. In fact, they have only one CGI program (FormMail), disallow installation of your own Perl scripts, and are only now experimenting with adding PHP.
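Client-side setups of this sort are typically wired up with an `xml-stylesheet` processing instruction at the top of the served XML, along these lines (the filename and the TEI root element here are illustrative, not Martin's actual files):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="tei-to-html.xsl"?>
<TEI.2>
  <!-- document content; the browser fetches the stylesheet
       and performs the XML-to-HTML transformation itself -->
</TEI.2>
```

The obvious caveat, raised elsewhere in this thread, is that this only works in browsers that ship an XSLT processor.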
Many users have extremely out of date browsers which will display HTML with (sometimes if you are lucky) a bit of CSS.
I think "many" is really overstating the case there. Our current crop of software produces output which only works on browsers released in the last three years; we sell steadily to customers all over the world, and the only complaints we've had about browser compatibility in the last six months have come from one US corporation and one US military unit, both of whom said their sysops would not allow them to upgrade from IE5.5 to IE6. The rest of the world is happily moving forward, and with such a wealth of reasonably good browsers available, there's no reason to stick with an old browser (or at least, no reason to stick with ONLY an old browser and avoid adding a new one to your system). The current Firefox download is less than 5 megs, and it runs happily on a Pentium 166 from 1996. Users who avoid updating IE are being a bit silly, because they're leaving security holes open. The only real problem platform is the remaining installed base of Mac OS 9 and below, for which there isn't really a good modern browser; that base shrinks monthly and is pretty insignificant outside North America, and even there, I think recent versions of Netscape will work.
Many don't have browsers that have XSLT parsers built in. So it really isn't an issue, to me, because any site I design will always try to serve (X)HTML as a bare minimum.[1] The transformations of XML to HTML via XSLT may be done on the fly, but they certainly won't be done in the user's browser. I don't trust users to have any particular browser.
I don't either, but it's reasonable to ask them to have one of several good standards-compliant browsers if you're doing something complex that needs modern features. And I also think that encouraging people to go and get good browsers, by providing them with an incentive in the form of a site that makes good use of modern standards, is an all-around Good Thing.
Cheers, Martin
______________________________________
Martin Holmes
University of Victoria Humanities Computing and Media Centre
mholmes@uvic.ca | martin@mholmes.com | mholmes@halfbakedsoftware.com
http://www.mholmes.com | http://web.uvic.ca/hcmc/ | http://www.halfbakedsoftware.com