
My two new sites with no valid pages

Everybody knows that to be completely accessible and web standardsly lovely your site should

  1. validate
  2. use meaningful, semantic markup
  3. keep all presentation in the CSS

and without that holy trinity, your site isn’t doing its job, right?

Wrong. Or rather, not wrong – but the three above are often not possible in a commercial environment, so you do the best you can.

Last month, two new sites launched for which I was front-end technical lead. Not a single page on either site validated.

The main problem

The main reason for the lack of validation is to get round a really important Internet Explorer bug that can make in-page links inaccessible if they’re identified by an id on an element like a heading, or a div or a paragraph that doesn’t have hasLayout. That is, if you link to <h2 id="privacy">Our privacy policy</h2>, an IE-user navigating with a keyboard can never reach the destination of that link.

I’ve got a simple test page. Try it in a modern non-IE browser and activate the link in the first paragraph. The focus goes to the paragraph that’s the destination of the link, so you’ll see nothing (unless you’re using Firefox, in which case the whole paragraph that is the :target of the link gets a nice pink background). Hit tab again, and the focus should go to the link in the paragraph, which will go yellow.
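I haven’t reproduced the test page here, but a cut-down version of the idea looks something like this (the markup and colours are illustrative rather than copied from the real page):

<style type="text/css">
:target {background: pink;}   /* Firefox highlights the destination of the in-page link */
a:focus {background: yellow;} /* the focused link goes yellow */
</style>
<p><a href="#dest">Jump to the destination paragraph</a></p>
<p id="dest">Destination paragraph. <a href="http://example.com/">Tab should bring focus to this link next.</a></p>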

But in IE (and, shamefully, this includes IE7), you can never tab to the second link as the focus gets lost. (Further explanation of the bug on Jim Thatcher‘s site.)

Solution

Jim Thatcher’s round-up suggests the following code:
<span style="position:absolute;">
<a name="content" id="content">&nbsp;</a>
</span>

but frankly, that seems a bit clunky, especially as the sites I launched were relying on content to be provided ready-marked up by subject area experts in the business rather than the web team.

It was Gez Lemon who provided the answer, which is to give each link destination a tabindex of -1:

<h2 tabindex="-1" id="privacy">Our privacy policy</h2>

I was briefly tempted by the idea of scripting my way out of validation errors, as Gez suggests, but for this content, the destination of links could be any element, rather than a predictable h2 or something, and it feels inefficient to have JavaScript add the tabindex to every paragraph on the page, just in case.
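The blanket approach would have amounted to something like this rough sketch, stamping a tabindex on every paragraph and heading whether or not anything ever links to it:

<script type="text/javascript">
// rough sketch only: make every paragraph and heading programmatically
// focusable, just in case it turns out to be the destination of an in-page link
window.onload = function () {
  var tags = ["p", "h1", "h2", "h3", "h4", "h5", "h6"];
  for (var t = 0; t < tags.length; t++) {
    var els = document.getElementsByTagName(tags[t]);
    for (var i = 0; i < els.length; i++) {
      els[i].tabIndex = -1;
    }
  }
};
</script>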

Styling away spurious outlines

One unintended side-effect of a negative tabindex is that it makes elements focussable, and they can then receive dotted outlines. I toyed with the idea of leaving these in so the user could see where they’ve landed, but decided eventually that it’s best not to muck about with user expectation, so killed these in modern browsers with the CSS [tabindex="-1"]:focus {outline:none;}
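Spelled out in the stylesheet, the rule is simply:

/* suppress the dotted outline on elements that are only focusable
   because of their negative tabindex */
[tabindex="-1"]:focus {outline: none;}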

Invalid code? Won’t somebody think of the children?

I’m generally a stickler for valid code, as I believe it helps accessibility. However, as Gez says,

Despite my stance on validity being important for accessibility, this is one of those situations where I believe an invalid attribute with an invalid value would be excusable as there’s an obvious benefit for accessibility.

It’s also a question of philosophy. From my perspective, valid code is worth aspiring to, as it creates a predictable DOM tree, which is good for assistive devices, as well as for scripts that add Ajaxy bollocks. Incorrectly closing tags, nesting elements the wrong way and other markup sins will make your DOM unpredictable, but adding an invalid attribute is unlikely to do so; I’d be very interested if anyone can show me ill-effects. (Also see my somewhat dated 2004 post CSS/xhtml: does validation matter?)

Semantics go down the pan, as well

I mentioned that content was being given to me ready-marked up by non-webheads, who all received a styleguide called the “Expert Author Markup Guide” (PDF, 140K) that detailed the elements they were allowed to use. Deliberately, not every element available in html was in there. (The definition list is notably absent, and tables glossed over.)

(You’re invited to criticise the Markup Guide; it accompanies the Web site Constitution.)

Another thing that breaches the Standardista’s Code of Honour is my advice that expert authors add decorative images in the mark-up. This is because it’s folly to commit yourself to adding lines to the stylesheet for every bit of stock photography frippery, and suicidal to ask expert authors to add stuff to the CSS themselves – with a min-height rule and an IE6 pseudo-min-height too.

For a fraction of a second I considered inline css to add decoration – and then saw the error of my ways. In those areas of the site marked-up by expert authors, decorative images are achieved by the good old-fashioned img element, no presentational attributes and blank alternative text.
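So a bit of stock-photo frippery in an expert-authored page ends up as nothing fancier than this (the filename is invented for the example):

<img src="/images/stock/handshake.jpg" alt="">

The blank alt text means screen readers pass straight over it, and there’s nothing for anyone to maintain in the CSS.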

Swings and roundabouts

I believe that this pragmatic approach has benefits as well as compromises. The negative tabindex has an obvious benefit – it makes content accessible in Internet Explorer.

The cut-down markup delivered by expert authors has an advantage for web editors, who can save time by quickly reading the code to check semantics, and running submitted content through html tidy to fix incorrectly closed tags.

There’s a more subtle advantage too; whereas previously, expert authors wrote reams of unstyled Word pages and “put it on the web” by pressing the “make PDF” button in Office, now they have to think about structure, levels of heading and the like, so the documents that are published as html benefit from better structure, more thought and fewer words.

29 Responses to “My two new sites with no valid pages”

Comment by Wild Ted

Thanks for this – really useful.
Just one point concerning the Markup Guide. Might it be worth mentioning in the Writing link text section that class="hidden" should use a CSS style that places the text offscreen rather than, for example, display:none, so that screen readers can still pick up the full link text?

Comment by patrick h. lauke

you know, that whole issue with the links in IE not working…call me lucky, but I’ve never had to deal with that sh*t. my links seem to usually work (see for instance the “skip to content” link on http://www.salford.ac.uk). don’t ask me why, but they do…maybe it’s the absolute positioning i’m already doing to get the page scaffolding in place?

Comment by Bruce

Cheers Wild Ted. Yes, the hidden class moves the text off screen. I don’t mention that in the styleguide, as it’s for people unfamiliar with html, so css would be waaaay too scary.
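The rule itself is of the usual off-screen sort, something like this, though the exact offsets in the real stylesheet may differ:

.hidden {
  position: absolute;
  left: -9999px; /* off screen, but still read out by screen readers, unlike display:none */
}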

Comment by Rich Pedley

Have you tried …

<div id="content-wrapper" style="height:50%;">

(hoping I’ve got those the right way round.)

I can probably dig up an example page if you need it.

Comment by Bruce

Thanks Rich, I considered that. I rejected it because it relies on the CSS forcing hasLayout, which contravenes the priority 1 WCAG guideline “Organize documents so they may be read without style sheets. For example, when an HTML document is rendered without associated style sheets, it must still be possible to read the document.” http://www.w3.org/TR/WAI-WEBCONTENT/full-checklist.html

My client’s legal eagles advise that they need to adhere to all priority 1 guidelines, and as many priority 2 guidelines as practicable. We could easily defend in court the decision to break the priority 2 guideline “Create documents that validate to published formal grammars” (http://www.w3.org/TR/WAI-WEBCONTENT/full-checklist.html) in order to make the site keyboard accessible for IE users.

Comment by Rich Pedley

good points, but there is only so far I go to pander to the many misgivings of a browser 🙂

I’m not even sure how many people actually browse without CSS turned on, or who those people might be and what their requirements might be. Personally I’d opt for something that worked with CSS on, and leave the others to their own devices. *cough*

Comment by Bruce

That’s my preference, too, Rich – which is why I haven’t got negative tab indexes all over this site.

But the client has a (laudable) wish to be seen to be meeting as many WCAG guidelines as possible, and it makes good sense to make sure you comply with all the priority ones. Also, their site is heavily visited by IE users.

Comment by Rich Pedley

At least once IE7 gets a foothold, it’ll make some things a lot easier. Maybe when it gets to IE11 and Firefox 7 they might actually utilise the current specs… and make our life easier *snigger*

Comment by Bruce

Nothing’s wrong with it, Hank, but there’s a better way. It’s an empty element so isn’t semantic, whereas putting the name as an id on the surrounding element is more semantic.

Compare
<a name="intro"></a>
<p>hello world</p>

with

<p id="intro">hello world</p>

Also note that the name attribute is deprecated in xhtml.

Additionally, an anchor with just a name is still inaccessible in Internet Explorer in some circumstances, so you’d need to expand it to be

<span style="position:absolute;">
<a name="content" id="content">&nbsp;</a>
</span>

Comment by Merri


document.write("<script id=__ie_onload defer src=javascript:void(0)><\/script>");
var script = document.getElementById("__ie_onload");
script.onreadystatechange = function() {
  if (this.readyState != "complete") return;
  var links = document.getElementsByTagName("a");
  for (var i = 0; i < links.length; i++) {
    var href = links[i].getAttribute("href", 2); // the '2' makes IE return the raw attribute, not a resolved URL
    if (!href || href.charAt(0) != "#") continue; // only in-page links
    var dest = document.getElementById(href.substring(1));
    if (dest) dest.tabIndex = -1; // make the destination focusable in IE
  }
};

Well, ok, it requires JavaScript, but well… better to me than breaking validation due to this issue.

Comment by bruce

Hi Merri
that’s an elegant solution to the problem! (For those who don’t speak JavaScript, what Merri’s sample page does is loop through all the links in the document whose href begins with an octothorpe (#), and which are therefore internal links, and add tabindex="-1" to their destinations. The script is contained within a conditional comment, so it’s only served up to IE.)
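The conditional comment wrapper looks roughly like this (I’ve paraphrased rather than copied Merri’s exact markup):

<!--[if IE]>
<script type="text/javascript">
// Merri's script goes here; browsers other than IE treat this whole
// block as an ordinary HTML comment and never run it
</script>
<![endif]-->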

Which is great, but it’s important to note that it relies on scripting (in WCAG2 terms, this webpage would have a baseline of html + JavaScript).

But playing it safe, legally rather than technically, it’s arguably better to adhere to all priority 1 guidelines by using tabindex="-1" and not require CSS or JavaScript (which would otherwise break checkpoints 6.1 and 6.3) than it is to retain validation, which is a priority 2 guideline.

That’s because it would be easier to explain to a jury why you break priority 2 rather than break priority 1, which is clearly described:

A Web content developer must satisfy this checkpoint. Otherwise, one or more groups will find it impossible to access information in the document. Satisfying this checkpoint is a basic requirement for some groups to be able to use Web documents. WCAG 1

Everyone’s mileage may vary; the site I’m talking about, being a business-oriented site, has a lot of corporate users who are locked into IE.

You may also think it’s less of an issue, now that Firefox and Opera have support for screenreaders.

Hopefully it’ll be fixed in IE8….

Comment by Daniel Walker

“name attribute is deprecated in xhtml.”

Well, (on a pedantic note intended to aid universal clarity) name is deprecated for those things it should never have been appreciated for, such as anchor tags and the form element itself. Form inputs, however (for which the name attribute was originally invented), are still supported. Indeed, it would be difficult to label the radio inputs in a radio group, for instance, unless you were able to apply both a name and an id attribute to each one, since the label can point to the specific id (while each radio in the group must share the same name in order to work, of course). Even so, you should really wrap the whole lot in a fieldset element and give them a legend as well.
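For instance (the names and values below are invented for the sake of the example):

<fieldset>
<legend>How did you hear about us?</legend>
<input type="radio" name="source" id="source-web" value="web">
<label for="source-web">The web</label>
<input type="radio" name="source" id="source-print" value="print">
<label for="source-print">A printed advert</label>
</fieldset>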

http://www.w3.org/WAI/GL/WCAG20/tests/test168.html

Comment by Jörgen Broström

Very interesting reading. I understand that accessibility is getting more and more important, but reading this article I see that accessibility and web standards don’t necessarily go hand in hand.

I was once trained in the arts of nested tables, but I’ve left that behind me now. For good I hope! 🙂

Comment by Jim

Personally I think the WCAG-argument is false. WCAG is not about having to deal with UA-bugs.

Perhaps, but if you raise a barrier for a buggy UA, don’t you potentially run foul of the Disability Discrimination Act, by preventing a group of people from using your services?

Code that validates 100% to a W3C DTD may give the developer a warm, fuzzy glow (oh yeah!) but surely improving usability and removing access barriers is more important? Which is why validity is a Priority 2 guideline – nice, but not necessary.

By the way Bruce, does the rabbit with a cake on its head break the guideline about offering the user control over stopping and starting animations? Not that I imagine it’s sent anyone into convulsing fits, but you never know.

Comment by Tino Zijdel

Jim: I’d say that using invalid markup will potentially just raise issues in other UAs (mind you, error handling is undefined in HTML4 and XHTML). Also note that IE itself is technically antiquated software that mainly predates WCAG 1.0.

Besides, how can it be discriminating when there are alternatives available? And in those cases where there is only an IE-based solution available I’d say that’s discriminating by itself and it should be the role of a government body to make sure that there is a platform/UA-independent alternative available or to force UA-vendors to make it possible to adhere to the WCAG guidelines using their browsers without having to resort to non-standard hackery.

The burden is simply placed upon the wrong parties; if WCAG compliance is a legal issue then browser-vendors should be legally committed to make their browsers fully standards-compliant…

Comment by Jim

Tino: It’s discrimination because you’re not making reasonable allowance to allow IE users to use your site without a mouse. Requiring them to switch to Firefox or Safari in order to use your site, because of a bug in IE, might be seen as unreasonable under the DDA.

Besides, building a site that only works with bug-free browsers (should any exist) surely conflicts with the idea of a browser-agnostic web, where our content is available to any UA? WCAG has to cover accommodating browser bugs, since all HTML parsers are going to have bugs and content has to be accessible to them.

I think Bruce’s general point is that conflicts can arise between WCAG guidelines. When that happens, Priority 1 should take precedence over Priority 2 compliance.

Comment by Tino Zijdel

Jim: so IE users are now more important than, say, users of a hypothetical user agent that refuses to render invalid pages? That’s discriminating too, in the same way that non-IE users have been discriminated against by many sites for years (and sometimes still are).

WCAG simply cannot accommodate every bug in every browser, and hacking to accommodate a prio 1 in browser A could cause failure to comply with another prio 1 in browser B (besides the prio 2).

So now what’s more important? Suddenly it seems that somehow market share is a factor too (even when it is well known that the dominating browser is technically outdated). What if it weren’t IE that had this particular bug but a minority browser? Would you then still give up on the prio 2 guideline (if you were even aware of this problem in that particular browser)?

Sure, I know what you mean and to some extent I do agree, but if this presents a legal problem then pressure should be on the browser-vendors to solve these kinds of bugs, to make it possible to comply with both the prio 1 and the prio 2 guidelines.

It is simply unreasonable to expect webdevelopers to know about every single browser bug that may affect accessibility (I didn’t even know about this particular issue in IE until I read this article) and it is even more unreasonable to expect them to work around every single one of them.

Comment by Bruce

Tino said:

The burden is simply placed upon the wrong parties; if WCAG compliance is a legal issue then browser-vendors should be legally committed to make their browsers fully standards-compliant…

Very true, Tino. I’m helping contribute to an article (for imminent publication) that makes exactly this point. The browser manufacturers are putting too much burden on developers to compensate for browser flaws. It’s the developer’s job to abide by WCAG; the browsers should abide by the User Agent Accessibility Guidelines.

But it is somewhat idealistic to expect that all browser manufacturers will do this. In the meantime, is it worth sticking to your guns and potentially locking out IE users?

That’s what I do on this personal site – but it would be career-limiting to expose a client to that risk, particularly when I know that they have a lot of users on locked-down, IE-only corporate desktops.

Comment by Jim

Re. locking out IE – would this affect screenreaders too? My limited understanding is that IE exposes more info to screenreaders, via MSAA, than other Windows browsers. If that’s true, then should we assume that the page navigation needs to work in IE, if it’s to be accessible to the majority of screenreader users?

Totally agree about UAAG compliance, or lack thereof among all browsers – banging my head against a wall at the moment trying to handle keyboard support because both Firefox (on Windows) and IE handle keypress/keydown events differently to every other browser in the universe. I don’t think either follows the W3C standard for keyboard events. Firefox on the Mac does things differently, though, which is weird and frustrating.

Does your article mention a need for W3C specs to be, well, more specific if they’re to be useful to browser manufacturers? None of this ‘a user agent may do such and such (or maybe not. Up to you really. Don’t let us tell you what to do)’ stuff.

Comment by Merri

A question: are there any statistics on how many of the people who use screenreaders have JavaScript disabled? Because for regular people, JavaScript usage is going up strongly each passing year (was it W3schools where I read the JS stats?). Of course, statistics are misleading, but it would be nice to know if it happened to be something like 99% or even more…

Comment by Tino Zijdel

As to locking out IE users: I’d say it depends on the site in question whether, and to what extent, a problem such as this would affect IE users (or UAs relying on the IE engine) that depend on keyboard navigation. Maybe this is just blowing up a potential problem for a potential group of users through the mere knowledge of this particular problem.

Probably if you dig deep enough you’ll find many more similar problems that you’d need to work around too, and with each workaround you risk causing problems for an entirely different group of users. It becomes much like walking a minefield. That’s why I am so against this kind of hackery, because it is a quest with no end…

Comment by Ben 'Cerbera' Millard

Bruce, are you allowed to link to either of these websites? If so, there might be some helpful script kiddies reading this who could try finding a tidier solution which validates. 🙂

My understanding is that fragment links work in IE6 if the parent of the fragment hasLayout.

For example, Calthorpe Park School uses an <a> for the Site Menu since its parent <h2> hasLayout. For the main content, an id is applied directly to the <h2> since hasLayout is false (0) for that element but is true (-1) for the parent <div>.
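Roughly, the pattern is this (a simplified sketch rather than the actual Calthorpe Park markup):

<div id="main"> <!-- hasLayout is true here, e.g. because the CSS gives it a width -->
<h2 id="content">Main content</h2> <!-- so a fragment link to #content reaches it in IE6 -->
<p>Page content goes here.</p>
</div>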

If you could supply links to the pages, I could trace the value of hasLayout and apply the compatible technique. Or you could use this knowledge to try sorting it yourself.

Comment by Bruce

Hi Cerbera – unfortunately, like my ex-girlfriends and my family, site owners don’t want anyone to know I’m associated with them so I can’t provide links.

There are three inter-related reasons that I’ve chosen invalidity over “doing it right”:

  1. The target for an in-page link is completely arbitrary; sometimes it’s an h2, an h4, the third paragraph in a div, an unordered list, the third list item inside an ordered list inside a definition list … Almost impossible to ensure that the destination’s parents always have layout, while tabindex=-1 ensures that the destination does.
  2. Those who provide content aren’t webheads and can’t be expected to trace back layout; it’s easier for them to add tabindex=-1.
  3. I need something reasonably straightforward for my successor when I go on to other projects; understanding hasLayout is a black art of necromancy. (I also have a gut feeling that it’s at the heart of the Trident engine that IE uses, so won’t be fixed until they rewrite the rendering engine completely. But that’s just speculation on my part.)

Comment by Ben 'Cerbera' Millard

Hi Bruce, thanks for replying to my rather late comment. Here’s an even later response!

The fiddlyness of making the correct way work in IE is definitely a pain in the arse. And you’re right; regular folks creating fragment links to arbitrary locations will very often fall foul of this bug in IE.

For fragment links to fixed locations (such as “skip to blargh” links) the resident web-head can make the right way work by understanding the hasLayout effect. So a lot of websites can avoid the invalid method entirely to feel all warm and fuzzy about themselves.

Using invalid markup to correct a browser bug is very hackish. It’s Microsoft who should be busting a gut to fix this, not you! But when the alternative is poor accessibility for people who have to use a keyboard, I’d agree this is the lesser evil and probably the best choice available in your particular circumstances.

Microsoft made big architectural changes to the rendering engine for some of the CSS bug fixes. Whatever internals cause this issue, they could fix it if they chose to.
