Accessibility: Not making friends or influencing people
You might imagine that accessibility specialists are slightly odd folk. Close your eyes and imagine them sitting quietly in the corner of a pub, sipping mild and wearing Hush Puppies. On a crazy night they might break out the box of dominoes and make remarks about how wonderfully accessible those little blighters are. They might even grumble quietly about how inaccessible a bag of pork scratchings is, but I digress.
These normally quiet accessibility fellows have got themselves in a bit of a tizzy in the last couple of days over the subject of automated accessibility tests and in particular the activities of a company called SiteMorse, whose automated software-based checks are aggressively marketed to public and private sector bodies through a campaign of regular PR.
How can everyone else be expected to achieve website accessibility, if the experts can't?
With their latest press release, SiteMorse have made many accessibility fellows spit out their beer by publicly criticising many UK-based accessibility specialist companies.
SiteMorse once again tested the 'leaders' of the field to find large gaps in what these companies claim to be their expertise and in their own capability to deliver.
Their press release names names in what has been seen by many as ungentlemanly conduct. Several subsequent articles have shown the depth of feeling against SiteMorse, their product, testing methods and marketing/sales strategy.
- Accessify Forum: SiteMorse 'report' (?)
- Accessify Forum: SiteMorse rails at the DRC
- The Watchmaker Project: SiteMorse would fail their own accessibility test
- Isolani: SiteMorse fails due diligence
- Isolani: SiteMorse gets nasty, accessibility is the victim
Holding hands across the table
I do not intend to comment on SiteMorse, their product or their testing methods, although I will say that I find their negative marketing very much against the spirit of cooperation that has been so encouraging within the accessibility community.
Many commenters have raised concerns over the secretive nature of the SiteMorse tests. As Mike of Isolani explains,
The SiteMorse product isn't publicly visible. There's no mention of pricing on their site. The website itself makes a series of unsubstantiated claims. Also, there's very little sign of the tool being peer-reviewed by accessibility experts. The SiteMorse product itself looks to be closed source, so it's impossible for experts to analyse the logic to determine how accurate its automated test functions really are. The expertise of the developers is limited to a cluster of superficial questions to Usenet groups.
While I can fully appreciate a commercial entity's need to protect its intellectual assets, it is also important for specialists of all kinds and vendors to work together towards a more accessible web. In the WaSP Accessibility Task Force (ATF) we are seeking to work with all vendors towards this worthwhile goal.
In the spirit of cooperation I publicly invite SiteMorse to get in touch and to work with the ATF with the aim of providing developers and their clients with more accessible solutions. I'll be waiting for the call.
Opinions expressed in comments are those of the individual commenters and do not represent opinions or policies of my company or our clients.
I hope you receive that call.
I have a deep-seated fear that, as governments legislate mandates that their own sites be accessible, a make-a-quick-buck contingent will start to create some very serious rifts amongst those who have worked hard on web accessibility for the right reasons. This sort of nastiness, in both directions, doesn't do anything to assuage that fear.
But your post does. Cheers to you for taking the high road on this.
My condolences and best wishes to all your countrymen in this sad and frustrating week, by the way.
#2 On July 9, 2005 09:08 AM Faruk Ateş said:
Oh wow, that's even more annoying than what the accessibility experts in the Netherlands are doing.
Here, they've gone and put an ad on TV (and are showing it regularly), but it wasn't until only a few months ago that these people had even the faintest idea of what web accessibility is like.
When I first encountered them myself, their site was a nested-table-ridden horror whose only accessibility touches were details like proper alt attributes and an absence of frames.
The national institute of accessibility in my country still had a nightmare of a frames website until about a month ago. Now it's no longer frames, but it's still an entirely table-based layout.
Andy - have they got in touch yet? I was thinking of starting a series of articles on accessibility snake-oil salesmen - there's a lot of them about and they're increasing in number.
Government circles seem especially prone to being attracted to push-button products (a box-ticking mentality), and it's another great reason for us not to have a Section 508-style checklist of legally mandated accessibility.
#4 On July 9, 2005 10:26 AM Andrew Green said:
To add to your worry, SiteMorse are becoming de facto auditors of UK government web sites, publishing monthly league tables of both central and local government web sites according to a number of criteria -- speed, metadata and accessibility among them.
As someone working with a local authority on the continued development of their site, it's hard to avoid these monthly rankings forming the basis of a performance plan, and whilst some local authorities may shrug off the SiteMorse reports, it's also likely that more will optimise their sites for SiteMorse in exactly the way that the accessibility community might fear.
#5 On July 9, 2005 10:30 AM Jeremy Freeman said:
Touched a raw nerve here with me.
I was having a conversation with a Red Ant guy a few months ago - asking why SiteMorse failed a web site, when clearly it was Treble A compliant according to Cynthia. (I've forgotten which one it was).
This was the gist of the MSN conversation I had:
Me - I see there is a connection with Red Ant and SiteMorse
Red Ant - yes - it's interesting... We've been using them for a while, they say we're the first design company to develop to the levels needed to pass their tests...
Me - I'm very dubious about their testing.
Red Ant - In what way?
Me - You can take a web site and get site morse to "test it" and they will give you a score. Ask them to test it again a few days later and it will be a dramatically different score. Then, and this is where it gets good, ask them to explain their scoring and the reason for the difference and they can't. Plus they cannot test above AA and even then they don't take into account the basics like alt text content ( as long as something is present they accept it ) and they don't check table use either.
Me again - It's like standing up and saying "You're not accessible, but I won't tell you why!"
Red Ant - true they can not test above aa - but most companies fail to meet 'a'. Their reports I have always found to be complete and accurate, With the exception of the reponse times & download times (which will of course differ) they always find the same errors.
The rest of the conversation went a bit off-track here.
Is there a connection with Red Ant and SiteMorse?
#6 On July 9, 2005 10:35 AM Richard Conyard said:
Well I hope they take you up on the offer. The industry at large co-operating and working together can only be a good thing.
To answer the last question: SiteMorse are clients of ours. There is no shared directorship or anything like that. A list of steps we took in early 2004 when the site went live, and my personal response to the PR, can be found on the Accessify Forum.
@ Jeremy Freeman: Hi mate. Your recent email to the UAWG triggered this post, thank you.
I don't want to swing off-topic onto Red Ant. This has been talked about a lot (and explained) on the Accessify Forum. Follow the Accessify links above and all will be explained. (Edit: Richard explains a bit above too.)
#8 On July 9, 2005 10:41 AM Richard Conyard said:
Sorry, I just caught the remainder of that posting - would you be able to let me know who you spoke to? Sending it through to Richard at the domain name will get through just fine :-)
#9 On July 9, 2005 11:00 AM Jeremy Freeman said:
Thanks will check the Accessify forum now...
@ Malarkey - excellent article. I too would like you to keep us up to date on whether SiteMorse reply to the ATF.
Reminds me of when I first got my hands on accessibility testing tools. I could use them in a helpful way or I could use them in a hurtful way.
While I certainly thought of creating a "Top 10 Best" and "Bottom 10 Worst" list based on my tools, I quickly rejected the public "Bottom 10 Worst" list. My philosophy: if I give a damn about making the world more accessible, then I'll put the spotlight on the ones who are doing things right (and give them kudos) while quietly and privately offering assistance to those who are on the "Bottom 10 Worst" list.
And as for a test that gives you a score but doesn't reveal the scoring mechanism...sounds quite questionable to me.
(puts on rose colored glasses)
Perhaps SiteMorse cares about accessibility as much as we do, but doesn't understand the harm they are doing. I hope a conversation (perhaps aided by beer) could help bring the energy of SiteMorse and ATF together.
#12 On July 9, 2005 04:28 PM Matt Robin said:
Andy: Good luck waiting for that call mate!
SiteMorse appear to be the accessibility Nazis - a sort of grotesque in the accessibility world who haven't demonstrated any coherent dialogue with the accessibility community to date.
If they can contact the WaSP Task Force and discuss matters, then it would be to their advantage AND it would also benefit accessibility developers to (possibly) make better web solutions for the web users.
I'll have to make another review of RedAnt (although I seem to remember something about them from before)....
#13 On July 9, 2005 04:37 PM Richard Conyard said:
Probably the last review Matt, was 6 months ago and pretty much the same as this one all round :-(
#14 On July 9, 2005 05:24 PM Sean Fraser said:
Whois Source gets you this:
100 Pall Mall
London, SW1Y 5HP
+44 870 759 3300
I'd telephone but I have a hibiscus that needs pruning.
Funny thing, though. Run SiteMorse.com in the Whois directory. It was created 2001-08-02.
Next, run www.business2www.com in the Whois directory. [Humour me.] It was created 2000-08-17.
The funny thing is that each of these sites is identical. Virtual hosting, anyone? Compare the content of the URLs and you'll see. Read the meta descriptions. Read the link URLs.
It's very interesting.
I would like to keep this conversation open and constructive please although I understand that SiteMorse have upset a great many of you.
In the interests of moving this forward, what questions would you like to put to SiteMorse via the ATF? What do you feel needs to be done with SiteMorse and perhaps other automated tests?
#16 On July 9, 2005 06:28 PM Northshore said:
I admit we use Sitemorse - but that's basically because as a local authority they've got everyone running scared by publishing those damn league tables (and may I just say they originally got sent to Chief Executives and not web teams so the first we knew about it was "why the hell are we ranked xyz - you said we were good") and hey presto - everyone started signing up to Sitemorse to try and improve their ratings.
Yeah we use Sitemorse - but not for accessibility....we use it to test for various coding issues we may have - all accessibility testing is done manually - and yes - even though we know a page is AA - Sitemorse fails it.
All validators give the page a pass - perfect XHTML - Sitemorse fails it.
A site we link to has a problem and their page is slow loading - Sitemorse fails our site.
What's more - Sitemorse only tests the first 250 pages of sites which means you can have 250 pages they think are great and pass, but page 251 onwards can be an utter disgrace.
From talking to colleagues in other authorities - we're all in the same boat - Sitemorse just can't hack it for telling you if your site is accessible (of the 30 tests it lists for AA, it only actually tests 9 - the rest are for you to do manually!) but as it's being used as the 'official' method for the government to assess local authority sites, none of us is willing to ignore them. Basically, it doesn't matter how good, usable or accessible your site is - all anyone sees is that damn league table every month.
Hmmm, interesting. Never heard of those guys, but it's a shame they seem to ruin their premise (which is probably true) with some overreaching statements and general negativity.
For instance, I am in total agreement that a lot of people who claim to be accessibility experts are more "soapboxers" than "experts", but I just question some of the methodologies SiteMorse may be using to do their tests. Take this line below:
"Since then, SiteMorse's tests continue to show that basics such as alt text for images (text describing what the picture is) have been continually left out."
I am personally of the belief that most images actually do *not* need alt text. The only time they always need it is if the image links to somewhere. Most other times, a blank alt attribute will suffice. And to be honest, I've always thought *no* alt attribute is more efficient than a blank one. Why screenreaders need a blank one, I'll never know.
So my point is that if SiteMorse is using tests like the one above to determine how accessible a website *really* is, they are probably testing the wrong thing.
#18 On July 9, 2005 06:53 PM Richard Conyard said:
So moving forward how about some of these:
An agreed statement that automated testing, whilst a more than useful part of accessibility testing, is only part of the process - detailing where it is useful, and the role of accessibility experts and/or accessibility/usability groups in confirming accessibility
A key set of common accessibility criteria and standards checkpoints to meet the minimum of automated tests, conflict resolutions over the more fuzzy items of the WCAG to be either agreed (unlikely), or catered for in the tests by all automated testing companies
A clear split in reporting of accessibility, standards compliance (especially if WCAG2 discounts standards), speed / size and value added tests
Companies agreeing to the statement to be noted within the WaSP site
An agreed statement of intent that all companies will work to the improvement and greater understanding of web accessibility issues within the marketplace
I don't believe for a minute everyone in the industry will get on, but maybe those would be a start.
#19 On July 10, 2005 06:28 AM Jude said:
Better understanding of the pros and cons of automated testing is vital in the gov / local gov arena. More openness and less attitude from SiteMorse would be a good start.
I second what Northshore said - my company develops a site for a UK local authority so I have to deal with these SiteMorse league tables monthly, and have no end of trouble with them.
Is it just me or does it seem wrong to test an entire site every month, build a league table which you claim is authoritative, and then make people pay to see a full report? Especially when we're talking about every local council in the UK. And even more especially when the report is *not* authoritative and contains numerous bugs.
The league table itself is very scant on information, and the free test is woefully inadequate (and even that requires registration now).
My favourite bugbears...
- They seem to rank sites primarily on the response and download speeds, which I don't see how they can possibly calculate fairly (in our case it seems we're being marked down very heavily even though speed is more than acceptable in normal usage).
- Their definition of what constitutes an 'Error' and a 'Warning' - I've had tests run where separating META keywords with spaces instead of commas is an error (care to show me an authoritative spec, someone?), while invalid XHTML just triggers a warning (in this case we found they were validating against a different DOCTYPE to the one we were actually using... I wasn't surprised by this point).
- Last time I checked, the user agent string was indistinguishable from IE - presumably to stop people serving different content to 'fix' their results; however, there are times when I'd like to know what and how they're checking so I can improve the site in general. There are more effective ways of stopping 'cheating'.
IMO the ideal situation would be a public, standardised accessibility checker that everyone could agree on and use. Forget trying to automatically check the stuff that no-one's invented the tech for (i.e. the manual checks). It can't be done yet, so don't publicly rate sites based on incomplete tech.
This is the sort of tool that should be out of the hands of an individual company and made open source (much like the w3c validators).
Failing that, SiteMorse *really* need to open up their methods - the way they work at present is simply unreasonable and makes everyone (including themselves) look bad.
I've just checked sitemorse.com and it seems they're providing a more detailed free report as of 13th July. Let's hope they also allow multiple users to register the same site (one of my sites is lost in the system and I can no longer get any reports on it, and can't re-register it under any username).
#20 On July 11, 2005 09:54 AM Grant Broome said:
Despite my anger at SiteMorse, I'll try to be objective in my response.
The way they criticise everyone else in web accessibility is terrible because it fosters bad feeling and adds a lot of confusion, but this isn't the worst thing they do.
The most damaging thing that SiteMorse do is publish league tables. Those who don't get the full picture believe that these league tables are an actual representation of the accessibility of their site. It's simply not true.
I've spoken to plenty of Public Sector people who do ridiculous things to optimise their sites for SiteMorse league tables, including stripping the headings from their entire site. In some cases the knowledgeable, accessibility-minded developer is told by his manager that only the rankings matter.
The situation is ludicrous.
But SiteMorse's league tables are their best marketing tool, so it's unlikely that we'll convince them to stop publishing them.
#21 On July 11, 2005 02:36 PM Mike Stenhouse said:
I think we're back to the question of education again, and the mere existence of automated tools such as Bobby and Cynthia is largely to blame. Both these tools state clearly that they are not the last word in accessibility, but how many people read the instructions?
SiteMorse seem out of line, claiming to be able to authoritatively assess accessibility when our best set of public guidelines clearly require human input, but without better dissemination of what accessibility actually means there's very little anyone can do about it. They've carried off a very shrewd marketing campaign, and if they hadn't done it, someone else would.
I was asked to write accessibility exams for a big recruitment agency here in London a while back but I couldn't think of a way I was happy with to assess this sort of thing on a right/wrong basis. The grey areas in the guidelines and the intended audience allow for far too many sensible answers.
In my opinion, the word needs to go out that validation is a useful tool but it's not an end in itself.
#22 On July 11, 2005 04:11 PM Robin Massart said:
Never heard of SiteMorse until I read the article on the WaSP site. I ran our company's site through it and found that we only scored 5/10, despite getting 10/10 on WAI - but 2/10 on errors, because two links to favicon.ico files were wrong! This is absurd.
Given that SiteMorse is obviously publishing league tables based on incomplete validation tests and heavily marking sites down for errors (even for invalid links to minor content that provides no accessibility gain), there must be a way to prevent this from happening.
The ATF probably has enough on its hands as it is, but perhaps the next step is to start discussions with the UK government itself. It should be made aware of a number of issues: that SiteMorse's tests are incomplete and therefore produce invalid rankings, and that these league tables give SiteMorse unfair leverage in gaining local government clients. Imagine if the school league tables were published by a private company which then offered services to improve schools! Any league tables should be published by independent authorities with no other business interests.
I feel this sort of unaudited league table is in danger of destroying the hard work put in by WaSP and others over the last decade or so to move the web in the correct and accessible direction.
#23 On July 11, 2005 04:29 PM Robin Massart said:
@ Richard Conyard
Could you please explain why on the Red Ant Design site, SiteMorse is listed under the heading "Valid HTML/XHTML - as defined by"?
I wasn't aware that SiteMorse defined the standards for HTML or XHTML.
I work for a company who are trying to make our client's sites accessible and compliant and am concerned that I've missed something.
#24 On July 11, 2005 04:37 PM Thomas Passin said:
Speaking of page validation, I tried out the new Firefox validation extension on this page (just for fun, I had just installed it). Turns out the comments allow a bare ampersand character. It occurs at
"With the exception of the reponse times & download"
I have written out the entity for the ampersand, but on the actual page, it is a bare ampersand, not escaped.
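The fix for the validation error Thomas describes is simply to escape raw ampersands as entities before writing them into the page. A minimal sketch, in Python purely for illustration (this blog's comment system is not Python, and the sample string is hypothetical):

```python
import html

# A bare "&" in (X)HTML text content is invalid; it must be written
# as the five-character entity "&amp;" (html.escape also handles < and >).
raw = "response times & download times"
escaped = html.escape(raw)

print(escaped)  # response times &amp; download times
```

Any comment system that did this on output would never emit the bare ampersand the Firefox validation extension flagged.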
#25 On July 11, 2005 04:53 PM Richard Conyard said:
They were used as a validator. I don't know why the guy who wrote the content wanted to put them in there, but two years ago when the page was written it obviously seemed like a good idea.
Firstly, if an effort were made to persuade those publishing the league tables that the SiteMorse tests aren't as useful as they seem, it would need to be directed at the Society of IT Managers, usually known as SOCITM.
Secondly something we've discovered about SiteMorse's stats.
Take a page which does almost everything to comply. Not everything but, say, 90%. It's nearly there. Because it doesn't comply fully, SiteMorse marks it as a fail. If that small error is repeated across your site, your whole site fails. Your managers are told that you are failing utterly to meet government requirements, when in fact you're almost there.
Technically that interpretation of the stats is a correct one, but it's also a wilful misrepresentation of the true situation, no doubt designed to drum up business. Stating that 90% of requirements were met would be equally correct from a statistical point of view, and give a more realistic view of compliance.
We discovered this in relation to e-GMS (metadata) rankings, but doubtless something similar happens across the board. Certainly our site fails to meet AA in their tests largely due to an utterly insignificant validation error introduced onto every page by ASP.net.
Many thanks for all of the constructive comments. I have a request to make. Would there be someone out there who works for a UK council or similar body and who is a SiteMorse customer (receiving their full report) who would send me a copy?
If the report is web based, a PDF rip would be just fine. Of course the report and the provider will be treated in the very strictest confidence. Please send it to me at the usual address or phone me on the usual number if you can help or if you have any concerns.
First, hi Mike!
Second - you ask about the reason for using a blank alt attribute rather than not having one at all. I agree, it seems superfluous. It should be optional - up to the author's discretion. If the image requires an alt attribute then the author should add one that is appropriate. However, the fact is that the people who benefit most from alt attributes - the blind community - are using tools that are not universally clever enough to simply ignore a missing alt attribute. Some will try to make up for the missing alt with whatever else they can get their 'hands' on, which may result in the screen reader saying "image, image, image, image" or, much worse, reading out the filename. Anyone who was at @media 2005 will attest to how bad that can get - go take a look at the filenames for images on Amazon.com and imagine those filenames being read out!
One of the challenges for the WaSP ATF is to get into more discussion with the screen reader vendors about issues such as this. It seems, to me at least, more sensible for a screen reader to ignore an image that has no alt attribute rather than make a botched second-guess (as they tend to) instead. Can we influence them to change that? Maybe. Possibly not. Either way, any change is not going to be used by the majority for a long time. Unlike browsers, screen readers are not free and are updated by the user infrequently - there has to be a very good reason for upgrading.
So, to summarise, empty alts are a bit sucky, but we should expect to be stuck with them for a long while yet.
Oh, and of course there's the small matter of the alt attribute being a compulsory attribute according to W3C recommendations, so even if screen readers were clever enough to work out what to do when no alt attribute is present, we'd still be failing validation.
#30 On July 12, 2005 02:04 PM Northshore said:
You may be interested to know that this debate is also raging elsewhere - http://www.publicsectorforums.co.uk/page.cfm?LANGUAGE=eng&pageID=1289
- and they link to a lot of the original articles on this issue.
#31 On July 12, 2005 02:12 PM Chris Hunt said:
Surely [img alt=""] is explicitly telling a reader "here is an image, but if you can't see it don't worry about it". [img] just says "Here is an image", with no alternative specified screen readers have to do the best they can.
I'm not sure what to think about the Sitemorse thing. I've just registered for the free service on my sites, the reports seem pretty good as far as they go. It's also valid to point out cases where accessibility experts don't "eat their own dogfood", if that really is the case. Mind you, they'd do better to put their own house in order first. Table-based layout anyone?
What are they saying, Northshore? I don't have a public sector email suffix so can't view it.
Would be interesting to see if they want to put pressure on SiteMorse, too.
#33 On July 12, 2005 03:50 PM Northshore said:
Source - Public Sector Forums
Another hornet's nest of discussion and disgruntlement in the web accessibility world has been stirred up once more by the makers of SiteMorse, famous for their monthly league tables of local authority website performance, function & WAI compliancy rankings. PSF takes a quick look.
Last week, SiteMorse, the remote testing tool company, issued a statement suggesting a number of organisations offering accessibility advice had 'large gaps' in their 'expertise' and asked 'How can everyone else be expected to achieve website accessibility, if the experts can't?' after having run its own tests over a series of sites belonging to 'leaders in the field'. Organisations such as the RNIB, DRC and a plethora of commercial companies offering accessibility services were featured.
The statement is the second such anthology of criticisms SiteMorse has levelled at these organisations, and predictably - like its predecessor described six months ago on PSF - caused major grumblings among those at work in the sphere.
Having reviewed some of the ensuing discussion we feel this is an episode worth covering because some of the technical and other issues raised by the assorted correspondents contain valuable educative points likely to be of use to those in the public sector tasked with ensuring the 'e-accessibility' of their websites. And while neither condemning nor condoning SiteMorse's remarks we do feel debates such as these are themselves helpful in providing greater and more detailed insight into the specifics of web accessibility.
Here then are some of the recent comments and takes on the latest blast from SiteMorse, together with the 'offending article' itself.
They all make for an interesting read.
Links to all articles Malarkey links to above
#34 On July 12, 2005 04:05 PM Vincent Marcello said:
I still feel what SiteMorse did was to garner publicity for themselves. I recently wrote something about this on a blog and a person replied stating that SiteMorse is correct in pointing out errors on 'experts' websites.
What they failed to grasp is two key things here. One of them is the publicity angle that SiteMorse was after.
The other is that SiteMorse did not contact the people they were mentioning in their report, which would have given organisations such as GAWDS a chance to either refute or fix any 'so-called errors'. IMO this was wrong, and I can only conclude that it was done partly as an attention-seeking publicity stunt.
Chris wrote: "Surely [img alt=""] is explicitly telling a reader "here is an image, but if you can't see it don't worry about it". [img] just says "Here is an image", with no alternative specified screen readers have to do the best they can."
Precisely my point. Without the alt, screen readers can only guess, hence the reason for having to specify an empty alt. My point is that I tend to agree with Mike Davidson that an empty alt seems a little superfluous; it would be nice if we didn't have to enter such empty information into tags and the screen reader (or other assistive device) were capable of handling it appropriately, but that is just the way it is - machines are still basically stupid. It's a required attribute, so we'll all continue to enter alt="" for a long time yet.
Vincent, #34, you may be talking about my assertion that the DRC has no excuse if their site doesn't validate. Regardless of SiteMorse's motive (and you'd have to be dafter than me to fail to grasp that) or how minor the error, it's an irrefutable fact.
Of course SiteMorse only did it for the publicity, and I never suggested they were correct in publicising the errors, but take them out of this completely and you're still left with the fact that many prominent websites related to accessibility and web standards don't validate. Is that a serious problem? Judging by the way we push the importance of validation I'd say so, since valid markup is a sound foundation for an accessible site.
#37 On July 13, 2005 08:18 AM Northshore said:
And this is what all the hoohaa is about for the public sector - the monthly league tables - here's the latest hot off the press.
So how many webmasters are getting their arses kicked at the moment because this lands in Chief Executives' inboxes - and no matter how much you try and explain its failings or inconsistencies etc., it doesn't matter - "we need to be higher next month or else" - cue much rejigging of sites to pass the ranking, not to improve accessibility or usability.
Spot on there Northshore.
I launched our Council's redeveloped website in March, and there was no interest internally in the fact it used web standards and had vastly improved accessibility over the old site. When the SiteMorse league tables emerged for April it was suddenly good news since we'd jumped up so far in the table.
The fact is that people like to measure things, especially in government, and these tables provide what looks, to those who know no better, like a way of measuring the relative quality, performance and accessibility of websites. Of course we all know better, but communicating that to non-web folk is very difficult.
Having said that it is true that a well-structured, accessible site using valid, semantic markup will achieve a pretty good SiteMorse rating. However, a good SiteMorse rating most certainly does not equate to an accessible site...
#39 On July 13, 2005 10:23 AM Chris Hunt said:
Ian: There's a difference between saying that an image is unimportant and not saying anything. If you don't say anything, the screen reader can't just assume that the image is unimportant (well, it could, but it would be wrong as often as not).
It's like the distinction between null values and zero-length strings in programming practice - there's a difference between cases where information is absent, and cases where it's present but has a value of "nothing".
#40 On July 13, 2005 11:08 AM Vincent Marcello said:
Hi Dan (36), yes indeed I was referring to your post on So-Net. As for me being dafter than you (there is a strong possibility), that is purely subjective ;-)
I agree accessibility experts' websites should set an example and validate if the claims are there. At the same time, every error cannot be foreseen in a medium that is still relatively 'cutting its teeth'. Now, SiteMorse has contacted the DRC before about their errors, fair enough, but isn't this whole business about SiteMorse trying to gain publicity for themselves, and a sour grapes issue over the DRC rejecting them on another matter?
I agree. I agree, agree, agree. You have to tell the dumb old screen readers what's happening, and I know that there's a difference between the two scenarios. But what I also agree with is Mike's original point about it being nice if you didn't have to enter an alt, but no screen reader is clever enough to deal with that. So, alt attributes are a must. Please don't misunderstand me - I'm not suggesting people don't use them, god forbid. I'd rather cut my own arm off than do that. Well, maybe that's going a little too far ...
#42 On July 13, 2005 02:31 PM Chris Hunt said:
But it's not a question of screen reader cleverness - you don't have to be clever to treat the absence of an alt attribute the same way as you'd treat alt="". You just have to be wrong.
The screen reader examines image attributes to answer the question "what, in this image, is important to a blind visitor?". <img> tells them nothing, <img alt=""> tells them "nothing". There's a difference. No amount of cleverness (short of some super-AI image analysis routine) is going to enable a screen reader to evaluate whether an image is important or not.
Maybe one day content providers will be so good at marking up alt attributes for those images that need them, that we'll no longer need to explicitly say "I've thought about this and there's no reason for screen readers to worry about this image". Absence of an alt attribute really will be equivalent to alt="". It's the writers that need to become cleverer, not the readers.
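The null-versus-empty distinction Chris draws can be modelled in a few lines. A minimal Python sketch - the function and its filename fallback are my own illustration of the behaviour described in this thread, not any particular screen reader's actual logic:

```python
def screen_reader_text(alt, filename):
    """Model the thread's description: alt=None means the attribute is
    absent from the tag; alt == "" is an explicit 'nothing to see here'."""
    if alt is None:
        # <img src="..."> with no alt: the reader is left to guess, and
        # many fall back to announcing the filename.
        return f"image: {filename}"
    if alt == "":
        # <img alt="">: explicitly decorative, so say nothing at all.
        return None
    # Meaningful alternative text is read out as-is.
    return alt

# <img src="logo.gif" alt="Company logo">  ->  "Company logo"
# <img src="spacer.gif" alt="">            ->  None (silence)
# <img src="x8fj21_a.jpg">                 ->  "image: x8fj21_a.jpg"
```

The distinct `None` and `""` branches are exactly the null-versus-zero-length-string difference from Chris's programming analogy.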
Thanks for sharing that league table link with us, Northshore. I was wondering if it was visible anywhere without asking SiteMorse. It's really weird though - compare top-ranking Thurrock (plug-ugly, not-great markup) with "null points" Dungannon (dull-but-ok looking, terrible markup). I don't think a human evaluator would mark them too far apart. They could well rank Dungannon higher. Just shows how daft the whole thing is I suppose.
Thank you for all your thoughts and comments. I'm going to take the liberty of closing replies now (Ed says: Who do you think you are, Malarkey, Lord of Comments?)
I'm sure you're going to see very positive things happening.