

Weblog Entry

The Standards Police

June 09, 2004

The Standards Police will get you! — Brian Garside offers advice for those who would validate others’ work and follow up with an email pointing out the errors: don’t.

Further from Brian:

“It doesn’t help anyone, we’re all aware of our defficiencies [sic], and could probably point 100 more out on top of the 10 you point out. Not only that, but you have no idea what kind of conditions we’re working in, or what else we’re trying to do (or the fact that we’re building a whole new standards compliant, CSS based site behind the scenes, but Rome wasn’t built in a day…at least it wasn’t built properly in a day).”

Instead of spending the time it takes to hunt down a site owner’s email address, compose your message, and post about it on your weblog, why not spend that time doing something useful? Write a how-to article on validation. Contribute bug reports to the major browsers. Volunteer to build a site for a non-profit group. Find something that will add value to the overall message of web standards, instead of detracting from the work of others.

The time spent sending bad vibes and turning site owners off of your message can be harnessed for better uses. Why not try?


Reader Comments

June 09, 04h

The only circumstance in which I am routinely tempted to validate someone else’s pages is when I encounter an article on Validation or Web Standards, or the business site of a designer who prominently claims to code to Standards.

That’s when I hit the old W3C bookmarklet.

Mostly, this is to reinforce a smug sense of superiority. I almost never write the author …

Dave S. says:
June 09, 04h

God bless your twisted little heart, Jacques.

(that’s the only time where it’s warranted, and even *deserved*)

Keith says:
June 09, 04h

You know, I’ve always wondered at the people (no offense Jacques) who check other people’s sites for validation and then complain (or even remark) about it.

I mean, my site isn’t 100% valid all the time (cough, comments) and I’ve gotten my share of this kind of mail and feedback and I’m always left asking myself why these people care so much about my site’s validation.

I guess I could see making a remark if it was way off, but these people who nit-pick the small stuff need to find something else to do.

You can write about Web standards, code with Web standards, be all for validation and still not have a 100% valid site – nor should you be expected to. Total validation is very hard and takes quite a bit of effort.

Validation doesn’t pay the bills, it doesn’t mean your site will work cross browser or provide a better user-experience. Validation doesn’t make your content better and it doesn’t make you a better designer or developer.

All validation does, right now anyway and from a practical standpoint, is help you troubleshoot problems and assure the quality of your code. IMHO, it’s not that big a deal. Important, sure, necessary, no way.

The day an unencoded ampersand causes any real harm is well off, if it ever comes.

Validation police? You know what Cube says – F the police. ;)

Matt says:
June 09, 05h

I strive for all my sites to comply with the spirit and letter of web standards as much as possible. Mistakes happen, and if a page is invalid I truly appreciate it when someone emails me letting me know. It’s usually a 30-second fix anyway. Why the pushback?

Keith, if you don’t believe that standards have intrinsic value – that is, value beyond merely what they can “get you” – then why bother at all?

AND REMOVE THE VALIDATION LINKS FROM YOUR PAGE.

(Sorry, but aren’t those links just an invitation for people to check you? If you don’t want that, don’t have them.)

Eli says:
June 09, 05h

Gotta disagree here. I think most people who work on the web don’t know how badly their sites fail validation (I know I didn’t before a couple of years ago.) Taking the time to inform a site’s developers about errors should be looked at positively…after all, they’re trying to help you make the site better.

Also, there seems to be an idea out there that validation never reveals anything useful, anything that actually breaks the display in one browser or another. That is simply not true. Validation has helped me quickly track down display bugs many times, especially in data-driven sites where noncoders may be generating content.

So keep pointing out the validation errors, but do it in a constructive and respectful manner.

Paul D says:
June 09, 05h

Leaving aside the 100 reasons for using bad code, or the 101 reasons for using valid code, I see many reasons that pointing out invalid code can be helpful.

1. The website owners may not realize their site has fragile code. If a brick-and-mortar store’s sign was burnt out, or their parking lot difficult to navigate, wouldn’t you consider mentioning it?

2. Analyzing a website’s code is a valid instructional tool. See Joe Clark’s analysis of Canadian election websites for an example.

3. A website may not work properly in your standards-compliant web browser, and pointing out the invalidity of the code may help the owner fix it.

4. If someone wants to improve the Internet by promoting XHTML and CSS, and that involves complaining about your website, so what? Take it like a man (or woman as that may be).

That said, my own outdated site isn’t too standards-compliant, so don’t bother analyzing it. :)

Keith says:
June 09, 06h

Matt – No need to yell. WTF was that all about?

I’ve got those validation links there because, much like you, I strive for validation on my sites as much as is practical. Do I achieve it? 98% of the time, yes I do. Do I need to hear about the 2% that doesn’t validate, especially when it’s inconsequential? No, thank you.

It gets really old, really quick, and luckily it’s been a while since I’ve had to deal with it. I took steps to ensure a reasonable level of validation, and the only problems I have now are very minor and, frankly, not worth the effort to tackle every time.

There are validation errors I should know about, and then there are those that make little difference. Thirty 30-second fixes turn into quite the pain in the ass, don’t you think?

Also, those validation links aren’t an invitation to check my site, find a little error and proceed to call me a hypocrite. Those damn links have been discussed to death, so I’ll stop there.

Anyway, I never said that I “don’t believe that standards have intrinsic value”, I wasn’t talking about “standards” at all. I was talking about validation specifically.

If you’re going to quote me, get it straight, please. You make it sound like I’m not an advocate for Web standards which of course I am. I guess that is the root of my problem with “standards police” – it’s like if you don’t validate 100% you shouldn’t advocate.

I don’t mind when “people check me” – what gets me is when someone checks, finds something like an unencoded ampersand, then proceeds to rip me for not being valid while at the same time preaching Web standards.

That, my friend, is BS. And it’s my experience with people giving feedback on validation. It’s not usually like Paul D says. It’s not usually friendly.

I kind of thought that was the point of this topic. You know: “Your site doesn’t validate, I caught you! You nasty non-validator!”

Sure, if I’ve got a major error on one of my sites and someone wants to take the time to validate and let me know the results in a helpful way, that’d be just fine.

Pointing out invalid code can be helpful – or it can be a nit. I’ve had way more nits than I’ve had helpful feedback on validation.

Guess I’m bitter. Sometimes I wonder why I bother…

June 09, 07h

Personally (and this is just me), I don’t have a big problem with people emailing me to let me know if a certain page (or rather, blog entry) on my site doesn’t validate. I strive to make all my pages validate anyway, and like to fix any errors that I run across. However, it does rub me the wrong way when they go so far as to post a comment to the related blog entry to point errors out. (These types of “corrections” seem more pertinent to typos – you know what I mean, Keith.) That’s not the place for it.

However, I rarely ever validate someone else’s code. Personally, while I do my best to advocate standards-compliance and CSS-based design, I have better things to do than to check if your code validates. I’ll leave that to you, and will realize that there may be a good reason if it doesn’t.

Perhaps I’m riding the fence a bit, here. Basically if someone points out an error with my site, I’ll fix it, but I won’t go around validating your code and riding your backs…

June 09, 07h

Keith wrote:

“The day an unencoded ampersand causes any real harm is well off, if it ever comes.”

‘Scuse me?

Compliant XML parsers are *required* to stop processing at the first unencoded ampersand (or other well-formedness error). So, whether you are producing an Atom feed, doing Web Services, or (yes) serving XHTML with the correct MIME type, your content had *better* be well-formed.
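That requirement is easy to demonstrate. Here is a minimal sketch, using Python’s standard-library XML parser purely for concreteness; any conforming XML parser behaves the same way:

```python
import xml.etree.ElementTree as ET

good = '<p>Fish &amp; Chips</p>'   # ampersand properly encoded
bad = '<p>Fish & Chips</p>'        # bare ampersand: not well-formed

ET.fromstring(good)  # parses without complaint

try:
    ET.fromstring(bad)
    halted = False
except ET.ParseError:
    halted = True  # a conforming XML parser must stop here
```

Serve that `bad` markup as application/xhtml+xml and a browser’s XML parser is obliged to show an error instead of the document.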

Producing valid content is 95% having a good workflow in place. It has very little to do with “repairing” stuff after the fact.

(Unencoded ampersands, Windows-1252 characters … those are, or so you say, the bulk of the errors on your site. Why are they even there? Those are fixable programmatically, and *should* be fixed *before* they ever make it onto your site. There’s no need to bellyache about how tedious it would be to fix them by hand afterwards.)
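As a sketch of what “fixable programmatically” can mean, here is a hypothetical filter (the regex and function name are my own, not from any tool mentioned here) that escapes bare ampersands while leaving genuine entity and character references alone:

```python
import re

# Match '&' only when it is NOT the start of an entity reference
# (&amp;) or a character reference (&#8212; / &#x2014;).
_BARE_AMP = re.compile(
    r'&(?!(?:[a-zA-Z][a-zA-Z0-9]*|#[0-9]+|#x[0-9a-fA-F]+);)')

def escape_bare_ampersands(text):
    return _BARE_AMP.sub('&amp;', text)
```

Run once over content before it is written out, this removes a whole class of errors at the source rather than after the fact.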

A web designer who promises Standards-compliant work ought to have a workflow commensurate to producing Standards-compliant output. If their own site doesn’t validate, what does that say about the work they do for clients?

(And putting up “Validate” links on a site that they *know* is invalid is bordering on self-parody.)

As I said, I don’t usually contact people whose sites don’t validate because I figure that, in most cases, they don’t actually *care* (though you’re the first to say so explicitly and prominently on your site).

If it’s a friend, who I know *does* care, I might send them a friendly “heads-up.” And I expect them to do the same for me (indeed, I tend to get *piles* of mail on those rare occasions when something on my site doesn’t validate).

Matt says:
June 09, 07h

Keith, you’re right. Sorry for the harsh words, they weren’t really directed at you as much as they seemed. We all try hard and some days are harder than others. I still appreciate when people email me validation errors though, even if it can be annoying at times.

June 09, 07h

Hi,
I’ve been guilty of this. I validate many (not all) sites that I visit, just for something to do, but I don’t let every single one know that they’re invalid. There have been a few (about 3 or 4) that I’ve sent polite emails to over the last few years, letting them know the problem, but generally only for sites that have just gone through a complete rebuild, and generally only if it’s a serious problem.

This reminds me of the time when W3C’s Validator failed to validate, which I discovered while attempting to show a friend how a standards compliant site validates. I let them know with a polite email [1] explaining the problem, and it was promptly fixed in a few days. Sure, this was a minor error, but since it was the validator, it made them a little hypocritical. Ordinarily, I wouldn’t report such a small error to anyone.

Finally, for my website, since (when it’s finished) all documents will be beautifully styled XHTML 1.1 served as application/xhtml+xml, I have no choice. Either it validates, or it’s unreadable. For IE users, or any UA that only supports text/html, I’ll be delivering virtually unstyled HTML 4.01 or XHTML 1.0 Strict. So, I would expect visitors to let me know if they can’t view a document, but I’ll usually validate before posting anyway, so that’s unlikely to happen.

[1] http://lists.w3.org/Archives/Public/www-validator/2003Aug/0004.html

June 09, 08h

Lachlan Hunt wrote:

“This reminds me of the time when W3C’s Validator failed to validate…”

I know all about that! One of the things I do is run comments through a local copy of the W3C Validator. It was a bad thing when people couldn’t “preview” their comments because the Validator was producing ill-formed output. Fortunately, that was easy enough to fix …

I wrote:

“Producing valid content is 95% having a good workflow in place. It has very little to do with ‘repairing’ stuff after the fact.”

To illustrate this, let me point you to a site I set up six months ago for some young colleagues of mine:

http://golem.ph.utexas.edu/string/

I designed the site, and tweaked the CMS software (MovableType). But *they* produce the content. I have nothing to do with producing the content or checking it for validity. *None* of them knows or cares about Web Standards. But, as far as I know, every single page of theirs is valid XHTML.

If *they* can do it, anybody can.

June 09, 08h

“If *they* can do it, anybody can.”

Jacques, this isn’t true. For instance, this works for a smaller-scale site by tweaking a tool that was built with standards in mind (MT). However, many people are tied to large CMS’s that produce some nasty stuff, even after tweaking.

For instance, over at the CSS Vault this week a site was featured that was developed around a CMS, and they did their best, but some things didn’t validate. The first comment was to the effect of “His site doesn’t validate, but don’t worry, I e-mailed him about it.” The comment came from a guy with no URL or work to show for himself. Is freelancing as an anonymous validator a profitable gig now? I hear time and time again of newcomers to the standards scene getting turned off by the first comment out of the box saying “It doesn’t validate” with a “smug sense of superiority.”

I think I am a little too busy over here in the real world. Oh and by the way, I think Keith is too. Maybe we should say thank you for the site he produces on his own time for nothing other than spreading the word.

June 09, 08h

Brad Daily wrote:

“However, many people are tied to large CMS’s that produce some nasty stuff, even after tweaking.”

Obviously, in a production environment, where you don’t get to choose your (CMS) tools, and have no control over what the client supplies as content/data, there’s not much hope. You do your best, and that’s all anyone can expect.

If, however, we’re talking about a personal or business site, then you *do* have a measure of control. Why would you opt for a crappy CMS to run your own site?

“Maybe we should say thank you for the site he produces on his own time…”

No doubt.

As I said, I don’t go around haranguing people about whether their site validates. There’s no *point* to doing so.

Keith says:
June 09, 09h

Matt – No worries. No harm – no foul. I think I see where you are coming from, but I think we need to remember there are all kinds on the Web and for some, things like standards are hard enough to grasp without waving around validation as well. You know?

Jacques – you make a good point, however, the majority of my work involves design and content for display in a browser. And, for the record, the unencoded ampersand thing was an extreme example, nothing more.

Considering IE’s support for Web standards the fact that I even make the effort at all should be worth something. I mean it is, by far, the major delivery method of my work.

But I guess that’s not good enough for you? Give me a break.

If unencoded ampersands on a Web page choked IE, the whole damn Web would break. I think we have some time before validation is a real-world concern. Let’s just tackle proper support first, shall we?

Valid Web pages are the exception right now, not the rule.

I support Web standards more than most, I strive for, and care about to a certain degree, validation and I do better than many.

I wouldn’t be having this conversation if I didn’t care some about validation.

But for crying out loud, it’s that “smug sense of superiority” that keeps people from wanting to bother with Web standards at all. I know I’m damn near the end of my rope advocating for standards.

I mean I ask myself why I bother…

My words aren’t good enough for some, fall on the deaf ears of others or don’t show enough value in Web standards to the rest.

Try being an advocate, a voice for standards, in Microsoft’s backyard. Most people around here just don’t give a damn and it sure as hell doesn’t help me get work.

I try to look at Web standards in a real world, practical sense. It seems all too often that “real world” and “Web standards” don’t belong in the same sentence.

I’ve been a Web designer and developer for about 10 years. I know people who’ve been at it, as working professionals, longer than I have who haven’t the faintest clue about Web standards, let alone validation.

Why should they care if their site validates? Why even listen, especially when people come at them with smug, know-it-all, little jabs? Or expectations (like 100% validation) that they’ll never realistically meet?

Sunny says:
June 09, 10h

Web standards are a process, and validation is a crucial part of that process. It’s important, but not an absolute necessity.

The problem, really, is all the links that we put on our sites inviting every passer-by to validate our code. If you cannot guarantee 100% valid code all the time (frankly, it’s impossible except in highly controlled instances), it would be wise not to put up these links. If it’s just to debug, then do it client-side instead of putting it out there.

We wear it as a badge of honor and then flinch when somebody points out a problem. We can’t have it both ways.

June 09, 10h

Keith wrote:

“But for crying out loud, it’s that ‘smug sense of superiority’ that keeps people from wanting to bother with Web standards at all.”

Oh, c’mon! No it’s not.

What keeps people from wanting to bother with Web Standards is *Internet Explorer*. Knowing that their hard work would be largely wasted on 90-odd % of their audience is what keeps people away.

Saying “You gotta run your pages through the Validator and escape all those unescaped ampersands (and BTW, here’s some code to do it for you automatically).” is not what sends people screaming from the room.

Saying, “And then, even after you’ve done all that, without the following convoluted and fragile CSS hacks, your pages will *still* look like crap in IE,” *that* is what sends people screaming from the room.

You really won’t do a *thing* to change that reality by saying, “Relax, don’t worry whether your pages are valid (X)HTML. It doesn’t really matter.”

Ryan says:
June 09, 10h

The only time I click a validation link is when it’s my own. I’m not so concerned with it that I would sit there, every day, clicking the link every minute just to make sure some markup goblin didn’t get to my code and scatter it all about. However, I do have a tendency to click it after adding entries, just to make sure I didn’t put in some silly character that messes up its validity.

Besides that, the only time I click it is when I’m on a site that has the W3C validator link and I want to go to the W3C homepage. This is mostly a laziness thing; I have it down: mouse in hand, click link, TAB, Enter.

June 10, 01h

> If unencoded ampersands on a Web page choked IE, the whole damn Web would break. I think we have some time before validation is a real-world concern.

People have said that before, and a new browser version comes out and they have to scramble to fix everywhere they made the mistake.

http://www.htmlhelp.com/tools/validator/reasons.html

What if, for example, one of Google’s updates caused it to stop reading your pages because of one of the errors you think is unimportant because they work in mainstream browsers? Don’t say “Google would never do that”, as these things aren’t usually planned, and Netscape managed to do it quite a few times when it was the majority browser.

jacob says:
June 10, 02h

If I may ignore the larger discussion and follow the tangent on comment validation: If you have reason to believe that your visitors aren’t likely to know or care about HTML, but you want them to be able to format comments, consider using a simplified markup syntax like Textile or Markdown; your commenters benefit from the intuitive formatting syntax* that hews closely to that used in plain text email, and you get valid XHTML output for free.

Another solution, used here, is to disallow HTML entirely, of course. In many cases, that’s perfectly appropriate.


* - To be fair, the hyperlinking syntaxes used by Textile and Markdown are admittedly no easier to use or remember than HTML.
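The “disallow HTML entirely” route is the simplest to get right, since escaping everything the commenter typed yields well-formed output by construction. A rough sketch, assuming a site that wants bare paragraphs and nothing else (the function is illustrative, not any weblog tool’s actual code):

```python
import html

def render_comment(raw):
    # Escape all markup the commenter typed, keep blank-line
    # paragraph breaks, and emit one <p> per paragraph.
    paragraphs = [html.escape(p.strip())
                  for p in raw.split('\n\n') if p.strip()]
    return '\n'.join('<p>%s</p>' % p for p in paragraphs)
```

Whatever tags or stray ampersands a visitor types come out as harmless, valid character data.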

Mike D. says:
June 10, 04h

Jim,

Yes, I did read the link you included. For those who didn’t click it, here are its contents, quite literally: “4 Reasons to Validate Your HTML… 1. Netscape 1.2… 2. Netscape 2.0… 3. Netscape 3.0… 4. Netscape 4.0”. Very insightful and useful stuff, isn’t it? Of all the good reasons to write good code, they come up with basically one reason instead: a now-defunct company kept releasing browsers which broke stuff (valid and invalid). Somehow that counts as four reasons, I guess.

The fact is, IE for the PC broke as much valid stuff as Netscape X.X broke invalid stuff. Wanna use the W3C float model? Well guess what? It’s going to break in IE PC. You can hack around it and fix it, just like you can fix any invalid code, but the point is that validating your HTML doesn’t buy you nearly as much as you think it does. It makes you a better coder, and it makes you generally more competent at what you do, which is all good, but this “sky is falling” mentality of standards purists has got to go. The sky is not falling, and as the collective skills of the web design/production industry get better, your concerns will take care of themselves.

As for Google, their manifesto is simple: be the largest, most relevant source of indexing on the web. They want to index the most pages and rank them in a way which helps users get to what they are after. They aren’t out to “promote valid pages” or shut out sites which make errors. And that is why they will always remain as tolerant as possible. They don’t care about code, only content. Yes, semantics are supposed to help out with page ranking, but right now, they don’t help as much as people think they do. Google got to where it’s at by using technology which relied on people, not tags. The general concept is, the more people who are linking to this page, the more important it must be. Not so much “the site with this keyword in the H1 tag is most relevant.”

Jacques,

You said: “If HTML validity doesn’t ‘matter,’ then what *does*?”

This rhetorical question is exactly the problem that I have with the validation-junkie mentality. Do you really think that *validation* is the point of the web? Oh my. I can think of 1000 things more important on the web than validation. The point of the web is to communicate, to entertain, to inform, to amuse. Validation is but a best practice in achieving these objectives. Not that the two are mutually exclusive at all, but I’d take a site which achieves all four of those objectives and contains 10,000 validation errors over a site that achieves none of the above and is valid XHTML Strict. Let’s please not lose sight of the higher-level objectives of the web here.

And yeah, it’s nice that Mr. Validation provided a link to one of your invalid pages. I’m sure there are plenty more. When we redesigned ESPN.com last year, I got into a pretty heated e-mail discussion with a certain unnamed purist involved with WaSP, only to run his entire site through the WDG validator (they let you validate entire sites… nice!). Almost every single page had validation errors on it.

Practice what you preach… that is all we ask.

June 10, 04h

Since most bloggers indicate that it’s their comments that tend not to validate, shouldn’t the blogging tools (such as WordPress and MovableType) provide XHTML validation and possibly prevent users from posting until their comment is valid?

For sites like these that don’t offer the ability to enter HTML, the tool should simply generate valid code out of the box.

For those that argue that their CMS tool doesn’t generate valid code, maybe it’s time to either a) put pressure on these CMS companies to do so or b) switch tools.

Mark says:
June 10, 05h

A more recent example of “new browser, stricter rules, breaks noncompliant markup” is when IE 6 came out and centered text in everybody’s table-based layouts.

http://evolt.org/article/Does_IE_6_Center_Your_Table_Content/17/15341/

This affected several blogging systems whose default templates relied on the old lax behavior.

IE 6 SP 2 is just around the corner. Despite the version number, it’s a major upgrade. Will you get bitten again?

http://msdn.microsoft.com/asp.net/using/understanding/security/default.aspx?pull=/library/en-us/dnwxp/html/xpsp2websites.asp

June 10, 06h

Rather than beat up on Keith (not my intention, anyway), maybe I should point him to some resources for *fixing* his Validation problems.

Unescaped ampersands: http://www.estey.com/mt/index.cgi?SafeHref
“Garbage” (Windows-1252) characters: http://golem.ph.utexas.edu/~distler/blog/archives/000347.html
Validating comments: http://golem.ph.utexas.edu/~distler/blog/archives/000155.html
MT-Validator (the “Zeldman Edition”): http://golem.ph.utexas.edu/~distler/blog/archives/000370.html

There, how’s that for positivity?

June 10, 06h

“This rhetorical question is exactly the problem that I have with the validation junkie mentality. Do you really think that *validation* is the point of the web?”

Not of the Web (nor of the internet, nor of life), but it is *surely* the point of Web Standards. I am *baffled* that anyone can “advocate Web Standards” and simultaneously hold that validation doesn’t matter. Maybe you mean something different by the term “Standards.”

“And yeah, it’s nice that Mr. Validation provided a link to one of your invalid pages.”

The page in question was one where I added a footnote and forgot to renumber the corresponding “id” attributes. Hence I had two <div>s with the same “id”. Changing a “2” to a “3” and rebuilding the page fixed the problem. It took longer for me to type this explanation than it took me to *fix* the problem.

“I’m sure there are plenty more. …
Practice what you preach … that is all we ask.”

Absolutely!

Find a page on my site that’s invalid, let me know, and I will promptly, and cheerfully *fix* it.

June 10, 07h

Making a web page valid is so ridiculously simple that I *don’t* understand how people can make it invalid and let it stay that way. Instead of complaining about other people’s constructive critique about validation, I think site owners should use the time to fix the problem.

Come on. Validation is simple. Really.

June 10, 08h

“I mean, my site isn’t 100% valid all the time (cough, comments)…”

Jacques pointed to a tool for validating Movable Type comments above. There’s also Simon Willison’s simple PHP solution:

http://simon.incutio.com/archive/2003/02/23/safeHtmlChecker

I, too, validate my comments. So does Anne van Kesteren. I’m sure there are others. There’s just no reason for comments to get in your way.

Put me squarely in the camp that wants to be notified when my site doesn’t validate. Even the most vitriolic comment is useful if it alerts me to a problem I may not have otherwise noticed for weeks. Silly me, I’m still flattered that people read my weblog, even when they’re assholes.

June 10, 08h

If I peer into the older archives of my website, I am sure I’ll find a few pages that don’t validate. Getting everything to validate as XHTML 1.1 is tough sometimes, especially in this world of reader comments.

One particular thing that has helped me, however, is serving the XHTML as application/xhtml+xml. You see, most validation problems I encounter revolve around well-formedness. By serving my pages with the application/xhtml+xml MIME type, any errors are instantly rewarded with a fatal error, thus notifying me that instant attention is required.
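That feedback loop can be moved even earlier: a weblog could refuse to save an entry or comment that isn’t well-formed, so the fatal error never reaches readers at all. A hypothetical pre-publish guard (the name is mine, not from any particular tool) might look like:

```python
import xml.etree.ElementTree as ET

def is_well_formed(fragment):
    # Wrap the fragment so multiple top-level elements are legal,
    # then see whether the whole thing parses as XML.
    try:
        ET.fromstring('<div>%s</div>' % fragment)
        return True
    except ET.ParseError:
        return False
```

One caveat: named entities beyond XML’s predefined five (&amp;amp;, &amp;lt;, &amp;gt;, &amp;quot;, &amp;apos;) would need a DTD to pass this check, so numeric character references are the safer choice in content checked this way.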

June 10, 08h

Simon Willison’s been doing it for a year, by the way.

http://simon.incutio.com/archive/2004/05/02/stayingValid

“Put me squarely in the camp that wants to be notified when my site doesn’t validate.”

I cringe! I don’t know about the standards police, but the grammar police are gonna’ get me. I don’t mean to imply that there’s an entire camp of people who want to be notified when my site fails. I want to be notified, though.

June 10, 08h

What is standards-compliance without validation?

Why say, “click here to see my validation” if you know your page is invalid?

It is fairly easy to start with what you know validates. Then you have less overhaul when it comes time to fix your invalid coding. We shouldn’t use XHTML Strict if we choose not to self-close every break. It would just make no sense.

True enough, there are times during testing and initial launches that we may forget or choose not to include valid tags and what-not. But just as habitual as it is to upload our files to our servers, it should become the same habit to click that little “Valid: XHTML | CSS” link most of us have on our pages.

I do not believe that people should run around harassing people because their pages don’t validate, though. Many times, especially with popular blogs, it seems like the arrests are done out of envy. I can’t call them Standards Police; they seem more like Standards Vultures… waiting for the big dogs to slip up so they can slide in and devour. Any message you send someone about validation should be constructive in content and should offer a solution. Don’t try to be a doctor if you can’t offer a cure.

Fernando says:
June 10, 08h

I think if we can get back on task, we can see that Brian and Dave’s points in their articles were:
(1) We don’t know who creates the invalid content, or what the designer/developer is even trying to accomplish when the markup was created.
(2) Quit sweating the small stuff. The designer/developer probably knows about and may be working on a solution behind the scenes.
(3) Provide a solution if you are so dedicated to standards-compliance. Quit the bashing and start writing to prevent it.

When your CMS or your clients themselves have control over certain aspects of the site, it can be hard or impossible to handle every “Hey, you did this and that wrong.” I used to get mad, but now I just laugh when I visit a site I created only to find that the site owner loaded it with invalid code that I tried to teach them not to use. They couldn’t care less, though.

Maybe we all should implement one of those good ol’ “Report Broken Link” pages with a link right next to our validation links… maybe even include a little notice or FAQ section to eliminate some of the annoying messages.

Keith says:
June 10, 08h

To all of you who just assume this stuff is easy, take a look beyond your own workspace.

I’ve been a vocal Web standards advocate long enough to realize that it’s not as simple to some as many of us (I’ve been guilty of this) make it out to be.

I’ve looked into these validation solutions and others like them. I’ve got a few implemented. (Unencoded ampersands are not a problem on my site, I just used that as an example.)

For example, I look at Simon’s solution, which I’m sure is great. I’m not all that familiar with PHP though and I’ve got no clue how to implement it. Same goes with a few of the links Jacques pointed out. By the way, thanks for the links, I may be able to use one of those, but when they start talking about fink, etc. I’m totally lost.

But, when you say things like “Making a web page valid is so ridiculously simple that I *don’t* understand how people can make it invalid and let it stay that way. Instead of complaining about other people’s constructive critique about validation, I think site owners should use the time to fix the problem. Come on. Validation is simple. Really.”

You are making huge assumptions based on very little knowledge. You obviously have no f’n clue how hard it can be to keep a site valid. You’re using a default MT template – why not take the time to design your own, it’s “ridiculously simple”??

And anyway the point of this whole post is that the critique about validation isn’t usually constructive.

I guess I’m an easy target because I’m not a hardcore developer. Hell, I’m just a Web designer. When this stuff is truly easy, then I’ll probably have a 100% valid site. For now, I’m content with being 97.5% valid – it’s as much or more than most can boast.

Mike D. says:
June 10, 09h

Keith, you are a good man, and I have no idea how you find the energy to respond so comprehensively to this nonsense. If I tried to list every ridiculous statement on this thread, I would probably never even make it to work today. But it’s 9:20am and I haven’t even showered yet, so I’ll just include the two most ridiculous ones:

1. “Making a web page valid is so ridiculously simple that I *don’t* understand how people can make it invalid and let it stay that way.”

Yes, making a web *page* that validates is ridiculously simple. You and your pressed-from-one-template blog can live together in peace and harmony knowing you are following someone else’s rules. As for me, I’d like to concentrate on actually making useful web sites… k? As Keith says, look outside your own fishbowl before making comments like this. Ever work at a 400,000 person company?

2. “What if, for example, one of Google’s updates caused it to stop reading your pages because of one of the errors you think is unimportant because they work in mainstream browsers? Don’t say “Google would never do that”.”

Sorry, but I’ll say it: “Google would never do that”. And neither would any company who gives a damn about making money. Microsoft saw to it very early that the web was a forgiving medium. As a programmer, you have to view this as bad. But as a user, unfortunately it is actually a good thing. If I’m sucking down a web page and my connection cuts out before the last closing div tag makes it through, IE and every other tolerant browser will likely paint the page anyway. Or if some $6 per hour shmoe forgets a tag here or there, again, I will still see the page. Tolerance is good for the user in an immediate sense, but the unfortunate side-effect is that code can get sloppy with no ill effects. If Netscape 4 ruled the world, ironically, sites might be more standards-compliant (did I just say that?). Netscape 4 didn’t paint anything that wasn’t properly closed.

A stricter web will be here eventually, but don’t expect companies like Google, or anyone else for that matter, to just stop reading your invalid stuff, either on purpose or by accident. We have time before HTML validity matters. Quite a bit of time. A stricter web will most likely come from other formats such as Atom, RSS, etc. These newer, less bastardized formats are less prone to the crappy practices we see in HTML because they serve a much simpler purpose.

June 10, 10h

The only time I ever email website administrators is when they design their website for white background colour, yet never declare a white background colour.

June 10, 10h

> Sorry, but I’ll say it: “Google would never do that”. And neither would any company who gives a damn about making money.

I already pointed out that Netscape, in their prime, did exactly this. Didn’t you bother reading the link I provided?

You are making the assumption that millions and millions of pages will somehow not get indexed if Google manages to exclude your own site on the basis of a bug in your code. I’m not arguing that Google will suddenly require 100% valid HTML. I’m saying that it’s easy for Google to misinterpret broken code.

If only 0.01% of the web makes the same type of error you do, you’re saying that in the event the Googlebot doesn’t understand it the way you want it to, that Google would both realise this and care enough to fix it? Or would the 0.01% live in ignorance until they realised they’d disappeared from Google, and then rush to fix whatever they’d done wrong?

Or is your position that the types of errors you make are “safe” because they appear on enough websites (25%?) to make Google care enough to ensure they work around a particular bug in people’s HTML? If so, I’d suggest it’s a lot easier keeping track of what’s valid than how widespread the types of errors in your pages are.

June 10, 10h

Mike D. wrote:

“Sorry, but I’ll say it: ‘Google would never do that’. “

You are wrong.

There is invalid HTML that will cause Google to barf and not index your page (technically, they call it a “partially-indexed” page). Heck, there was a period (fixed now, thank G-d) when Google was failing to index perfectly *valid* XHTML pages.

“We have time before HTML validity matters.”

If HTML validity doesn’t “matter”, then what *does*?

OK, I admit it: Web Standards are a waste of time. Nobody should bother with them. Browsers are tolerant, Google is tolerant. Slap any old crap together, and if it looks OK in IE/Win, ship it! ;-)

Anne says:
June 10, 10h

Or you could write solid software that doesn’t output invalid markup.

Mr. Validation says:
June 10, 10h

Hey Jacques:

http://validator.w3.org/check?uri=http%3A%2F%2Fgolem.ph.utexas.edu%2F~distler%2Fblog%2F

How easy was that again? Heck, I only tried ONE of your “perfect pages.”

You are clearly part of the group of people who are making it hard for developers to grab on to standards. Because when they come across people like you they RUN the other way.

Keith is the sort of person who makes me want to use standards, because he realizes what goes on in the real world.

Cheers

Mike D. says:
June 10, 11h

Jacques:

Okay, coupla things:

1. I just clicked over to your site using arguably the most standards-compliant browser on earth, Safari 1.2, and this is the error message I get immediately when I hit the page:

“The page “Musings” has content of MIME type “image/svg+xml”. Because you don’t have a plug-in installed for this MIME type, this content can’t be displayed.”

This isn’t a little error sitting at the bottom of the page somewhere. Nor is it an error that only shows up on a validator. It is a browser-level error which causes me to click Safari’s OK box just to make it go away! I wasn’t expecting such great fodder for this particular conversation we’re having but oh my god! This is exactly the sort of thing I’m talking about! Would you rather have a page which validates on the W3C validator or a page which doesn’t throw up errors in common browsers? Sure, both is the ideal, but this underscores my point as well as anything ever could:

Right now, if you are going to rank importances, “how things look in a browser” is a *lot* more important than “how things look in a validator”. It just is, and it will always be. Again, don’t think I’m saying validation is meaningless. I’m just saying the fact that you validate your pages doesn’t make up for the fact that you don’t test in common browsers. One is of paramount importance, the other is more of academic and theoretical importance at this point in time.

2. As I said before, I’m sure there are plenty of other pages on your site which don’t validate. It only took me two tries to find this jewel:

http://validator.w3.org/check?uri=http%3A%2F%2Fgolem.ph.utexas.edu%2F%7Edistler%2FMacStuff%2FEudoraNicknameIAD.html

No DOCTYPE even? But validation is so eaaaaaasy!

* Footnote: I am the last person to run people’s sites through a validator. Just trying to keep people accountable here.

3. Lastly, you said “[validation is] not the point of the Web (nor of the internet, nor of life), but it is *surely* the point of Web Standards.”

Again I have to disagree. The point of speed limits is not to go 55. It is to make for a safe driving environment. The point of anti-trust laws is not to break up companies who have monopolies. It is to make for a competitive economic system with a level playing field. The point of web standards is not validation. It is to make for an efficient, accessible publishing environment.

June 10, 11h

Mr Validator wrote:

“Hey Jacques:
http://validator.w3.org/check?uri=http%3A%2F%2Fgolem.ph.utexas.edu%2F~distler%2Fblog%2F
How easy was that again? Heck, I only tried ONE of your ‘perfect pages.’”

Thanks. I really appreciate that.

Changed one character, clicked repost, and it was fixed. Less than 5 seconds, *total*.

If you find any more errors, I’d be more than happy to hear about them.

“You are clearly part of the group of people who are making it hard for developers to grab on to standards. Because when they come across people like you they RUN the other way.”

If anyone bases their decisions on what technologies to use in their work on what *I* have to say, they have more serious problems than a few Validation Errors.

Brian G says:
June 10, 12h

Thanks to Dave for posting this, and to the MezzoBlue visitors for some excellent comments.

My personal frustration came because the site works in every single browser from Netscape 4.1 through the current versions of Mozilla. It doesn’t validate now, but two weeks before launch it did…then our primary stakeholder decided it had to work in all 4.0 browsers as well. It took a week of work to make it do that, but of course the pages no longer validated, and it was even MORE of a mess of tables.

My new version looks almost exactly the same, only it’s mostly tableless (unfortunately sports stats etc. will always be presented in tables). I’ve snuck little bits of the new design in here and there, and will hopefully have the whole thing up by the end of the summer.

The whole thing doesn’t end there, though: the production team is a team of 3. We’re all pretty HTML savvy, but the actual team of editors who maintain the site is five times as many people as are working behind the scenes; to them, “Valid HTML” isn’t even in their lexicon (though “Triple-Double” ironically is).

No matter how hard I try, this 10,000+ page dynamic, living, breathing site ain’t gonna stay valid for long.

June 11, 01h

Oh, and just to forestall your next missive about all the “garbage” on the page, Safari does not support embedded MathML (one of those newfangled web technologies that actually *requires* XHTML). To appreciate those sites, you’ll have to use either Mozilla, or IE/6 with the (free) MathPlayer 2.0 plugin.

(Lordy, lordy! You might actually have to install *two* plugins!)

RC Pierce says:
June 11, 02h

“Volunteer to build a site for a non-profit group.”

Hear! Hear! Were it not for volunteering to work on a couple of worthwhile projects, I doubt I would ever have learned what I know now (which still isn’t much).

One must admit, taking apart some poorly built sites was part of the method I’ve employed to ascertain what not to do, and I’ll further admit that this is not exactly a career move, but had I set out to aimlessly create my own web site, instead of helping to create others, mine would likely not be any further along than it is now and I sure wouldn’t have had this much fun.

Andy Budd says:
June 11, 02h

Personally I find it quite useful when people send me a polite email letting me know about a problem on my site. It’s not always possible to track down every bug and it shows that the people emailing you actually care about your site, which is nice. I try and email everybody back to thank them, but unfortunately that’s not always possible.

What I don’t appreciate is the (very) occasional email/comment that, rather than simply pointing out a deficiency in a friendly, helpful way, points something out in a rude and obnoxious way. It’s one thing trying to be helpful, it’s quite another pointing out somebody else’s deficiency to make yourself feel superior.

June 11, 03h

I wrote:

“Safari’s behaviour is a bug. Tell Dave Hyatt. He’ll fix it.”

Since, obviously, you won’t take *my* word for it, here’s the relevant Standard ( http://www.w3.org/TR/html401/struct/objects.html#h-13.3.1 ):

A user agent must interpret an OBJECT element according to the following precedence rules:

1. The user agent must first try to render the object. It should not render the element’s contents, but it must examine them in case the element contains any direct children that are PARAM elements (see object initialization) or MAP elements (see client-side image maps).
2. If the user agent is not able to render the object for whatever reason (configured not to, lack of resources, wrong architecture, etc.), it must try to render its contents.

June 11, 05h

I wrote:

“Is the following HTML fragment
<p> Shake <i> rattle <p> and </p> roll </i> </p>
a) valid HTML 4 (per the DTD) ?
b) Does it adhere to the HTML 4 standard?”

Dang! And I forgot the all-important punch-line:

c) Is it valid XHTML 1.0 (per the DTD)?
d) Does it adhere to the XHTML 1.0 standard?

The question isn’t very much fun without that (though you could have fun with <ins> instead of <i>, even in HTML 4).

June 11, 06h

> For those who didn’t click it, here is its contents quite literally: “4 Reasons to Validate Your HTML… 1. Netscape 1.2… 2. Netscape 2.0… 3. Netscape 3.0… 4. Netscape 4.0….” Very insightful and useful stuff, isn’t it?

Mike, first of all that’s *not* its literal contents. Second of all, you really can’t see the trend?

1. People wrote invalid code that they thought didn’t break in anything. A new version of the leading browser was released, they were wrong.

2. People wrote invalid code that they thought didn’t break in anything. A new version of the leading browser was released, they were wrong.

3. People wrote invalid code that they thought didn’t break in anything. A new version of the leading browser was released, they were wrong.

4. People wrote invalid code that they thought didn’t break in anything. A new version of the leading browser was released, they were wrong.

The pattern is pretty obvious. Furthermore, in all those cases, it was the *leading web browser* that invalid code was breaking in. Not some obscure geek software.

It didn’t stop there though. Layers. Sliced image table layouts. Wrong MIME-types for stylesheets. Centring in tables (thanks Mark). Wrong MIME-type for Internet Explorer-only websites (forthcoming with XP SP2).

There is a history of invalid code breaking when new software comes out while its valid counterpart does not. The trend is continuing. If you can test in browsers that don’t exist yet, and make sure changes that Google engineers haven’t even thought of yet won’t break on your site, then by all means, do so and feel confident things won’t break.

I’d be interested to know if the people saying the specifications don’t matter have tested all their websites in Voyager 2 yet, and if not, what their basis is for expecting them to work.

June 11, 07h

Sort of an aside, but http://www.mezzoblue.com/rss/2.0/ and /rss/index.xml are not precisely invalid, but likely to cause explosions and small fires. You’ve got sort-of nested CDATA sections, though not exactly nested since there are two start tags and only one end tag. I would say the validator’s correct that it isn’t invalid, because it’s only having a second end tag that makes for invalid XML, but the end result is that you are suggesting to feed readers that they should display the first paragraph of your entry, and then display an unclosed CDATA opening tag, which is… not exactly a best practice.
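The usual escape (a sketch in Python, not what mezzoblue’s templates actually do) is to split the terminator, since a CDATA section has no escaping of its own and ends at the first `]]>`:

```python
# Sketch of the standard workaround: a CDATA section ends at the first
# "]]>", with no escape mechanism, so an embedded terminator must be
# split across two adjacent CDATA sections.
import xml.etree.ElementTree as ET

def cdata_wrap(text):
    # "]]>" becomes "]]" + "]]><![CDATA[" + ">" so each piece is legal
    safe = text.replace("]]>", "]]]]><![CDATA[>")
    return "<![CDATA[" + safe + "]]>"

body = "an entry containing <![CDATA[ markup ]]> verbatim"
doc = "<description>" + cdata_wrap(body) + "</description>"
elem = ET.fromstring(doc)  # adjacent CDATA sections merge back together
assert elem.text == body
```

A conforming XML parser concatenates the two sections, so the feed reader sees the original text back.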

June 11, 08h

The answer to Anne’s question (#35) is that switching the encoding of this page to UTF-8 in the vain hope of avoiding Windows-1252 characters 0x80 thru 0x9F (which are control characters in ISO-8859-1, but which are non-existent byte sequences in UTF-8) only makes matters worse.

Instead of listing errors for “illegal” characters, the Validator coughs up a hairball and refuses to parse the page.

The problem is compounded by the fact that Dave’s Comment-Preview page (which Dave forces you to go through) defaults to ISO-8859-1, which means that not only will all the Windows-1252 characters already in the database *continue* to cause problems, but commenters will continue to submit *new* ones when they POST their comments.

The solution, of course, is still
http://golem.ph.utexas.edu/~distler/blog/archives/000347.html
which will remap those illegal (or nonexistent, depending on your encoding) byte sequences into what the commenters intended.
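The remapping amounts to something like this (a Python sketch for illustration only; the linked solution differs in the details):

```python
# Sketch of the remapping: bytes 0x80-0x9F are curly quotes, dashes,
# etc. in Windows-1252, control codes in ISO-8859-1, and flat-out
# illegal byte sequences in UTF-8.
raw = b"it\x92s a \x93smart-quoted\x94 comment"  # pasted from Windows

try:
    raw.decode("utf-8")
    utf8_ok = True
except UnicodeDecodeError:
    utf8_ok = False

assert not utf8_ok                 # the Validator's "hairball" case
fixed = raw.decode("cp1252")       # recover what the commenter meant
assert fixed == "it\u2019s a \u201csmart-quoted\u201d comment"
fixed.encode("utf-8")              # now re-encodable as legal UTF-8
```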

Mike D. says:
June 11, 09h

Okay, one more reply and I’m done here. Religious wars are best left to the religious and the militant…

1. Jacques: Ok, so I guess validation only matters to you if the pages fall within your blog. I’m not going to go through all however-many pages of it, as that’s a task better left to you. I’m just pointing out that after you made comment #1 on this thread (pretty funny in retrospect now), someone showed you an invalid page on your site. Then you said there were no more, and after two clicks I showed you another. You said this page was too old to validate, so since invalid pages are such a crime, why not take it down and spare the world from the harm it might cause?

2. Regarding the MIME-type issue. Throwing up a browser-level error is *never* a good thing. Ever. That’s why we have javascript to detect for these things. Use JS to detect for the plug-in and show the content if the visitor has it. This sort of defensive design is a lot more important than any output the validator gives you. I’m not sure zoomable math equations are worth ever throwing up an error like that, but I am clearly not in your target audience so I can’t make that call.

3. Jim: I don’t want to beat this issue into the ground because I think we both agree with the general premise that “validation is a good thing”. The only thing we disagree with is on the strict necessity of it at this point in time. I like to drive 65 in a 55. You like to drive 55 in a 55. Yours will get you less tickets in the long run. Mine will get me where I need to go faster (and keep other drivers from honking at me… :) ).

June 11, 09h

> I like to drive 65 in a 55. You like to drive 55 in a 55.

Given that I have just had a discussion of this *exact* nature, I’ll have to agree with your point here :).

June 11, 12h

“The page ‘Musings’ has content of MIME type ‘image/svg+xml’. Because you don’t have a plug-in installed for this MIME type, this content can’t be displayed.”

Umh. That differs *how* from the run-of-the-mill Flash site?

I’ll tell you how. Once you dismiss the dialog box, you can actually *see* the illustration in Safari, because I’ve provided a fallback GIF image (this works in IE/Win, too, despite its notorious lack of support for the <object> element).

The only thing you lose by not having Adobe’s *free* plugin installed is the ability to have the illustration rescale when you zoom the text. (Something to do with accessibility…)

If you’re interested in the technique, see
http://golem.ph.utexas.edu/~distler/blog/archives/000216.html

“It only took me two tries to find this jewel:
http://validator.w3.org/check?uri=http%3A%2F%2Fgolem.ph.utexas.edu%2F%7Edistler%2FMacStuff%2FEudoraNicknameIAD.html
No DOCTYPE even? But validation is so eaaaaaasy!”

Gee, a page written in 1997 to support a piece of software for MacOS 8, an OS that has not run on any Macintosh produced in the last G-d knows how many years. This, in case you are too young to remember, was *long* before the W3C Validator even existed. In fact, it followed shortly on the heels of the first published W3C HTML standards (not that I, or you or anyone else had ever heard of them in 1997).

I *could* make that page go 404. Would that make you happier?

If you want to be useful, confine your efforts to those pages which have actually been updated in the past 5 or 6 years. More specifically, pages in the directories

http://golem.ph.utexas.edu/~distler/blog/
http://golem.ph.utexas.edu/string/

That’s over 420 pages for you to choose from (not including CGI-generated pages, like comment-entry forms and the like – which also ought to validate).

“The point of web standards is not validation.”

Maybe we have a semantic problem here. Maybe you need to explain what you mean by “Standards.” I thought Standards are those pesky documents published by the W3C, the IETF and other “Standards Bodies.”

Indeed, it’s true that “validating against a DTD” is not the same thing as “adherence to Standards.” Since one might as well salvage some useful content from this discussion which has clearly gone off the rails, I pose to you the following question:

Is the following HTML fragment
<p> Shake <i> rattle <p> and </p> roll </i> </p>
a) valid HTML 4 (per the DTD) ?
b) Does it adhere to the HTML 4 standard?

Look, I *really* don’t *care* whether Keith’s or your or anyone else’s site but my own validates. I *do* care when Keith and various others in this thread say, “Unencoded ampersands in XHTML? No big deal. Nothing will break.”

That is really, really, REALLY bad advice. It vitiates whatever (probably marginal) utility there ever was in using XHTML in the first place.
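To see what “break” means here, a sketch (Python’s expat-backed parser standing in for any conforming XML parser, such as the one Mozilla applies to application/xhtml+xml pages):

```python
# Sketch of the "draconian" XML rule: one unencoded ampersand and the
# parser must give up with an error, instead of guessing the way a
# tag-soup HTML parser does.
import xml.etree.ElementTree as ET

bad  = '<p><a href="/page?a=1&b=2">link</a></p>'      # stray "&"
good = '<p><a href="/page?a=1&amp;b=2">link</a></p>'  # encoded

try:
    ET.fromstring(bad)
    bad_parsed = True
except ET.ParseError:
    bad_parsed = False

assert bad_parsed is False
assert ET.fromstring(good).find("a").text == "link"
```

Served as text/html the first page would render fine; served as application/xhtml+xml it is a fatal error, which is exactly the off-line-for-one-ampersand scenario described above.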

June 11, 12h

Mike D. wrote:

“1. Jacques: Ok, so I guess validation only matters to you if the pages fallen within your blog. I’m not going to go through all however-many-pages of it, as that’s a task better left to you.”

*Someone* just *happened* to let loose the WDG bulk Validator on my blog at 11:08 CDT this morning. I guess it must have been someone *else*. And I surmise that they didn’t find any errors, as I’m *sure* we would have heard about them if they had.

But thanks for playing.

“Someone showed you an invalid page on your site. Then you said there were no more, and after two clicks I showed you another.”

By “sites”, I meant my blog, and the String Coffee Table at the URL’s listed above. Only these are generated by a CMS (the *same* CMS Keith uses), with software in place to give a reasonable assurance that they are valid.

I did not mean random hand-authored pages from October 1997 (long before there even existed online tools to validate web pages) in an unnamed dialect of HTML (because when I wrote the page, I did not even know there *were* formally-specified dialects of HTML).

Sorry if the context wasn’t sufficiently clear.

And, anyway, the statement was *not* that I have never authored an invalid page of HTML. Rather, that I was willing to *fix* any errors that come to light.

Shall I add a DTD to that page from 1997? What do you suggest?

“Regarding the MIME-type issue. Throwing up a browser-level error is *never* a good thing. Ever. That’s why we have javascript to detect for these things. Use JS to detect for the plug-in and show the content if the visitor has it.”

I think you are utterly confused. The dialog box you saw is *no different* from the dialog box you would see if you visit a Flash site without the Macromedia plugin installed.

Except for 2 things:

1) Safari is being overly cautious here. Despite being no different from what it does when you visit any “multimedia” site without the appropriate plugin installed, Safari’s behaviour is *incorrect.* The whole *point* of the syntax of the <object> element is to provide seamless fallback to alternate content, *without* resorting to stupid javascript hacks. If you visit the page in Mozilla, or IE/Win, without the Adobe plugin installed, they will simply and silently display the alternate (GIF) image instead. That is the *correct* behaviour (for once, IE/Win does something right). Safari’s behaviour is a bug. Tell Dave Hyatt. He’ll fix it.
2) Unlike the Flash site, I do offer alternative content, which is almost as serviceable as the SVG content you don’t have a plugin for.

“I’m not sure zoomable math equations are worth ever throwing up an error like that, but I am clearly not in your target audience so I can’t make that call.”

SVG is for zoomable illustrations. Safari just renders gibberish where the MathML equations (which are also zoomable) should be. It does that *silently*, which, unfortunately, is the *correct* behaviour, according to the Standards.

And, no, you’re not in my target audience. (At least we can agree on *something*.)

Keith says:
June 12, 01h

Flame warning. I was going to e-mail this directly to Jacques, but I can’t find his e-mail address or any contact info at his site.

Jacques – you know man, I was content to sit the rest of this out and let you just go on digging yourself deeper and deeper. I mean, I hate to say it brother but you are way, way off base with much of what you say and I think you need to be put in your place a bit.

I’m sure you’re a nice person, but you’re coming off like an ass and for me anyway it’s drowning any positive message you may have. Sorry.

When you say things like:

“Maybe 95%, or 75%, or 35% is “good enough” for you. Fine. Whatever makes you happy. But your personal site is almost certainly the upper-limit on what you’re ever going to achieve in the “real world”.”

You really show yourself to be blind to the issues many of us face and frankly clueless to any situation not your own.

First off, it’s hard to take what you say seriously when I go to your site, hit a page with the validator and it doesn’t validate. You say it’s important but then you don’t back it up. Makes much of what you say ring a bit false. And your 1997 argument isn’t a strong one considering all the things you’ve been saying.

Second. Our personal sites are the upper-limit? That sounds like an attack and really, you need to think about who you are talking to before you spit that crap out.

Umm, let’s see. Mike Davidson - ESPN. Andrei - Adobe. Myself - Boeing, Microsoft, Sony. I’ve also turned down jobs from Real Networks, Macromedia, AT&T and AOL. To name a few.

Give me an f’n break man. I make my living designing Web sites and have for years! What the hell are you talking about?!?

It starts with goals. I’ve yet to work on a project, in my 10 years as a Web professional, where validation or pleasing the W3C validator was central to meeting a single goal laid out for a given project. Not a one! I don’t expect this to change any time soon and if it does – I can pull a Zeldman and delete and disable comments on my site and my “personal” site will be 100% valid.

Who the hell are you to tell me, or anyone else for that matter, what I can and can’t achieve in the “real world?”

Validation might be central to your goals. I’ve got no problem with that whatsoever, but to presume to know what we need to do to get our jobs done (especially after it’s been mentioned time and time again that validation isn’t one of those things – do you listen?) or have the balls to judge our capacity to achieve those goals, which is exactly what you are doing with stupid comments like the above, is just a joke.

I’ll say it again, it’s people like you, with an unwarranted sense of superiority, that make it very hard for those of us who care about this stuff to keep our heads and hearts in the game.

I run my site for free. I work my ass off to provide fresh content and good advice. I never once promise to keep that damn thing valid, nor do I say anywhere that I find it necessary. Nor, for that matter, do I ever suggest that someone shouldn’t try and strive for a valid site. I don’t think that’s good advice so I wouldn’t say that.

What I do say, and firmly believe, is that there are more important things to deal with than validation.

I do think it’s important, I really, honestly do – but there is much, much more to a successful Web site than validation. Period. End of story.

The whole point of this post, as I read it anyway, was the attitude with which people contribute feedback about validation. If you want to be helpful – please do that. No one would argue that a helpful e-mail or comment is a good thing.

It’s the “smug sense of superiority” that gets people all riled up. It also happens to be how I get almost all of my email about validation errors. I got one today for crying out loud. Someone telling me they wouldn’t visit my site until I fixed an error!

(I told him good riddance)

Are you advocating that? Because it kind of seems like you are.

To get respect you need to give respect. When someone attacks me for an error and calls me a hypocrite or accuses me of not practicing what I preach, any message they originally had gets lost.

Give respectful feedback and you’ll get a respectful thank you back. Just ask anyone who’s written respectfully to me. Be a pedant and you’ll get deleted or told to shove off.

June 12, 04h

Jacques is making the same mistake he made with me a few months back. The bottom line is not validation, or that it’s possible after jumping through some very big hoops for a non-programmer. It’s the comment Anne made. Until the tools are in place to produce valid XHTML markup, don’t expect Web Standards to truly have the impact they should. (And yes, my own company needs to heed that bit of advice as well.) Until the tools are there, Keith and I will be content with 97% validation, given the non-valid stuff comes from people entering comments on our blogs.

Don’t tell me that, as a content creator, developer and designer, I have to code to fix the issue. Don’t give me a bunch of links that tell me, from a UNIX hacker’s perspective, the means to fix ampersands and invalid comments.

All of that is bullsh*t.

Creating a 100% valid XHTML 1.0 Strict web site means hard work to get that last 2% to 3% right. As we all know, if it’s not 100%, it’s pointless serving the right MIME type. Doing so will kill you without that final percent in place.

Join the real world Jacques. Stop berating the guys, like Keith and myself, who can make it happen if given the proper tools.

June 12, 06h

Andrei,

I said it before, and I’ll say it again: I don’t give a *damn* whether your or Keith’s or anyone else’s site but my own validates. But I *have* written the tools to help make it happen.

Achieving validation with a crappy CMS may be impossible. Achieving validation with a hand-rolled site may be hard, but doable. Achieving it with a good CMS, with the right tools in place is *easy*. So easy, the String Coffee Table people do it without even knowing it.

If *you* care about Web Standards, then stop whining (http://www.designbyfire.com/000085.html ) about how hard it is getting your MT-driven site to validate, install the tools, and be done with the matter.

Do you want someone else to install the tools *for you*? For a fee, I’m sure that, too, can be arranged.

But, whatever you choose, don’t whine about the consequences of your choices.

June 12, 09h

That comment was rather intemperate, so, having had my morning coffee and read the newspaper (always a good reality-check), let me try again.

Andrei is exasperated that, while SixApart have done 95% of what is required to make his personal site generate standards-compliant code, the last 5% is left up to the end-user.

That last 5% can either be done *by hand* (tedious as hell, and Keith and others will decide it’s not worth doing), or it, too, can be automated.

Automating anything *the first time* requires programming. But there’s no requirement that everyone reinvent the wheel for himself. You *can* benefit from the experience of others.

But why even bother? I mean, who cares?

I think the obvious answer is that some of the knowledge gained, by using your personal site as a laboratory, might actually be transferable to your other work.

Most CMS’s won’t even get you 95% of the way to valid code. Getting anything *close* to valid code on one of your commercial sites would be an utterly hopeless task, if you can’t tease valid code out of MovableType.

Maybe 95%, or 75%, or 35% is “good enough” for you. Fine. Whatever makes you happy. But your personal site is almost certainly the upper-limit on what you’re ever going to achieve in the “real world”.

Personally, I happen to *need* a higher level of Standards-compliance than your average Joe. I *have* to send my pages as application/xhtml+xml, or the MathML equations *won’t render* (in either Mozilla or in IE/Win with MathPlayer). A single stray unencoded ampersand will *knock my site off-line* for the majority of my users.

95% won’t cut it in that situation. I need 100%. So all the stuff Andrei’s “struggling” with (and Keith’s given up on) — I struggled with a year and a half ago.

Maybe you won’t learn as much installing some pre-written MT plugins and re-jiggering your MT templates. But you *will* solve your validation problems.

June 13, 03h

slightly off topic, but this article also applies to the area of accessibility. all too often, during some good discussion on the principles of making a site accessible, i find those annoying “yeah? so why does your site fail Bobby?” calls (particularly when somebody is proposing/defending a controversial line of thought). these are particularly annoying because then any conversation veers into ripping Bobby itself apart, pointing out that it is not the be all and end all of accessibility…

June 13, 12h

“Most CMS’s won’t even get you 95% of the way to valid code. Getting anything *close* to valid code on one of your commercial sites would be an utterly hopeless task, if you can’t tease valid code out of MovableType.”

This is simply incorrect.

Getting valid XHTML + CSS out of MovableType is a no-brainer if you code your templates to standards. That’s a cakewalk. It’s getting people to enter well-formed markup into the comments area that is killing validation. If you ran a commercial blog that didn’t bother allowing people to add comments, getting 100% validation would be easy with any CMS tool.

That MT doesn’t ship good error checking and parsing of comments as a built-in feature, instead leaving it to a plug-in, is the issue, and the sole issue for CMS software. Why they haven’t done it yet is anyone’s guess. Maybe they feel they have bigger features to fry.
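The kind of built-in comment handling being asked for can be sketched in a few lines (this is an illustration, not MT’s actual code): escape everything the visitor typed before it hits the template, so their input can never make the page invalid.

```python
# Sketch of the "safe by default" comment handling the thread asks for:
# escape the visitor's raw input so it cannot break the page's validity.
from html import escape

comment = 'I <b>love</b> fish & chips'
safe = escape(comment)
print(safe)  # I &lt;b&gt;love&lt;/b&gt; fish &amp; chips
```

A real CMS would likely allow a whitelist of tags rather than escaping everything, but even this blunt version guarantees well-formed output.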

Outside of that, I completely agree with Keith.

Kris says:
June 14, 03h

What the hell with the negative vibes in here? When I take the time and effort to e-mail somebody about their site, it is because I see a reason to come back. I *like* the site.

It seems to me some people here need to think about the ways they deal with critique and work-related stress.

NateL says:
June 15, 09h

I appreciate what has been said in this conversation. I happen to side with Keith, who in my estimation, has a much more realistic view of the web and workflow.

On the other hand, I find Jacques to be conceited and contradictory. In his own words, someone seeking “to reinforce a smug sense of superiority.” (comment 1) No doubt. From reading the ‘error’ messages on his site to his comments in here, that is clearly the case.

Jacques, to answer your question (posed in comment 51), that goes as follows (with some context included):

“And, anyway, the statement was *not* that I have never authored an invalid page of HTML. Rather, that I was willing to *fix* any errors that come to light. Shall I add a DTD to that page from 1997? What do you suggest?”

Someone actually spoke to this issue in comment 9. They said the following:

“Producing valid content is 95% having a good workflow in place. It has very little to do with repairing stuff after the fact… A web designer who promises Standards-compliant work ought to have a workflow to producing Standards-compliant output. If their own site doesn’t validate, what does that say about the work they do for clients?”

Any guesses on who that person was? It was you!

Let me highlight a few of my favorite parts for you:

“Rather, that I was willing to *fix* any errors that come to light. Shall I add a DTD to that page from 1997?” - Is it just me or did you just answer your own question?

“If their own *SITE* doesn’t validate, what does that say about the work they do for clients?” (emphasis mine). So is that page on your SITE? I think so. What does that say about the work you do for clients? It sounds like you need to get a little ‘commensuration’ on. After all, it can all be handled programmatically, right? ;)

All of that to say, Jacques, in your case, the messenger very much got in the way of the message. It’s a shame, as I think this could have been a much more valuable discussion without your cocky, antagonistic rantings.

But that’s just my $0.02.

Dave S. says:
June 15, 09h

I’m all for healthy debate but this thread went anemic long ago. The finger pointing is getting ridiculous; I should have shut it down last week. Let’s correct that now.