Posts Tagged ‘XHTML’

ASP.NET and XHTML Validation

If you’ve ever created an XHTML 1.0 Strict page containing an ASP.NET form element and run it through the W3 Validator, you’ve undoubtedly noticed that it’s reported as invalid no matter what you do and no matter how valid the code actually appears. This is because ASP.NET adjusts the way it renders markup according to the requesting user agent. ASP.NET pities the W3 Validator and sends it bad code. This can be fixed with a “browser” file, which teaches ASP.NET to treat the validator like a modern browser.
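Here’s a rough sketch of such a file, dropped into the site’s App_Browsers folder (the W3C_Validator id and the capability list here are from memory, so treat them as a starting point rather than gospel). The important entry is tagWriter, which makes ASP.NET use its standards-compliant HtmlTextWriter for this user agent:

<browsers>
    <browser id="W3C_Validator" parentID="default">
        <identification>
            <userAgent match="^W3C_Validator" />
        </identification>
        <capabilities>
            <capability name="browser" value="W3C Validator" />
            <capability name="ecmaScriptVersion" value="1.2" />
            <capability name="javascript" value="true" />
            <capability name="supportsCss" value="true" />
            <capability name="tagWriter" value="System.Web.UI.HtmlTextWriter" />
            <capability name="w3cdomversion" value="1.0" />
        </capabilities>
    </browser>
</browsers>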

However, that’s not all. The validator will now see your pages the way you see them in your browser, but ASP.NET is still rendering an invalid name attribute on your form element! You need to add a line to the system.web section of your web.config file:

<xhtmlConformance mode="Strict" />
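For reference, that line sits directly inside the system.web section; a stripped-down sketch of web.config (the rest of your config stays as-is):

<configuration>
    <system.web>
        <!-- Make ASP.NET emit XHTML 1.0 Strict conformant markup -->
        <xhtmlConformance mode="Strict" />
    </system.web>
</configuration>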

Now ASP.NET plays nice with the W3 Validator and renders a valid XHTML Strict form! You can finally stop using the XHTML Transitional doctype and start using the XHTML Strict doctype on your ASP.NET pages!

—Kyle Blizzard

The Target Attribute and Strict XHTML

So, you’ve decided to start creating your web sites with valid XHTML 1.0 Strict and CSS. Your client wants a “links” page containing none other than links to other web sites. You oblige and, to keep your client’s visitors on his site, you make the links open in a new window. So you throw in target="_blank" and you’re done. Just run it through the W3 validator…

There is no attribute “target”? What gives?!

Yes, indeed, target is not a valid attribute in XHTML 1.0 Strict. The only way around it that I’m aware of is JavaScript. My preferred method is as follows:

<a href="http://www.blizzarddigital.com/blog/" onclick="window.open(this.href, 'OffSite').focus(); return false;">Blizzard Digital Blog</a>

This opens the URL in a new window named “OffSite”; the .focus() call brings that window back to the front if the visitor opened it earlier, never closed it, and left it buried behind the main window.

You still put the desired URL in the href attribute, just like with any hyperlink. That way, if the visitor has JavaScript disabled for some reason, the link still functions correctly; it just won’t open in a new window.

One caveat, though: in Firefox 2 and later with default settings, this code causes the linked site to open in a new tab, not a new window. If you find this to be a problem, there is a way around it: add window.open’s third argument, the window features string, like so:

<a href="http://www.blizzarddigital.com/blog/" onclick="window.open(this.href, 'OffSite', 'directories=yes,location=yes,menubar=yes,resizable=yes,scrollbars=yes,toolbar=yes').focus(); return false;">Blizzard Digital Blog</a>

Those are all the features required to make the window appear normally, with the default tool and menu bars, which is kind of a pain. At this point, you may want to move the code into an external JavaScript file. The function I use typically looks like the following:

function OpenOffSite(a)
{
    // Open the link's URL in a shared "OffSite" window and bring it to the front.
    window.open(a.href, "OffSite", "directories=yes,location=yes,menubar=yes,resizable=yes,scrollbars=yes,toolbar=yes").focus();

    // Returning false cancels the default navigation in the current window.
    return false;
}

Then the onclick attribute of your anchor would look like so:

<a href="http://www.blizzarddigital.com/blog/" onclick="return OpenOffSite(this);">Blizzard Digital Blog</a>

Your links now open in a new window! Welcome to the world of XHTML conformity. 🙂

—Kyle Blizzard

Logical Structures and Happy Designers

While valid markup and CSS are necessary, they’re not the only things we web designers require to keep our sanity. One can create a valid page that is still difficult to work on. For example, using absolutely positioned elements for page layout is even worse than just using tables. When you add content to one element, the ones below it are not pushed down because they are no longer in the “flow” of the page. This causes overlap of content and makes it extremely difficult and frustrating to add anything to the page. Utilities such as Yahoo! SiteBuilder create pages like this.
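To see why, here’s a minimal sketch of the problem (the ids and pixel values are invented for the example). Both boxes are absolutely positioned, so they only stay apart as long as the first one’s content happens to fit in the gap; add a paragraph to it and it grows right over the second:

<style type="text/css">
    /* Both boxes are removed from the normal flow of the page. */
    #intro { position: absolute; top: 20px; left: 20px; width: 300px; }
    #news { position: absolute; top: 120px; left: 20px; width: 300px; }
</style>
<div id="intro">Welcome text goes here.</div>
<div id="news">This box will not move down when #intro grows.</div>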

A markup structure that is built to logically represent the content of the page and not to facilitate the appearance of the page is much more easily edited and future-proof than the “HTML soup” sites of old. I wistfully say “old”, but the truth of the matter seems to be that HTML soup is still the preferred meal of the majority of so-called web designers. A lot of the web sites we’ve moved to our servers are a pain to work with and rarely validate, and they were, sadly, created by professional web designers!

Another couple of things that are bad practice, though not invalid, are inline styles and CSS classes whose names tie them to a particular stylesheet, such as “red-text” or “left-side”. What if you change the stylesheet some day and that “red-text” is now supposed to be blue, or the “left-side” spans the bottom of the page? The World Wide Web Consortium itself discourages naming CSS classes like this. They also have some other useful tips that all web designers should read and keep in mind.
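As a quick illustration (the class names are invented), name classes for what the content is rather than how it currently looks:

/* Presentational: the name lies the moment the design changes. */
.red-text { color: red; }

/* Semantic: tomorrow this rule can become blue and the markup still makes sense. */
.warning { color: red; }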

If you’re a fledgling web designer, or perhaps someone who is just looking into having a web site created and wondering what valid code and good design practices can do for you, check out the CSS Zen Garden, an excellent example of a logically structured site that accepts stylesheets of wildly different appearances without any markup changes.

—Kyle Blizzard

Valid code and happy designers

Writing valid XHTML and CSS takes a little extra time but can save a client money and a designer their sanity. In the relatively brief history of web design there has been a great deal of controversy concerning web standards. Designers like to make money, and making money means completing projects quickly. Typically, if the client liked it and it rendered correctly, the project was done. The major browser manufacturers carry some of the blame: in their efforts to have the latest and greatest features, they implemented web page design elements in their own incompatible ways. Rather than engendering innovation, this created a Tower of Babel of incompatible tags. In the old days, some designers used any hack necessary to get a page to display properly in the most popular web browsers of the time. In the past several years, however, the web standards community has been increasingly vocal about incompatible code and rendering problems. This has been especially true with IE-only sites that will not render properly, or at all, in alternative browsers such as Firefox, Safari, Opera, and Chrome. A standards-compliant site, by contrast, provides greater interoperability for the same content on different platforms.

Another benefit of writing valid code is that it is easier to read, edit, and redesign. Separating a site’s presentation from its content using stylesheets seems like a no-brainer: theoretically, you could redesign an entire site just by editing the stylesheet and the graphics, without ever touching the XHTML. We have a number of clients who have moved to us from other hosts, and the amount of spaghetti code we are asked to edit sometimes makes us want to pull our hair out.
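In the ideal case, the redesign really is as small as pointing the markup at a new stylesheet (the file names here are hypothetical):

<!-- Before -->
<link rel="stylesheet" type="text/css" href="css/classic.css" />

<!-- After: identical XHTML, completely different look -->
<link rel="stylesheet" type="text/css" href="css/redesign.css" />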

The point is, you can write sloppy code if you want to. You could even use <blink> tags in a site designed entirely with tables and transparent GIFs if you were so inclined. But where is the return on investment for the client? There is a high probability that you are not the last person who will ever touch their web site. So have a heart and validate your XHTML and CSS. You will get faster at making lean, mean, standards-compliant sites that look good in any browser.

—Alan