
Full Archives

Notes From All Over Part IV

January 27
sxsw web awards

The 2004 SXSW awards finalists were announced this morning. The Zen Garden is up for 'Developer's Resource', with stiff competition from D. Keith Robinson's Asterisk, James Craig's Accessibility Internet Rally, Ian Lloyd et al.'s Tools and Wizards at Accessify, and a new one to me, Jim Armstrong's 3D News and Tips.

Good luck to everyone nominated, and a huge, giant thanks to all who have participated in the Zen Garden. Be it with a design, translation, or even just good old link lovin', the Zen Garden wouldn't be what it is without everyone else's participation. This nomination is for all of us. §

Matt Mullenweg writes with word of a WordPress CSS competition. Like a Zen Garden for WordPress templates, submit your CSS-based design for a shot at three growing cash prizes. Better get a move on it; the competition closes February 6th. §

I've posted this twice elsewhere, so I may as well get it on here too: I used to think [that you should develop first in IE, then test in Mozilla and the rest], but there's a good reason why my mindset shifted. If you develop in IE, what happens is that you create dependencies on its buggy rendering. Since the other browsers don't use the same flawed rendering, you'll have to bend over backward to hack your layout into working properly with most of them. In fact, when I was developing in IE, I used to curse Mozilla quite frequently for rendering my code 'wrong'. My experience has been that if you start out by developing in Mozilla or Safari, and then test in everything else afterward, you have to do much less hacking to make it work. The fringe browsers benefit; IE5/Mac gets a surprising number of my layouts right without any extra effort. Opera generally cooperates, although it can be a crapshoot at times. IE (per the general tone of this thread) is the big problem. Can you get away with developing in IE? Of course. Is it easier? Generally not. If you don't care about the fringe browsers, then you'll get away with it. If you do care, then you'll have a far nicer time developing in Mozilla. §

Between the plethora of spam, Nigerian scams, and new viruses going around these days, whitelists are starting to look mighty attractive. Ten times the regular volume of email came in overnight, thanks to the newest plague, MyDoom. In a timely manner, Gates is calling for the end of spam by 2006. Unrelated, this is the same Gates whose company spent a full month cleaning house two years back, to squash security flaws in their products. But are they really unrelated? Spam is a social problem, which somehow seems resistant to technological fixes. Email-based viruses are a technical problem, which spread across social networks. If you manage to suppress one, have you suppressed the other? The postage stamp idea sounds promising when applied to spam. But when the latest virus spoofs your email address on thousands of messages you did nothing to initiate, who's going to pay the bill? §

After being stung by theft recently, Josh Williams shares his thoughts on protecting your interests while working in a creative field. A logical extension of Jeffrey Zeldman's recent advice about designing on spec, Josh's tips are worth their weight in gold (you know, metaphorically speaking). §

Permalink › | no comments

Friday Challenge

January 23

Whether it's a deceptively simple problem or a case of being too close to the code to see the easy answer, I've been struggling with this one problem on and off for months now:

Is it possible to use floats to position a fixed-width sidebar on the right of a page, with a liquid content area, if the content comes before the sidebar in the markup?

Floating, not absolute positioning, is necessary for the sake of a clearing footer. It's easier to see what I'm after by looking at the code:

<div id="content">...</div>

<div id="sidebar">...</div>

<div id="footer">...</div>

Corresponding CSS:

#content {
 float: left;
 margin-right: 210px;
}

#sidebar {
 width: 200px;
 float: right;
}

#footer {
 clear: both;
}

Simple, right? Should work? Well, check the test page. If the order of the content and sidebar divs is reversed (and the float is switched accordingly), then it works like a charm.

But why should the sidebar need to go before the content in the code order? "Skip nav" links can route around this if absolutely necessary, but CSS is about freeing presentation from content after all.

Note that floats are reliant on code order, and do work best with defined widths, but with the proper CSS you should be able to get a simple two-column layout working. Shouldn't you?

I can't. And I'm hoping there's a ridiculously simple solution to this, and I'll look silly for even posting it. But since I keep running into the wall, here's what I'm going to do.

First person to reply with a solution that gets it using just floats, without changing the order in the test markup (that is, content first, sidebar second, footer last) gets a copy of the upcoming second edition of CSS: Separating Content from Presentation, which I contributed to, whenever it's published.

Again—1) no changing the order of the code (although adding new divs would be fine), and 2) no using absolute positioning unless you can somehow make it work with the footer. Other than that, the sky's the limit.

Update: Well that was quick. Ryan Brill takes the prize, with his creative negative margin solution. It tests fine in Safari, Firebird, Konqueror, all versions of IE 5+ (Win & Mac), Opera 6+, and even kinda-sorta works in NN4. Great job Ryan!
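For posterity, here's a sketch of how a negative-margin solution can work. This is my own reconstruction of the general technique, not necessarily Ryan's exact code; the challenge allowed extra divs, so a wrapper around the content is fair game:

```html
<!-- A reconstruction of the negative-margin technique, not Ryan's
     exact code. A full-width float with a negative right margin
     leaves room on the line for the sidebar to float up beside it. -->
<style type="text/css">
#wrap {                     /* new wrapper, added around #content */
  float: left;
  width: 100%;
  margin-right: -210px;     /* opens a 210px slot on the right */
}
#content {
  margin-right: 220px;      /* keeps text clear of the 200px sidebar */
}
#sidebar {
  float: right;
  width: 200px;             /* slides into the slot left by #wrap */
}
#footer {
  clear: both;              /* the clearing footer still works */
}
</style>

<div id="wrap">
  <div id="content">Liquid content, first in the source.</div>
</div>
<div id="sidebar">Fixed-width sidebar, second.</div>
<div id="footer">Footer, last.</div>
```

The trick is that the negative margin reduces the space the floated wrapper is considered to occupy, so a 200px right float fits beside it on the same line, all without touching the source order.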

Permalink › | 68 comments

Digital Web Interview

January 22

A new Digital Web Interview with yours truly is now live, conducted by Craig Saila. Plenty of thoughts await on time management, design influences, WaSP efforts, and more on that Garden thing. And so as not to leave anyone hanging, I briefly tell the story I've always wanted to.

Comments open. Feel free to continue the interview here, if you've got any further questions.

Permalink › | 17 comments

Business Loss

January 21

Call me cruel (just call me!), but if the dotcom bust and the general recession mean that a 22-year-old can no longer collect eighty large for Instant Messaging his drinking buddies all day long, I don't consider that a profound national tragedy. — Jeffrey Zeldman for ALA, Jan. 2002

The dot-com boom, and bust, in a nutshell. Today we're back to the dollars and the cents, the business goals weighed against financial decisions, and the cold reality of making money.

In my neighbourhood, as in most, a selection of offices/stores/restaurants just can't seem to retain a clientele. Dubbed 'black hole businesses' by some pundit along the way, these buildings cycle through business after business quicker than a snake sheds its skin. What was once an upscale tapas bar became an upscale seafood restaurant became an Indian grill is now a hair salon.

More profoundly sad, and perhaps a shifting indicator of the priorities of our times, are the two neighbourhood shops which closed up recently. One, a flower shop, the other a butcher's. Both established 1903. Both closed 2003. Exactly one hundred years in business, only to vanish in the night and greet the morning rush with newspaper-lined windows.

Much can be made of business strategy, economy, and marketability. But in the end you either win or lose. And while loss in this case may be quantifiably trivial, it certainly speaks volumes for quality.

Permalink › | no comments

Notes From All Over Part III

January 19

Fresh Monday links to enjoy while sipping your morning java. On CSS Hacks, XHTML Validation errors, Haughey and Zeldman under the iron, logo trends, security holes, anything-but-IE, video ads, ad agencies vs. the web, 404's, the Bloggies, and HTML rendering.

Permalink › | no comments

Type: the Extra Mile

January 18

A few typographic terms before we get started. Kerning is the process of optically adjusting the spacing between letters, by hand in most cases, to produce a more even end result. Hinting is the adjustment of type outlines, a willful distortion of the letterforms to help them fit the pixel grid. The difference is that you will most likely never need to hint (it's done by the designer/foundry before font distribution) but kerning is an essential monkeywrench for your toolbox. And finally, anti-aliasing is the softening of jagged pixellated curves through interpolation.

Kerning example

The two lines of text pictured above show the difference between type set simply by entering and forgetting about it (top), and type that has been manually kerned (bottom). As digital type has matured, the defaults are getting rather good. Hinting of the more common professional typefaces is generally done well, but manual work by the designer is often still needed to take it the extra mile.

However, when you hit a certain size with your type, it becomes hard to apply traditional kerning reliably. 12-pixel-high type rarely benefits, unless the gaps are too obvious to ignore.

This is because when working with small enough sizes, a whole new set of problems is introduced. The algorithms used to generate type hit the pixel barrier: no stroke or curve can be reduced smaller than a single pixel in width. Anti-aliasing offers a method around this, by half-toning two or three pixels instead of setting one to full intensity. The optical effect is a smoother curve, which will often appear thinner.

Blurred text

But anti-aliasing is problematic, as the extra pixels add blur. And since it's an all-or-nothing proposition, you will find them blurring in spots that you really would rather they didn't.

Photoshop Dialogue—kerning

In the above image, you'll notice the stems (vertical strokes on either side) of the U are two grey pixels thick, instead of a single black pixel. The type display algorithm has essentially decided the center line for each stroke falls between the two pixels, so each is shaded. It's hard for the software, but easy for us to see that this results in a blurry U. Fortunately the tools exist that allow us to fix it.

The Photoshop dialogue pictured above highlights the adjustment tool you'll be looking for. By specifying a positive or negative integer, you control how much space displays between each letter. This is the same spot you'd go to kern. Note that your cursor has to be placed between two letters, without either highlighted, in order for this to work.

For the purposes of our anti-aliasing problem, however, we're not so much concerned with the letter spacing as with where the type falls on the pixel grid. By playing with the values, it's possible to bump each letter off whatever axis is causing it to render sloppily, and tighten up the letters.

The dropdown shown above has some default values you can choose from, but oftentimes you'll want a number in between those, which you have to type. That gets tedious. I just discovered this week that by hitting Option + (left or right arrow) [Mac] or Alt + (left or right arrow) [Win], you can adjust the spacing in increments of 20, which in most cases is good enough.

Blurry textSharper text

Compare and contrast the above images. Subtle changes can make a world of difference to final legibility.

Permalink › | no comments

Press Photos

January 17

Because it's getting to the point where I'm beginning to need these, April and I took a walk today armed with my Canon and grabbed a bunch of vanity shots for miscellaneous press purposes. A big nod to Anil Dash for inspiring me to get outside for this.


A bit squinty since I was looking into the sun, but it'll do. Thanks to Vancouver's return to normal temperatures, we caught a bit of snow on the hills without having to wade through it on the ground. We took this on the Granville Street Bridge, if you're curious. There's a much higher-resolution version kicking around elsewhere on the site if you know where to look, but I'd rather not link a 600k download off this page. Use your head, you'll find it if you're really interested.

The rules of press photos, according to someone who knows nothing about them:

  • Be creative. A headshot against a wall is as vanilla as they get.
  • Be spontaneous. Take a bunch of shots in various locations. Get any pose/location you think might work. Try a few variations.
  • Keep shooting. We spent ten minutes on the bridge and came away with about 30 shots to choose from, which we narrowed down to four, and finally settled on one. Even this ain't perfect, but 30 is enough.
  • Plan ahead. If you think you might need a press photo in the future, chances are you'll need it sooner than you expect. I wish I had this one a few months ago.
  • Keep it fresh. You're a person, you change over time. A new shot every six months or year is an easy way to stay interesting.
  • Pre-process it yourself. Don't rely on the publisher/designer to crop, adjust, or apply any form of fixing. Send a photo you're happy with, and you won't have reason to complain when they publish it as-is.
  • Don't go overboard. Stay well short of this, and you're probably doing alright.

Permalink › | no comments

Mac Gamma

January 14

One thing I'm noticing is that colour profiles are crucial when working in Mac Photoshop, even if the final product is meant for on-screen display.

A long-time trick/crutch of mine on Windows was the screen capture. Alt + PrntScrn grabs a shot of the active window to the clipboard, which is easily pastable to a Photoshop canvas. I never even thought about it, I just did, and I did a lot.

Because the Mac handles screenshots a little bit differently (Cmd + Shift + 4 dumps a user-defined area to a PDF file, instead of the clipboard) I've had to start importing the screenshot instead of pasting it. This involves a dialogue asking if I want to save or discard the colour profile attached to the document.

Where I'm getting hung up is that I can't remember which instances require using the embedded profile, and which involve converting to the working space. I've configured my default working space in Photoshop to 'Color LCD', which seems to keep input and output relatively consistent, but OS X saves each screen shot to 'Generic RGB Profile', which produces a mismatch warning. By discarding the profile attached to the file and just using my default colour space, I'm theoretically working with the raw pixels, non-corrected.

comparing gamma

I'm still not confident I've got it though. Above is a shot taken as I was struggling to figure out which settings keep the colour consistent. The blues on the top are warmer, almost purple compared to the blues on the bottom. The top half is the screen shot, the bottom is the actual dock. It's a subtle difference, but this can be a big deal when you're dithering for transparency and colour-matching. Consistency is really important.

It strikes me as rather odd, though, that working solely with on-screen imagery requires such juggling of colour modes. Especially when I never had to think about it on Windows.

update: Brian Warren points out that Cmd + Ctrl + Shift + 4 copies the screen shot to the clipboard instead of dumping it to a file. Pasting that into an open document seems to alleviate any profile matching problems with screen shots. Excellent!

Permalink › | 22 comments

MT Comment Spam

January 12

So let's say you run a reasonably popular weblog that's open to comments from anyone and everyone. Let's also say in the same breath that you don't necessarily believe that turning off comments on older entries is a good way of squashing the comment spam problem, though it is terribly effective. For the sake of completeness, let's also say that you've bought into the idea that blacklists are inherently flawed, a losing proposition, and so you haven't bothered using them.

But let's also say that you have received a proportionately infinitesimal volume of spam despite it all, given how attractive a target your weblog must be.

What gives?

Well, I was going to tell you about the two days of battling spam I had this weekend. I was going to tell you how, by keeping on top of it, you could eliminate the need for things like blacklists and such. I was going to cover how spammers use Google to find open targets, and how, by reducing your spam profile, you could use other people's lack of security as a buffer to protect yourself.

But then I saw this (thanks Mark) and I realized that all bets are off. This is the same thing that happened to me this weekend. The free ride is over. Comments on older posts are getting turned off today, and comments on future posts will be scattered and available only for a short time. It was fun while it lasted.

I don't like it, but that's the sound of inevitability.

update: Six Apart released MT 2.66 on 01/14/04, which introduces new measures to combat comment spam.

Permalink › | 57 comments

Wanted: CMS

January 9

Wanted: recommendations for a proven, but simple open source CMS that's web-standards friendly.

The ideal candidate will work with LAMP (Linux/Apache/MySQL [or PostgreSQL in this case] /PHP). Function-wise, it should just be bare-bones, and usable through a browser. It will allow editing of multiple chunks of content that can be chained into a single page, although the chunks themselves will probably be pretty simple (a few headers and paragraphs, the odd list, nothing much more). At the same time, I'd like it to auto-generate things like site-wide nav and breadcrumbs and the like. No problem scripting those, as long as they're in some way possible. Free as in beer is a luxury, but not necessary.

Yeah, this is sounding like Movable Type. And in a way, that's more or less what I'm looking for. I'd like something that's page-oriented instead of post-oriented however, and although Matt's how-to would get me what I need, MT just feels like the wrong tool for the job. Zope is total overkill. I'm sure there are a plethora of others out there, but I'm interested in hearing about actual experience working with one.

And of course if it generates <FONT> tags out of the box, don't even bother mentioning it...

Permalink › | 90 comments


January 8

Well call me just plain out of it. Or blame it on my complete lack of Unix chops. Either may be accurate.

Until now I've been bouncing off of NetSol and putting up with the ridiculous (and highly inaccessible) image-based security widget every time I've run a whois search on an existing domain.

Google just launched a whois search, which is cool in and of itself (although the prospect of more spam doesn't thrill me). What caught my attention though, was an off-the-cuff remark by Joi Ito:

I guess this is useful for people who won't touch a command line, but I don't think I'd ever use it. I will continue to open a terminal window and type "whois".

Well now, OS X's terminal does whois. Nice!

Permalink › | 21 comments

Abstracting CSS

January 7

The further you abstract the structure of your markup, the weightier your CSS file becomes. It's inevitable, and by all indications, the way things are meant to go, for two important reasons: it fulfills the complete separation of structure and presentation, and CSS is cacheable, so each new page load pulls new content alone, without pulling new presentation.

Structural purists would like you to write markup that looks like so:

Markup Listing 1
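(The listing itself hasn't survived here. Judging by the selectors discussed below, an h3 followed by a plain ul of links, the purist version would presumably read something like the following. Consider it my guess at the spirit of Listing 1, not the original:)

```html
<!-- Hypothetical reconstruction, in the spirit of Listing 1:
     no wrapper divs, no ids, no classes. Element order carries
     all the meaning. -->
<body>
  <h3>Navigation</h3>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/about/">About</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</body>
```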

Of course, we all know that's simply an ideal. The reality these days is still something more along the lines of this:

Markup Listing 2
<body class="secondLevel">
<div id="container">
  <div id="mainContent">
    <h2 id="tagline"></h2>
  </div>
  <div id="sideBar">
    <ul id="nav">
      <li class="home"></li>
      <li class="about"></li>
      <li class="contact"></li>
    </ul>
  </div>
</div>
</body>

Why the disconnect? With gratuitous use of CSS2 and CSS3 selectors, we could very well achieve for Listing 1 the amount of control we'd expect to apply to Listing 2 and its extra <div>s and classes. Except that, thanks to a few stagnant and under-powered browsers, we can't today.

It's easy to just say that things will be perfect when the proper selectors are supported across the board, but those who dismiss the issue that easily haven't looked hard enough at the implications of the incredible CSS-fu necessary to make the first example work. How do we hook into the second <li> of the <ul>, for example, if more than one <ul> exists on the page? The corresponding CSS for Listing 2 is easy:

#nav .about {}

Or simply, if there prove to be no conflicts with other elements on the page,

.about {}

What about the code in Listing 1? Consider the following snippet:

h3 + ul > li + li {}

Technically this selects every li that immediately follows another li (in other words, all but the first), so it does match the second one, but we'd then have to go further and override the rest of the li elements in that particular list. Of course if we were trying to target the 8th element in any given list, that trailing chain of li + li + li + ... selectors gets awful long. CSS3 minimizes this by introducing the :nth-child pseudo-class:

h3 + ul li:nth-child(8) {}

Neither of the previous two work in all of today's browsers. But we're exploring the future here, so stay with me.
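To put a number on 'awful long': under CSS2 alone, reaching that hypothetical 8th item means chaining adjacent-sibling selectors, one li per position:

```css
/* CSS2-only stand-in for :nth-child(8): eight chained li's.
   Note this matches the 8th item and every one after it, so
   later items would still need their styles overridden. */
h3 + ul > li + li + li + li + li + li + li + li {
  /* styles for item eight onward */
}
```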

Let's say we've written our selector to target the above-mentioned second <li> in the Listing 1 list, the equivalent of Listing 2's .about. Now we want to add modest formatting to the link within, say a style each for the :link, :hover, and :visited states. We'll forget :active for now, but assign the same style to :focus that :hover will get:

h3 + ul > li + li a:link {}
h3 + ul > li + li a:hover, 
 h3 + ul > li + li a:focus {}
h3 + ul > li + li a:visited {}

For comparison, the equivalent code for Listing 2:

.about a:link {}
.about a:hover, .about a:focus {}
.about a:visited {}

Now, so far we've assumed that there will only be one instance of the mentioned h3 + ul adjacency. What happens if our document is littered with this particular one-after-the-other combination? How would we target a single item and style the links we mentioned above? What if we're relying on a parent element (perhaps an id assigned to the <body> element) to specifically style the link on a per-page basis? Let's extend the above example to address these what-ifs:

#homePage h3:nth-child(3) + ul > li + li a:link {}
#homePage h3:nth-child(3) + ul > li + li a:hover,
 #homePage h3:nth-child(3) + ul > li + li a:focus {}
#homePage h3:nth-child(3) + ul > li + li a:visited {}

Clearly there's a problem here. As the markup simplifies, the necessary selectors increase dramatically. The current state of affairs says that the CSS equivalent of an old-school table-based design reduces the code's byte-size by roughly 50%. That's not a hard number, but the case is strong that file size savings are to be had. If the markup simplifies further, will the CSS complexity (and size) increase proportionately, or exponentially?

I've explored just this in the past. I'd even revise my numbers upwards—I now foresee writing CSS files of 80k to 100k and beyond. While comparing this experience to old style sites that wasted all their bandwidth in the markup is a valid point, I can't help but wonder about the future once we've stopped doing that.

There's a digression here about authoring files of that size, especially when you need to edit but one single selector. I'll save it, since I'd imagine most who work with 15k CSS files can easily see the consequences of that much more bulk.

There's another digression here about the selection methods used above, and how they're ultra-reliant on linear ordering. This again will be saved, though it bears investigation some time down the line.

Most of this isn't a problem now. Odds are it won't be for many years. But this is the ideal that we're supposedly moving toward, and I'm not quite sure how ideal it is.

Permalink › | 73 comments


January 5

Support the standards and nothing but the standards, regardless of whether or not browsers get them right?

-- or --

Support what standards are available given today's browser support, and kludge together markup/script/CSS hacks to overcome deficiencies in implementations?

Oh sure, 'use the best choice for your audience'. That goes without saying. But there's an underlying philosophy that drives your decision-making process, and I know there are people who fall into each camp.


Permalink › | 54 comments

This is Progress?

January 4

From an iBook on my lap, wirelessly connected to a router plugged into a cable modem connected to my service provider, wired into the internet backbone with countless hops between here and NASA's web servers, which dish up live video feeds from inside their terminal, panning shots of live data feeds [time-shifted, of course] direct from Mars, and the only thing I could think to myself was "boy is this low quality."

One can only dream of the things we'll take for granted tomorrow.

Spirit has landed.

Permalink › | no comments

acronym vs. abbr

January 3

Pedantry aside, I'm going to be kicking myself in a few years when I have to strip all the incorrect <acronym> tags that litter my code today.

(To the curious: debates rage over the finer differences between <acronym> and <abbr>. Technically, you should be using <abbr> in most cases, except that MSIE doesn't support it, which means your work is going to waste. Then there's the accessibility side of things, which says that your good intentions in at least providing something to IE users is actually causing harm on the screenreader end of things. And there's the code issue which says you're using the wrong element to get the job done, which is a hack and will result in the scenario I began with. And finally there's the main accessibility bugbear which suggests the common screenreaders don't support either yet anyway. And all this over two seemingly redundant tags. Oy.)
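For reference, the two elements share identical syntax; the expansion lives in the title attribute either way, which is part of why the debate feels so pedantic:

```html
<!-- Same syntax, different semantics. abbr is the technically
     correct choice in most cases; acronym is the one IE supports. -->
<acronym title="North Atlantic Treaty Organization">NATO</acronym>
<abbr title="Cascading Style Sheets">CSS</abbr>
```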

Permalink › | 10 comments

A Third Strike for Apple

January 2

Let's review:

  • I bought my G3 iBook in September. My first Mac purchase ever.
  • One week after, Panther's release date was announced. I'm still on Jaguar.
  • Two weeks after, Apple announced an upgrade to the G3 iBook line: G4s for the same price.
  • Now it's coming out that the logic board on my particular model is flawed, to the point where there are rumblings about a class action lawsuit.

Apple has an upgrade program that gives users who buy before a certain date a cheap upgrade. I'd have been perfectly happy to pay for one, but I bought a week and a bit before the cutoff date. I called. It was recommended I submit a letter. I submitted. It was denied. I called again. Sorry, no chance.

Given the above list, the least I'd expect is Apple throwing me a coupon or a gift certificate or an OS upgrade or something. Apple is famous for poor customer service/appreciation, but I think this is crossing the line. I'm not a happy customer.

Listen, I know all this sounds like sour grapes. But I really want to like Apple. The operating system is excellent, the products are insanely well-designed, and the subversive anti-Microsoft sentiment is fun. They're just making it really hard for this first-time buyer to stay enthused.

I'll most likely be buying again in the near future and selling this iBook, since my needs have changed and it's not nearly powerful enough for what I need to do with it. The choices on the table are a 15" PowerBook (which, rumour has it, features an ugly spotted screen in some cases) and a cheaper Windows machine that will run circles around the PowerBook, performance-wise, and cost at least a thousand dollars less.

It's not looking good for Apple at the moment.

Permalink › | 48 comments

Happy New Year

January 1

There is a story I've always wanted to tell you. This is not it.

I can't say everything, in fact I shouldn't yet say anything. But I can refer you to the fact that the Zen Garden's special notice has been down for some time now. I can tell you that a certain consulate in Buffalo has been moving in its mysterious ways.

I can tell you that family and friends are what's important, and if you do all you can every time to put them first, the rest is just details, and the rest doesn't matter, and the rest just works out.

2004 is already one of the best years I've ever had, and it has only just begun. Here's hoping yours is the same.

For those that are interested in such things as what other people read, my 2003 Booklist is up. It's about half of what I managed to read in 2002, but this past year was busier than most, to be fair. And there was no silly loser-makes-dinner month-long competition with my wife, also in the interest of fairness. (addendum — though she'll deny it to the bitter end, I won)

Permalink › | no comments