Full Archives

Standards Resources for Beginners

August 25

Questions about beginners resources have been coming fast and furious lately; let's build a list!

I'm not the first to have been asked recently about resources for those new to standards, and I won't be the last.

But I have a hard time fielding these questions too, because I've stopped looking for (or even keeping track of) the types of resources that cover basic principles. I'm aware of WestCiv's great free courses. I know of MaxDesign's growing list of tutorials. I recommend DWWS to anyone who will listen. But comprehensive resources that assume no prior knowledge? It's been a long time since I've seen one of those.

So, rather than respond to each person individually and point them to books that are probably too advanced for the people they're trying to teach, I figured I'd open up this question for feedback and see if collaboratively there's enough knowledge to build a good list of beginner's resources.

Ground rules: any resource is fair game, online or off. Use discretion and ask yourself whether it's something that would be truly educational for a person with no previous web knowledge at all, or at least no experience beyond old-school tables. Try to make this a useful list for a true beginner, instead of a list of what you found interesting reading last week.


Re-discoverability

August 24

Analyzing the way I interact with my browser highlights some downright quirky behaviour on my part, and a findability gap that needs filling.

In my 7 or 8 years on the web, I've never used bookmarks. Sites change, URLs go missing, and all too often when I need to find something again I turn straight to Google (previously AltaVista, and before that Yahoo). The half second it takes to save and categorize a page so I can come back to it later has just never seemed to pay off.

Solutions that offer bookmark exporting, sharing and archiving don't feel like they're solving anything. They still demand too much conscious effort to save a list of links which may or may not be outdated when I need them.

A solution like TrailBlazer is what I'd be interested in, were it simply an add-on to search my existing browser's history instead of a stand-alone browser. Searching an automatically-generated list of recently-viewed information is far more useful to me than having to manually save each link I find useful. Anything I haven't been to in longer than my history covers doesn't need to take up space when I can find it (or an equivalent) on Google.

So because of my reluctance to take action myself, instead of bookmarks I've learned to rely on auto-complete in the address field. All the shortcuts I'd normally create as browser buttons instead get typed as fractional URLs, thanks to Safari's lightning-quick ability to fill in the most likely match. google.com has become 'g', mezzoblue.com has become 'mez', etc. etc.

In fact, auto-complete is becoming so ingrained I've started clearing my history to influence what pops up. Occasionally a typo here and there will redirect my short strings elsewhere, or manually typing in a longer URL will override the short saved copy. Then it's time to flush and re-build; because the auto-complete buffers I rely on don't have that many entries, I can justify a total refresh. It's a light-weight, disposable, and unreliable system... but it works.

I wouldn't advocate anyone else getting used to this way of finding things. This reluctance to use bookmarks highlights the problem with re-discovering local information. Google usually works just fine as my backup brain when I need to find a web resource. When it comes to information I know I've seen but can't describe, my local tools just don't cut it. Spotlight looks great, but does it search the right things?

My quest continues. I'm sure there's something out there that does this already, that will search local histories and resources to find recently-viewed information. In fact, I'm sure I read about it recently. I just can't find it.


A Bit of Transparency

August 23

PNG files would make transparency effects a snap, except for one tiny little quirk: support for the format is great in every browser but one. Unless, of course, you resort to using some of the hacks and workarounds now available.

A List Apart ran Cross-Browser Variable Opacity with PNG: A Real Solution almost two years ago, and since that time other methods have come to light offering similar solutions of varying stripes to work around Windows IE's lack of Alpha transparency support. (1, 2)

Since I'm a fan of valid code, I generally haven't bothered messing around with the proprietary solutions, and the extra weight of an .htc behavior file is a bit more than my personal preferences can handle.
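
For reference, the proprietary route generally means IE's AlphaImageLoader filter, which looks something like this (a sketch with a hypothetical selector and image path; filter is invalid, Microsoft-only CSS, and the element generally needs explicit dimensions for it to kick in):

#example {
 width: 231px;
 height: 94px;
 /* IE 5.5+ only; renders the PNG with its alpha channel intact */
 filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='/i/example.png', sizingMethod='scale');
}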

When it came time to code certain elements of the new design of this site, I knew what I wanted to do: tile a 1-pixel, transparent PNG on :hover for advanced browsers to add a nicely transparent screen, and hide that effect from IE, which would simply ignore the :hover state. As I coded I realized I could use two images, one for browsers that knew how to handle PNG properly, and one for IE.

There are two spots where you can see these effects come into play—the logo in the top left corner of the header displays a very faint blue screen on mouseover (subtle enough to be missed, which is okay by me), and the three big content panels on the home page do the same with a more obvious red screen.

The markup is basic enough, although an extra <span> was necessary to pull it off:

<ul id="contentNav">
 <li class="li2"><a href="/projects/">
  <span>Projects</span>
 </a></li>
</ul>

By layering the elements one over top of the other, three overlapping planes were created which I could use to manipulate imagery (from the bottom up: the li, a, and span elements). The label of each panel shouldn't be covered by the :hover effect so it sits on the top plane (the span), the transparent screen is applied to the middle plane (the a) but only on :hover, and the toned photo that makes up the background of each panel is applied directly to the li behind the rest of them.

With a nod to Jon Hicks' 3D box model example, this is more or less how the panels are constructed: (well, actually, on the live site the background image is instead applied to the ul, not the li, but I'm simplifying... stay with me here)

[Image: Pseudo-isometric 3D view of the link layering]

The corresponding CSS sets up the containing boxes, places the images, and positions everything just so:

#contentNav {
 position: relative;
}
#contentNav li {
 list-style: none;
 padding: 0;
 margin: 0;
 position: absolute;
 width: 231px;
 height: 94px;
 background: url(../i/ice/contentnav-panels.gif) 0 0 no-repeat;
}

#contentNav li a:link, #contentNav li a:visited {
 width: 231px;
 height: 94px;
 display: block;
}
html>body #contentNav li a:hover {
 background: transparent url(../i/ice/alpha-red-dr.png);
}

#contentNav .li2 a span {
 width: 56px;
 background: url(../i/ice/contentnav-projects.gif) no-repeat;
}

The html>body filter on the :hover rule ensures IE doesn't get any fancy ideas about the PNG, and everyone's happy.

Well, except for the fact that there's no feedback when the link is hovered in IE, a result I wasn't happy with. I tried setting a solid background color on :hover to block the background image, but that seemed excessive. Then I remembered an old-school hack for semi-transparency: using an on/off pixel screen to create a checkered, pseudo-transparent GIF. Check it out:

(caveats: these inline examples aren't BMHed, won't work in IE5, and you'll need to visit the site and have your browser grab the custom style sheet to get the hover effect. They're not pixel-perfect either, but you get the idea.)

The panel on the left is what Safari/Firefox/Opera users should see. The panel on the right is how IE sees it (regardless of which browser you're using). The left is preferable, but the right is a decent enough downgrade that I'm happy with it. (On an LCD monitor there's even a bonus optical shimmering effect that was unintentional, but kind of neat.)

Because of the filter that keeps the PNG away from IE, not much more was needed than to throw an alternate rule into the mix for IE, placed just before the existing rule that immediately overrides it in every other browser:

#contentNav li a:hover {
 background: transparent url(../i/ice/alphafake-red.gif);
}
html>body #contentNav li a:hover {
 background: transparent url(../i/ice/alpha-red-dr.png);
}

Of course if I hadn't got all of this working, I suppose I could have gone the high-bandwidth route and just thrown out a brand new pre-rendered GIF on the :hover state for each panel. But this way keeps the pipes unclogged, and that's a good thing.

(And I'd be remiss not to point out that CSS3's opacity property is gaining support. But I wouldn't consider it generally useful for production work quite yet, hence the images.)
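
For the curious, a minimal sketch of what the opacity route might look like (a hypothetical rule, not what this site uses; the support notes are approximate for current browsers):

.screen:hover {
 opacity: 0.5;              /* the CSS3 property */
 -moz-opacity: 0.5;         /* Mozilla's older prefixed form */
 filter: alpha(opacity=50); /* IE's proprietary, invalid equivalent */
}

One reason to prefer the background-image approach anyway: opacity fades an element's content along with its background, so the label inside each panel would wash out too.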


Fitts' Law

August 19

Taking a page from the book of HCI specialists, mezzoblue v5 makes liberal use of large link target areas for the sake of easier use.

First introduced to me by Kevin Cheng, Fitts' Law states:

The time to acquire a target is a function of the distance to and size of the target.

On the web that means the bigger your link area, the better. It also roughly means that grouping elements likely to be used together is a smart idea, but the target area is what I'm interested in at the moment.
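
For the mathematically inclined: the usual Shannon formulation (not part of Kevin's quote, but standard in the HCI literature) is

T = a + b log2(1 + D/W)

where T is the time to acquire the target, D is the distance to it, W is its width along the axis of motion, and a and b are empirically-determined constants. The log term means widening a target pays off most when the target was small to begin with.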

Once stated, it's a fairly obvious and intuitive principle that's all too often overlooked. The new header of mezzoblue puts the idea to work. Instead of maddeningly small target areas that surround only the link text itself:

[Image: Too-small target areas]

I've adjusted the link boundary area to fill as much logical space as possible and create non-visible, clickable haloes around each link:

[Image: Much larger target areas]

While it might have been possible to expand each even further into the boundaries surrounding it, the key word is 'logical'—the ability to click an area with no apparent link is more confusing than useful.
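
The halo itself is plain CSS; a minimal sketch (hypothetical selector and values, not the actual mezzoblue rules) is to pad the link outward and cancel the padding with negative margins so the surrounding layout doesn't shift:

#siteNav a {
 display: block;       /* let the link box grow beyond its text */
 padding: 12px 18px;   /* the invisible, clickable halo */
 margin: -12px -18px;  /* pull the box back so nothing else moves */
}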

A larger clickable area means less precise mouse movement is required to focus on the link. Without impacting the design in any way (the halo is non-visible, after all), the usability of the header improves. While the main goal is increased usability in this case, there are positive implications for accessibility as well. Those with motor skill disabilities may have difficulty using a mouse; larger areas to click make the act easier, which makes a big difference to them.

A personal example: Windows XP comes with a utility called MouseKeys that allows those who are uncomfortable with a mouse to control a cursor with the keyboard instead. It's frustrating and inaccurate, but I'm forced to use it when on my PC due to the testing setup here.

My PC keyboard navigation skills have always been sharp, but MouseKeys are tough to operate efficiently; the cursor is either too fast or too slow. Smaller areas are much trickier to hover over, so larger links are helpful and appreciated, especially for important elements like primary navigation.

Though I'm using CSS to apply Fitts, the two sometimes collide. The secondary navigation on this site, which appears in the red contextual bar underneath the header, could use more space, for instance, but the background image used for the faint shadow effect requires precise clipping. (Throwing in an extra <span> element could work around this if I were so inclined.) And I don't see a comfortable way to apply the principle to links within body copy, since a larger link area would shift the type baseline and cause ugly gaps between lines of text.

This application of Fitts is just a preliminary stab; there is definitely room for improvement. More reading on Fitts' Law.


Design Matters.

August 17

Though I realize this is largely preaching to the choir, I feel like sharing some revealing comments that have crossed my radar over the past few days.

I've observed and participated in arguments where functionality is touted as a prime directive, and visuals 'are just decoration'. There are people who genuinely don't understand the influence of how something looks on how it's perceived by users/viewers/audience/whomever.

Granted, functionless form is pure art, or worse. I believe in balancing the two, and strive to find ways to make them work together instead of at cross-purposes. But reactions to the two successive redesigns of this site over the past few months have been insightful. I believe they go a long way to illustrating that content requires presentation, and function requires packaging.

The statements below are collected from my inbox, and various sites around the web. Some are nice, some are not; the trend is more important than any individual comment, and there's definitely a trend.

"I just wanted to let you know that your NEW design is awesome!"

"The new Proton layout isn't finished yet is it? It's ugly as hell!"

"...better than his last "design"..." (quotes transcribed as written)

"I'm glad he changed it... I was starting to visit less often because some of my confidence was shaken... but it's been restored now :)"

"Indeed, much better than the last and back on form, fantastic work."

"Love the new design — didn't really like the last one (no offence!) but this new version is very readable and pleasant."

"I love v5; I can now justify coming back to mezzoblue."

"Love the new redesign. It's much better than the one you had before."

"When you redesigned to "Proton" I was initially disappointed with it to be honest."

"The new redesign looks good, I was getting sick of the old one not working properly in Opera."

A good design lends credibility. A bad design hurts it. These statements require no further explanation.

Addendum: Not to say that Proton didn't have its fans. Some did like it, and I've had more than one conversation about where it succeeded (and where it failed).

I'm not admitting defeat here; I tried many things, some of them worked, and some of them didn't. I've learned from those that didn't, and v5 is a natural progression from v4. Believe it or not.


Because…

August 15

Yet Another Redesign. mezzoblue v5 goes live.

Why yet another redesign so soon?

Because fixing problems with an existing design is work, and this site is a break from work.

Because I was weary of looking at what was here before.

Because nothing beats the satisfaction of having done work good enough that you keep sneaking peeks throughout the day, when you're supposed to be on other projects.

Because despite it being summer, some are stuck indoors anyway.

Because itches deserve scratching.

Because sometimes you win, and sometimes you lose.

Because the subtle details are overlooked if the first impression doesn't resonate.

Because brand is important, and change is resisted.

Because Jupiter aligns with Mars.

But mostly, this one's for me.


Live from SIGGRAPH

August 11

In the tradition of my running update during SXSW earlier this year, I'll be covering the SIGGRAPH experience as I'm here in LA today and tomorrow.

Though SIGGRAPH is focused on an industry somewhat tangential to most of what's discussed on this site, there is a web graphics path, which is why I'm here (if you're attending, I'm speaking in the web graphics room [501AB] on Thursday, Aug 12th sometime after 1:45pm—probably closer to 3pm. Come say hi).

The main difference between SIGGRAPH and every other conference I've been to so far is scale. This one is huge. 20,000 people huge. It's the largest conference in the computer graphics industry, and there's an overwhelming number of things to do and see.

In reverse chronological order, here are some of the more interesting encounters so far. (Pardon any broken links, wireless here is spotty and slow due to volume, I'll fix them as I can.)


Friday

10:30pm — It's official: dead logic board. Luckily still under warranty, unluckily meaning that SIGGRAPH came and went without another update. Next year.

Wednesday

3pm — Dammit Apple. Not this. Not now.

I travel with my older iBook because it's tiny and actually fits on a plane's tray table. Not to mention that it's less valuable.

The same iBook that, based on serial number, definitely isn't supposed to be affected by the logic board problem plaguing older models. The same iBook I've already been burned twice by. The same iBook that is now randomly freezing with blurry lines filling the screen, and refusing to actually turn the screen on upon reboot.

My presentation is backed up on this server, so at least I don't have to go into emergency panic mode. Updates may not be as frequent as I thought though.

11am — I sat in on a few web graphics sessions revolving around navigation. The one that jumped out at me was a project called 'Okinawa Wonder' which required intelligent interaction with over 10,000 pages. Though it wasn't clear to me what the data was for, the problems were universal, and the metaphor they used particularly clever.

All data is mapped as a galaxy, each point a 'star'. Over time the frequently accessed data points spiral out toward the edges, becoming more prominent, and the less-accessed pages float to the center and eventually disappear. There must have been some magic happening I missed because the 'stars' themselves were images, and even though it was a minimal interface, 10,000 images stored in memory and rotating around a central axis in real time feels like a long shot.

The metaphor was expanded with further user-configured mapping techniques, notably 'constellation' and 'planet' modes which weren't explained, but almost don't need to be.

The demo was an interesting way of seeing how other people see data. It's this sort of application that the up-and-coming web application war needs to address; at the moment the only feasible technology to deploy the system on is Flash, or a custom plug-in like they've used in this case.

10am — A quick run through the Emerging Technologies exhibit tells me I'll need to go back and see more. Some of it was yawn-inducing, but there were some majorly impressive demonstrations.

A company called Sunnybrook Tech, which I just noticed is also from Vancouver, is demoing an amazing new LCD technology. Your average LCD monitor is considered good if it has a 600:1 contrast ratio; theirs does 40,000:1. The images are ultra-vivid, especially in the darker ranges where traditional LCDs fall down. The secret is individual pixel darkness control instead of a backlight, and 16-bit brightness control (as opposed to 8-bit on a regular monitor).

There were a lot of 3D demonstrations, some visual, some tactile. Of note was a prototype of a moving platform which I can only describe as 'intelligent stepping stones'. Which movie was it, Star Wars Ep. 1? X-Men? The one with a scene where someone walks across thin air, floating platforms rushing out to catch his feet just in time. It's sort of like that: three platforms with wheels rush into place to catch the person's foot, basically allowing them to walk in place without going anywhere.

One I didn't get to try was a tactile feedback simulator using air jets instead of wires and objects. A 3D scene was projected onto a table, allowing some basic interaction with resistance generated by compressed air, simulating actual objects.

A couple of LCD panels were mounted on poles, allowing someone to rotate them 360 degrees. For every degree of rotation the scene altered accordingly, simulating a complete walkaround of various objects.


Virtual Hosts for Dummies

August 5

Running your own local Apache server for development is a great idea, and even better if you've enabled local virtual hosts.

As demand for open source software increases, so do the options. Popular packages are frequently ported to different platforms, so it's fully possible to run a local install of Apache regardless of which operating system you use on your workstation.

The stumbling block is mainly know-how, which is fortunately an easy gap to fill. I am decidedly not a system administrator, but I've run various Apache installs over the past year — without much conviction I should note, so learning has been slow.

After this past weekend's rebuild, I got around to re-configuring a fresh copy of Apache Complete last night. Here are two tips from that experience.

Virtual Hosts

Virtual hosts enable you to intelligently run multiple sites on a single server. The useful side effect is that with proper setup, you can point your browser to www.whatever.whatever and load a local copy. My development site is now www.mezzoblue.dev, which works exactly the same as the .com, just faster. I don't even need a connection to work on it, because it's all local; all my PHP scripts and Movable Type templates work, and the local filesystem access is so much nicer than using FTP.

I've long been aware of their use, but never committed to learning how to set up virtual hosts in Apache. A conversation with Narayan of Etherfarm on his recent trip through Vancouver enlightened me, and it's really ridiculously easy, to the point where I wish I'd done this last year.

Find your httpd.conf file, and then add this line somewhere near the bottom (there's a spot with example virtual server code; I stuck mine below that):

NameVirtualHost 127.0.0.1

You might want to run a search for 'NameVirtualHost' within the file beforehand to make sure it's not already set, or is at least commented out with a preceding octothorpe (#).

Next add an entry for localhost pointing to the root of your web server, so that typing localhost in your browser's address bar continues pulling up the default site:

<VirtualHost 127.0.0.1>
	ServerName localhost
	DocumentRoot /Path/To/WebRoot
</VirtualHost>

And finally, for each individual virtual site you wish to run, add a new entry pointing to the proper directory. This is especially useful because, at least on Unix-based systems, this means it can sit anywhere in your filesystem. I'm used to dumping all sites in a special 'www' folder and pointing the web server to that; this way I can simply direct Apache to, in my case, /Volumes/Shine/mezzoblue/www and keep all articles, .psd files, and web files in one folder.

<VirtualHost 127.0.0.1>
	ServerName www.mezzoblue.dev
	DocumentRoot /Volumes/Shine/www/delhi
</VirtualHost>

Host Redirection

But here's the part I didn't read, which I should have—you're only halfway there once you've set up Apache. In a bout of doing one thing while having a hunch I should be doing another, I wasted far too much time configuring and re-configuring httpd.conf before realizing I also had to tell my computer where to look for this host I'd just invented. It's in the articles I linked above, but I missed this step. May you not make the same mistake.

Simply, all that's required is editing your local computer's hosts file so that your browser is able to resolve the fake names for your development sites. For each entry created in httpd.conf, a corresponding entry needs to exist in your hosts file. Instructions on how to do this vary between operating systems, but it's not hard to find tutorials. (The step-by-step instructions on this evolt article work perfectly in Panther.)
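
On Mac OS X and other Unix-like systems the file in question is /etc/hosts, and each entry is one line mapping the invented name back to your own machine; a sketch matching the httpd.conf examples above:

127.0.0.1	localhost
127.0.0.1	www.mezzoblue.dev

(On Windows XP the equivalent file lives at C:\WINDOWS\system32\drivers\etc\hosts.)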

And That's It?

As long as the directories you've pointed your virtual hosts to exist, and have files in them, you now have customized local hosts for your sites, on your own computer. Make sure to restart Apache before checking, which may require command line access, depending on your install. Other than that, you're done.
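
On a typical Unix-like install the restart is a one-liner; sudo may or may not be required depending on how Apache was set up:

# force Apache to re-read httpd.conf
sudo apachectl restart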

Getting 403 errors?

If you've done everything above and can't access your new site due to a '403 Forbidden' error, you have a permissions problem. If the directory/files you wish to load haven't been given read permission for all users, the web server won't be able to access them. As a general rule, public-facing directories and files should be at the very least something like 644 (-rw-r--r--), and ideally 755 (-rwxr-xr-x). If you don't know what this means, a quick introduction to Unix permissions in general, and chmod in particular, is in order.
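
A quick, non-destructive way to inspect what you're dealing with (hypothetical paths):

ls -l /Users/dave/Sites     # permissions of the files inside the web root
ls -ld /Users /Users/dave   # -d lists the directories themselves, not their contents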

Now, assuming everything looks alright, you may still get a Forbidden error. Turns out this is due to the execute permission on the folders above the root of your virtual host on the server; namely, it must be set. For all of them.

If you have a directory serving as your root like so:

/Users/dave/Sites

...then every directory above Sites must have execute permissions set for all users. If you're feeling brave, a quick and easy way to do this on the command line is as follows:

cd /
sudo chmod -R 755 /Users

You will be prompted for your local computer administrator password. This will change the permissions of everything from /Users on down the hierarchy to 755. Fine if you're in a fairly low-risk situation, like having a firewall between you and the public internet (and trusting the local network your machine is attached to). Problematic if not. Otherwise, you'd simply do it on a per-directory basis:

cd /
chmod 755 /Users
cd Users
chmod 755 dave
cd dave
chmod 755 Sites

If needed, add the sudo command before each chmod line and authenticate when prompted.


Reboot

August 1

"If there's one adjective to describe a brand new 15″ Powerbook, 'slow' should not be it." And so, with those words of wisdom from an Apple tech, I rebuilt my 2-month old system this weekend.

What had originally gone wrong, I surmise, was the migration of data and applications from the older iBook. Instead of doing a direct transfer, I was using RsyncX, a GUI front-end for the Unix rsync utility, which allowed me to perform incremental backups instead of transferring gigs of data every evening as I backed up my system.
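
Under the hood, that amounts to something like the following rsync invocation (a rough sketch, not my exact setup; note that -a, archive mode, is what preserves permissions and ownership, which becomes relevant in a moment):

# -a: recurse and preserve permissions, ownership, and timestamps
# --delete: drop files from the backup that no longer exist at the source
rsync -a --delete /Users/dave/ /Volumes/Backup/dave/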

Sounds good in theory, but where it went wrong was when I originally set up the Powerbook with a different default account name than the iBook. Because rsync preserves permissions (which are far more important in the world of Unix than in the land of Microsoft) and the permissions were mapped to the old account, data and settings were inaccessible until I dropped into the Terminal and chmoded my way free.

Okay, so, this also should have worked, in theory. But I suspect my fumbling through the data transfer and subsequent repair efforts caused problems that manifested in the following way: a 1.5GHz G4 felt slower than an 800MHz G3 with a similar amount of RAM. The system diagnostics CD told me the hardware was fine, so something a little more subtle was happening.

After using Windows for ten years I'm used to using a hammer where a screwdriver is better suited. The old standby when nothing else works is reinstalling your OS from scratch, and since I'm new enough to Mac ownership that I don't yet know a better method, that was my way out.

Gritting my teeth and making gratuitous use of my external Firewire drive (there is nothing greater than blazing transfer speeds; we're talking gigs per minute) I buckled down to re-partition the hard drive and install a fresh copy of the OS. Two hours of waiting for status bars, clicking the odd button, inserting new CDs, downloading essential third-party apps and I was back up to speed. Just about everything is back to the way it was before, with the minor exception of my NetNewsWire list. Whoops, forgot to save out my 130-odd subscriptions. Oh well, at least the important stuff is covered; it's time to find some new sites and prune the cruft anyway. (side note: if you don't have RSS auto-discovery happening on your site, I'm very annoyed with you at the moment.)

It will be a few days before I know for sure whether this made the difference, but it feels way, way faster already; enough so that a smile of relief has been plastered on my face all afternoon. Lesson learned: permissions are a cruel mistress. Treat them kindly.
