"I do find it odd that web developers tend to eschew the very browsers and platforms that the general public uses (such as IE on Windows) and reserve them only for testing. I'd feel better if these developers were as familiar with what users actually use on a day-to-day basis, rather than only touching those platforms for projects."
I understand what you are saying here, but in some ways I disagree. Developers have a comfort zone, just as end users do. It is really QA's job to catch these things; the developers merely need to be aware of the potholes they might encounter. QA is responsible for ensuring cross-browser/cross-platform compatibility and for relaying the needs of the end user to the developer. The developer shouldn't be held responsible for the end user's desire to stay on outmoded or outdated platforms. Just my thoughts.
End users don't always want to be outdated. Many are trapped in corporate environments that keep them chained to IE6. Few web companies have a dedicated QA team, at least at the scale of many of the names building the web we use. And since QA doesn't actually run user groups (it's not part of the role), it can't communicate user needs.
If a developer is skilled in the same platform as the end user, he or she has a better idea of gotchas. For example, I configure my main browser just as most clients do (not at all), so I can see potential issues (whacked fonts, missing colors, etc.) that might not be seen in my ideal or preferred configuration. This has saved my bacon more than once.
Again, you manage to bring bacon into the conversation.
I guess what you are saying is true. We aren't a large company but we do have a QA manager and people who are supposed to test every change to a given app. We are lucky that way.
Personally, I code and test using Firefox. When it comes time to deploy, I'll test view changes (since model and controller changes require little if any cross-browser checking) in IE 8 on my VM. I could conceivably check other browsers on the Windows VM, but my time is allocated to solving problems, not testing.
And it's understandable that some users (especially in our case, where some are draconian hospitals unwilling to upgrade their browsers) won't be able to update, but constantly programming for what are rapidly becoming edge cases cuts into more progressive development time.
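One way to limit the cost of supporting aging browsers (a general technique, not something prescribed in this thread) is to feature-detect rather than maintain browser-specific code paths. A minimal sketch, assuming a hypothetical helper named `supportsLocalStorage`:

```javascript
// Minimal feature-detection sketch: test for a capability before using it,
// rather than branching on browser name or version.
// `supportsLocalStorage` and `savePreference` are hypothetical helpers for illustration.
function supportsLocalStorage() {
  try {
    // Older or locked-down browsers may lack window.localStorage entirely.
    return typeof window !== "undefined" && "localStorage" in window;
  } catch (e) {
    // Some configurations throw on access; treat that as unsupported.
    return false;
  }
}

// Callers degrade gracefully instead of assuming a modern browser.
function savePreference(key, value) {
  if (supportsLocalStorage()) {
    window.localStorage.setItem(key, value);
    return true;
  }
  return false; // fall back, e.g. to a cookie or server-side storage
}
```

This keeps one code path for capable browsers and a single, cheap fallback for the rest, instead of a growing pile of per-browser workarounds.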
If you're developing a web app for a client and the client's platform is restricted to some odd browser combo, it's not an "edge case." When you know what 100% of your audience is using, that's your target.
I think the developer should be required to test on the full browser suite. That way they can make decisions on the fly that result in the most appropriate semantic and structural markup and clean CSS/JS. Waiting for QA to raise an issue forces a developer to ramp back up on a project, which costs time, and he or she may no longer recall some of the earlier code decisions, resulting in stove-piping.
I also think developers should code by hand so they can come up with human-readable classes and IDs and make informed decisions about semantic and structural markup.
It's off-topic, yes, but it's driven out of the same place — developers aren't users, but they can certainly make an effort to see what users see.