I can strip my rig's components and swap similar ones back in. I've replaced hard drives, power supplies, video cards and RAM, and I know my way around the more recent drive types (SATA vs. IDE, for the most part).
Once every year, I format my rig and rebuild my Windows installation from scratch. That includes partitioning my hard drive on my own and going through the rigmarole of making sure *everything* I could possibly need is installed before I try anything specific.
If something starts acting weird, I'm a little less self-assured. I can muck around in the BIOS if I need to, but I always ask someone who's actually a techie by job description before I attempt anything. I typically don't need said techie to intervene - I'm usually happy with a couple of pointers.
Back in college, I took basic Comp. Sci. classes, only to find out that wasn't for me. I'm too much of a Lit geek to sit through JavaScript, HTML, Lingo or Flash and enjoy it, but I've done it nonetheless. I've brushed up on and maintained my HTML and CSS, but I couldn't be bothered to code a Flash animation or a Shockwave presentation to save my life. JavaScript is *waaay* back in a corner of my mind; I doubt I'd be able to do much with it if it weren't for the wealth of knowledge bases available online.
So I do know a little more about computers than your average Joe Blow, but I couldn't be expected to pull off a full-time IT position. You could say I know enough not to assume, like so many of my teachers and peers, that PCs are good only for hopeless nerds.
There's a Mac cult on campus, I swear. Confronting anyone from the Esoteric Order of Steve Jobs is a waste of time - not that it stops them from confronting *you*. The very idea that I'd type out my thesis on a PC seems absolutely ludicrous to them!