On my listserv GranneDev, we've had an excellent discussion over the last several days. It all started when I, a non-Mac user, sent the group an email from a man who had done a lot of research supporting the idea that schools should use Macs instead of Windows (you can read his very interesting report at http://homepage.mac.com/mac_vs_pc/MACvsPCCombined.pdf).
Several people on the list responded with the same basic assertion; I'll use the first response the list received to illustrate the general argument:
"The very idea that a school system should use Macs because it would benefit the students more is absolutely ludicrous. When those students leave the school and enter the Windows-based world they'll be completely unprepared." ~ J. H.
I think the opinion expressed by J. H. and several others is inaccurate and damaging to students. Once you think about it, the argument that we need to teach students Windows in order to prepare them for the real world just doesn't add up.
Let's say we decide to teach students whatever is dominant in computing, since that's what they'll be using when they get out of school. And let's say it's 1991. They're going to learn Windows 3.0. Not 3.1, but 3.0. Except that Windows 3.0 isn't dominant at the time: it's popular, but so is OS/2, and so is the Mac. So which do you teach? And which applications?
Windows 95 came out in 1995 with a completely different interface from earlier versions of Windows. Windows 98 changed things around as well. Windows NT, meanwhile, had its own interface and was designed for businesses. So should schools be teaching Windows NT? Then Windows 2000 arrived with a new interface, and it too was aimed at businesses. Now we have Windows XP, which is for both consumers and businesses, but it sports yet another completely new interface.
Mac OS is guilty of this as well. The changes from Mac OS 7 to 8 to 9 were substantial, and now Mac OS X is a complete rewrite of the UI.
Linux GUIs change as well. KDE 1 is very different from KDE 2. And so on.
And think about applications. Microsoft Office, AppleWorks, StarOffice, Corel Office … really, what's the difference? A few minor features, and the "bold" button sits in a different place. But they all do the same thing.
And let's also think about what is actually out there in offices. A lot of businesses use Windows 2000, but a lot also use Windows 98. Which do you train students for? Some businesses use Mac OS. And some—*gasp*—actually use Linux! And some use mainframes or DOS terminals! How is a school supposed to know for sure what its students will be using when they enter the workplace?
And remember, for a 9th grader who's going to college, that's almost EIGHT YEARS later! When I was a 9th grader, Apple ][s were all the rage. Should I have been trained exclusively on the Apple ][? How is a school going to know what will be in widespread use in eight years? Especially when Windows, Mac OS, & Linux change interfaces every two years or so?!
The larger point is that learning WIMP is what's important. And what's WIMP? Windows, icons, menus, pointing device. The basics of any modern GUI (graphical user interface). Students need to learn how to use a computer, not the computer.
Things change in the computer industry. Things change rapidly. It's quite simply a myth that there is some carved-in-stone, never-changing entity called "Windows" that students should learn because it's what the rest of the world is using. When I was still a Technology Coordinator at a high school several years ago, I bought into the idea that students had to learn Windows. I realize now that it's not a good idea.
Let's teach students computers, not Windows, or Mac OS, or Linux. Let's teach students word processing, not Word, or AppleWorks, or WordPerfect. Let's teach students the Web, not Internet Explorer or Netscape. Let's give students flexibility by truly educating them, not lock them into some false idea of "train once, good for a lifetime." Things simply don't work that way in the real world.