In Praise of My “Legacy” iMac

A roundup of festivities on the 30th anniversary of the Apple Mac—L.A. Times—JAN. 24, 2014

Newer is not necessarily better.

I have been a Mac user since the first Macintosh appeared in 1984, with its tiny monochrome screen, its “whopping” 128K (1K being roughly a thousand characters) of memory, and a micro-diskette drive for disks that were not floppy but compact and rugged and could store 400K. And the Mac sported that new thing, the mouse.

I had not previously been tempted to buy a personal computer, with their command lines, textual interfaces, and lack of any capabilities not already better served by the powerful mainframe computers I used in my work. Then I saw the ads, the graphics, and the point-and-click menus. I was hooked.

I never did much with that first Mac except write on it, but I loved its WYSIWYG (what you see is what you get) display and its ability to change formats and fonts without embedding any cryptic codes. I fancied myself a “desktop publisher.”

Then, over the years, Macs evolved. Their memories expanded, augmented by internal hard drives and micro-diskettes with greater storage capacity. I recall being astounded by my first internal drive that could store a megabyte of data—a million characters! It could hold a book!

Mac cases changed—repeatedly—growing and shrinking—screens lit up with color and grew in size. The microchip revolution was raging and each Mac could hold more data and process it faster—but all that was also true of Mac’s rivals running Windows.  

I remained a loyal Apple/Mac customer because it was easier to use and its basic applications—for mail, calendars—and then music—iTunes—delighted and made me more productive at home. At work, I still used Windows computers. They were difficult and cumbersome, requiring a support infrastructure to keep them maintained and operational. I had no trouble keeping my Macs humming along.

What ultimately hooked me was the rollout of desktop video editing. I had experimented with 16mm filmmaking as a youth and never pursued it because it was too expensive—for cameras, lenses, editing benches, stock and processing. I was unimpressed by VHS video, even as the cameras became more sophisticated and less expensive.

Two things happened around the same time—affordable feature-rich digital video prosumer camcorders and Apple’s Final Cut Pro (FCP) desktop editing tool. Now, I could shoot exquisite video and edit it on my desktop. I renewed my vow of customer loyalty to Apple.

FCP and the Mac co-evolved until the application became a virtual studio suite—Final Cut Studio—and the desktop computer became a self-contained 27” high-definition screen. My early experiments with digital video—short films—evolved into a growing production business and my second career.

I had everything I needed on my desk to write screenplays, to edit, finish and distribute High Definition video—all integrated seamlessly with the Mail, iCal, Office, and Quicken apps I used to run my business. I was a one-man band and studio, and loved it.

It all flowered for me in Final Cut Studio 3 and Final Cut Pro 7 running on Mac OSX 10.6.8—my beloved Snow Leopard operating system.

Alas, then Steve Jobs departed and Apple became a phone and media company. First, Apple betrayed its loyal base of Final Cut users, who had been integral to the Mac’s success, ceasing to support the application and replacing it with a jazzed-up iMovie that they audaciously call FCPX. The replacement lacked key features and required a completely new learning curve. Apple’s loss was Adobe’s gain, as video producers and editors migrated en masse to Adobe Premiere and Adobe Creative Suite. I stayed gamely with FCP even after Apple abandoned it. It still worked, still did everything I needed, until it didn’t.

Apple Mac architecture and operating systems continued to evolve, and at some point FCP would no longer run, so I stayed with Snow Leopard and eschewed system upgrades on my studio Mac.

When that Mac died, I replaced it with a refurbished vintage iMac that would run FCP under Snow Leopard, and I have done so several times since. Every day I back up my disk image onto an external drive. When the computer dies, I replace it in kind and resurrect my familiar work environment from the backup.

I’m not a born-again Luddite. I still buy new computers. I have a newer iMac running the current Apple OS, mainly because other applications I use—like Chrome—no longer run on my old platform. I like Mac OS less with every release—it gets less accessible, requires more machine to operate, and runs more slowly.

My old Apple Mail program is no longer reliable, but I find the current version on my “new” iMac lacks features I’ve come to require. So, I’ve switched to Web Mail. The new version of iCal lacks features I depend on for time management, like an integrated To Do list that lets me prioritize and sequence tasks before I commit them to the calendar. I still use iCal on my trusty “old” Mac.

I recently purchased a Windows 10 laptop (yes, Windows) because it’s a better buy than a Mac laptop, runs faster, and seems to require less support than the older Windows machines I used in my corporate IT days. My next personal desktop may not be a Mac, for the same reasons.

Like corporations that must keep old mission-critical business systems running in real or simulated “legacy” environments, so do I: for writing, video work, and trusty old iCal on my Legacy iMac. As long as I can find a Mac that will run it, I’ll keep the legacy alive.