What Is (a) WORK (Plan) Anyway?


Essential elements of work

For our immediate purpose, let’s define work as purposeful activity—mental or physical.

That definition scales up and down—we can speak of the work done making a mental decision, or writing a novel, or constructing a skyscraper. In all cases, it is purposeful mental or physical activity.

Who cares about work? Anyone who needs to estimate, plan, manage, or do it. The more you understand work in the abstract, the better prepared you will be to handle it in your life and on the job. As I wrote in a recent article, work might be virtually anything you do—including play.

I’m writing about work because I’m a process analyst and work is a kind of process or sub-process common to most other processes. Work-process is fundamental.

We typically itemize work as tasks—things to be done—on a To-Do list or in complex project or business-service plans. Most of us are concerned with tasks we need to do or have done, so let’s focus on tasks.

Let’s try something simple but complex enough to be worth a little planning.

Task: Wash your windows

You want to plan a housekeeping task: wash the windows and screens in your house. I’m going to keep it simple and assume you live in a single-story house with only one kind of window and no glass doors. What do you need to consider?

Assumptions

Available workers: yourself

Materials: Glass cleanser, All-purpose cleanser (for screens)

Resources: Bucket, ladder, spray bottle, wiper, squeegee, sponge, hose

Time available: one week (your in-laws are coming for a visit)

Work Estimates

These estimates are arbitrary and serve only as examples of itemized estimation. Your estimates might be higher or lower.

Work Size (how much work?)

  • How many windows/screens? 10 windows/10 screens

Work Effort (how much effort?)

  • How much effort (in minutes) to remove each screen? 10 minutes
  • How much effort to wash and rinse each screen? 10 minutes
  • How much effort to replace each screen? 15 minutes
  • How much effort to clean the inside of each window? 10 minutes
  • How much effort to clean the outside of each window? 10 minutes
  • Total Effort (Work Size x Work Effort): 550 minutes = 9 hours:10 minutes
    • Windows (10 x (10 + 10)) = 200 minutes
    • Screens (10 x (10 + 10 + 15)) = 350 minutes

Miscellaneous Time (180 minutes = 3 hours)

  • Breaks: 4 x 15 minutes each = 60 minutes
  • Set up for each window (ladder and tools): 10 minutes
  • Total set up (windows x setup): 10 x 10 = 100 minutes = 1 hour:40 minutes
  • Replace soap/water in bucket: 5 minutes x 4 times = 20 minutes

WORK DURATION (How long will it take?)

Minimum Duration (Work Effort + Miscellaneous Time, less the optional breaks)

(9 hours:10 minutes + 2 hours:00 minutes) = 11 hours:10 minutes

This is the least clock time for one person (you) to do the work, working straight through without breaks.
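If you like to keep estimates in a script or a spreadsheet, here’s a minimal Python sketch of the same arithmetic. The numbers are the illustrative assumptions above, not standards, and the little formatting helper is mine:

```python
# Window-washing estimate: the same illustrative numbers as above.
WINDOWS = 10
SCREENS = 10

# Per-item effort, in minutes
window_effort = 10 + 10        # clean inside + clean outside
screen_effort = 10 + 10 + 15   # remove + wash/rinse + replace

total_effort = WINDOWS * window_effort + SCREENS * screen_effort  # 550

# Miscellaneous time, in minutes
breaks = 4 * 15       # optional rest breaks (tracked, but not "minimum" time)
setup = WINDOWS * 10  # ladder and tools at each window
soap = 4 * 5          # replace soap/water in bucket

# Minimum duration: all necessary clock time, skipping the optional breaks
min_duration = total_effort + setup + soap  # 670

def hm(minutes: int) -> str:
    """Format minutes as 'H hours:MM minutes'."""
    return f"{minutes // 60} hours:{minutes % 60:02d} minutes"

print("Total effort:     ", hm(total_effort))       # 9 hours:10 minutes
print("Minimum duration: ", hm(min_duration))       # 11 hours:10 minutes
print("Per day (2 days): ", hm(min_duration // 2))  # 5 hours:35 minutes
```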

If you had a second worker (and the necessary tools) and split the work evenly, or divided screen and window work between two people, you could get the job done in a single workday.

Let’s assume you’re on your own and don’t intend to work an eleven-hour day doing nothing but windows. You could “chunk” the work and spread it over as many days as you have available before the in-laws arrive. Of course, each day increases the risk (weather, family emergency, whatever) that you won’t finish in time.

Let’s assume you decide to spread the work over two days, striking a minimal balance between your physical stress and your risk of bad weather.

Planned Work Duration: 2 DAYS

SCHEDULE

You block out about five and a half hours on each of two days in your calendar.

There’s your simple work plan. I’ll take up tracking and managing work plans in later posts.

It took about one hour to plan eleven hours of work. You might not bother to plan a task this small, but I wanted to illustrate the work-planning process. When you are planning a project measured in workweeks, months, or even years, plan to allocate a proportionate amount of effort and time to the plan itself.

Plan the planning process.
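To make “proportionate” concrete, you might scale the ratio from this example: about one hour of planning per eleven hours of work. A rough sketch, with a hypothetical project size (the ratio is illustrative, not an industry standard):

```python
# Rule of thumb drawn from the example above: ~1 planning hour
# per ~11 work hours. Illustrative only, not a standard.
PLANNING_RATIO = 1 / 11

work_hours = 400  # hypothetical project size
planning_hours = work_hours * PLANNING_RATIO
print(f"Budget roughly {planning_hours:.0f} hours for planning")  # ~36
```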

There are a few key points here to which I will return in future posts:

Effort and duration are both measures of work in time, but they are different. I’ve seen too many project plans that confuse effort and duration — and in doing so they often overestimate the effort and underestimate the duration.
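Here’s a minimal sketch of the distinction, reusing the window example. Effort is person-time spent working; duration is elapsed clock time. The second worker, and the assumption that setup and soap time doesn’t divide between workers, are hypothetical:

```python
# Effort is person-time spent working; duration is elapsed clock time.
effort = 550     # person-minutes of window/screen work (from above)
overhead = 120   # setup + soap changes; assumed here not to split

for workers in (1, 2):
    duration = effort / workers + overhead
    print(f"{workers} worker(s): effort = {effort} person-minutes, "
          f"duration ~ {duration:.0f} clock-minutes")
# 1 worker(s): duration ~ 670 clock-minutes
# 2 worker(s): duration ~ 395 clock-minutes
```

Note that adding a worker shrinks the duration, not the effort; plans that treat the two as interchangeable go wrong in both directions.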

As the saying goes, “Size matters!” You can’t effectively estimate effort without considering “how big,” “how much,” or “how many” things the work will address, whether those things are products (like cleaned windows) or services (like cleaning windows).

When considering work size, you also need to consider work complexity—both affect effort and duration. I’ll elaborate on work size and take up work complexity in later posts.
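One common way to fold complexity into a size-based estimate is a per-item multiplier. The categories, multipliers, and counts below are invented for illustration, not a formula I’m prescribing:

```python
# Size x base rate x complexity multiplier (all numbers invented).
base_minutes_per_window = 20
complexity = {"ground floor": 1.0, "second story": 1.5}

inventory = [("ground floor", 8), ("second story", 2)]
effort = sum(base_minutes_per_window * complexity[kind] * count
             for kind, count in inventory)
print(f"Estimated effort: {effort:.0f} minutes")  # 220 minutes
```

Estimates like these are only as good as the size and complexity assumptions behind them.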

In Praise of My “Legacy” iMac

A roundup of festivities on the 30th anniversary of the Apple Mac—L.A. Times—JAN. 24, 2014

Newer is not necessarily better.

I have been a Mac user since the first Macintosh appeared in 1984—with its tiny monochrome screen, its “whopping” 128K (1K being a thousand characters) of memory, and a micro-diskette drive for disks that were not floppy but compact and rugged and could store 400K—and it sported that new thing, the mouse.

I had not previously been tempted by personal computers, with their command lines, textual interfaces, and lack of any capabilities not already better satisfied by the powerful mainframe computers I used in my work. Then I saw the ads, the graphics, and the point-and-click menus. I was hooked.

I never did much with that first Mac except write on it, but I loved its WYSIWYG (what you see is what you get) display and its ability to change formats and fonts without embedding any cryptic codes. I fancied myself a “desktop publisher”.

Then, over the years, Macs evolved. Their memories expanded, augmented by internal hard drives and micro-diskettes with greater storage capacity. I recall being astounded by my first internal drive that could store a megabyte of data—a million characters! It could hold a book!

Mac cases changed—repeatedly—growing and shrinking—screens lit up with color and grew in size. The microchip revolution was raging and each Mac could hold more data and process it faster—but all that was also true of Mac’s rivals running Windows.  

I remained a loyal Apple/Mac customer because it was easier to use, and its basic applications—for mail, calendars, and then music (iTunes)—delighted me and made me more productive at home. At work, I still used Windows computers. They were difficult and cumbersome, requiring a support infrastructure to keep them maintained and operational. I had no trouble keeping my Macs humming along.

What ultimately hooked me was the rollout of desktop video editing. I had experimented with 16mm filmmaking as a youth and never pursued it because it was too expensive—for cameras, lenses, editing benches, stock and processing. I was unimpressed by VHS video, even as the cameras became more sophisticated and less expensive.

Two things happened around the same time—affordable feature-rich digital video prosumer camcorders and Apple’s Final Cut Pro (FCP) desktop editing tool. Now, I could shoot exquisite video and edit it on my desktop. I renewed my vow of customer loyalty to Apple.

FCP and the Mac co-evolved until the application became a virtual studio suite—Final Cut Studio—and the desktop computer became a self-contained 27” high-definition screen. My early experiments with digital video—short films—evolved into a growing production business and my second career.

I had everything I needed on my desk to write screenplays, to edit, finish and distribute High Definition video—all integrated seamlessly with the Mail, iCal, Office, and Quicken apps I used to run my business. I was a one-man band and studio, and loved it.

It all flowered for me in Final Cut Studio 3 and Final Cut Pro 7 running on Mac OS X 10.6.8—my beloved Snow Leopard operating system.

Alas, then Steve Jobs departed and Apple became a phone and media company. First, Apple betrayed its loyal base of Final Cut users, who had been integral to the Mac’s success, ceasing to support FCP and replacing it with a jazzed-up iMovie that they audaciously call FCPX. The “replacement” lacked key features and required a completely new learning curve. Apple’s loss was Adobe’s gain as video producers and editors migrated en masse to Adobe Premiere and Adobe Creative Suite. I stayed gamely with FCP, even as Apple ceased to support it. It still worked and still did everything I needed—until it didn’t.

Apple’s Mac architecture and operating systems continued to evolve, and at some point FCP would not run anymore, so I stayed with Snow Leopard and eschewed system upgrades on my studio Mac.

When that Mac died, I replaced it with a refurbished vintage iMac that would run FCP under Snow Leopard—and I have done so several times since. Every day I back up my disk image onto an external drive. When the computer dies, I replace it in kind and resurrect my familiar work environment from the backup.

I’m not a born-again Luddite. I still buy new computers. I have a newer iMac running the current Apple OS, mainly because other applications I use—like Chrome—no longer run on my old platform. I like Mac OS less with every release—it gets less accessible, requires more machine to operate, and runs more slowly.

My old Apple Mail program is no longer reliable, but I find the current version on my “new” iMac lacks features I’ve come to require. So, I’ve switched to Web Mail. The new version of iCal lacks features I depend on for time management—like an integrated To-Do list that lets me prioritize and sequence tasks before I commit them to the calendar. I still use iCal on my trusty “old” Mac.

I recently purchased a Windows 10—yes, Windows—laptop because it’s a better buy than a Mac laptop, runs faster, and seems to require less support than the older Windows machines I used in my corporate IT days. My next personal desktop may not be a Mac, for the same reasons.

Like corporations that must keep old mission-critical business systems running in real or simulated “legacy” environments, so do I—for writing, video work, and trusty old iCal on my Legacy iMac. As long as I can find a Mac that will run it, I’ll keep the legacy alive.

All is Process

There is no such thing as “you.” There are no things—only processes.


What we think of as things are actually our mental snapshots of processes that are continuously unfolding in time.

A rock seems to be a thing because it seems unchanging—it’s actually “rocking” along very slowly through its own life cycle as a small sub-process within larger geological processes.

Heraclitus, a pre-Socratic philosopher, is most often cited for his maxim, “No man ever steps in the same river twice.” It’s probably one of the Ur-philosophical memes of western civilization. You’ve read it or heard it from philosophers, gurus, and your stoner friends. Maybe you’ve even said it yourself.

Heraclitus used “river” to illustrate the primacy of process—ever-changing process. That “man” stepping into the river is a process too. The next time that “man” steps in that “river” neither will be the same. Process—“everything”— is always irreversibly changing—subject to “time’s one-way arrow.” All is change and motion.

You are a process. As Buckminster Fuller famously said, “I seem to be a verb.” We are all verbs.

You are a single instance of a human being—a process with an approximate cycle time of 80 years—from conception through death. Like every process, “you” are always changing—you are not the same physical or mental human being that began reading this post. You change, in countless ways, on every level of your being, every moment. Being human is the most complex process in the known cosmos—worthy of profound awe and respect.

There is one primary parent process in our reality—the cosmos itself. It emerged from a single point as a big bang that’s banging still and will be through its own cosmic life cycle, which physicists are still trying to figure out.

Every phenomenon within this universe—space, time, matter, gravity, galaxies, suns, planets, life, you—is a sub-process that has emerged from and within that ongoing explosion. Sub-processes emerge and unfold within parent processes. They are nested like Russian dolls.

You are nested within your family process, which is nested within human processes—political, socio-economic, and cultural—that are nested within myriad levels of Earth process—and so it goes up to the cosmos—the BIG DOLL. It’s not neat and hierarchical—nature loves networks. Processes interact and overlap in complex webs of relationship. A process may have many parents, siblings, and relationships, and likely has sub-processes nested within it—just like you do.

Yeah, this is “deep stuff,” worthy of reflection, and it’s all been said before, one way or another. So, what’s my point here? What’s your takeaway as a human being, a systems analyst, engineer, manager—or whatever?

Process is primary; systems are not.

Process antecedes, precedes, and supersedes any system we impose upon it. A guru once observed that “All systems are foolish.” In the sense that we think systems can actually control process, that may be true.

Every thing is a sub-process and unfolds within a network of related processes.

This is—or should be— one of your primary axioms as you try to understand any process—anything. Nothing, no process, exists in a vacuum.

Don’t make the mistake of thinking you can understand any process in isolation—yourself or any person, any “thing”, any situation, any natural process or any human system.

You hear much lately about how “Content is king.” I would argue that “Context is king.”

How many people are dysfunctional because their personal and interpersonal processes are askew or not synching?

How many relationships have you seen fail because one or both parties failed to comprehend the family, social, or cultural processes within which they emerged? Those parent processes shaped them and—like it or not—still operate within them.

How many business ventures fail because entrepreneurs fail to understand the community, market, or legal system within which the venture must operate?

How many political campaigns fail for the same reasons?

Throughout my IT career, I saw numerous software projects fail because they were developed without due consideration, understanding, and integration of the project’s process context. The software process used in development, the business process being automated, a parent application to be integrated, an operating system, a network, the sponsor, the regulatory, social, and economic systems in which it must operate—all need to be considered in a successful software project—or software venture.

This may seem like common sense and, in my experience, it’s anything but. What’s your take? Leave a comment . . .


Revolutions However Known

Systems, Process, and the English Major

In 1968, armed with a seemingly “useless” English degree, I was swept up and away into the communication, computer, and information revolutions that have since rocked the world, and are still rocking it at full throttle. In “the Revolution” (however known), I served as foot soldier and mercenary, and led troops into battle as a program director, project manager, and team leader. This is how it happened.

By 1968, demand had spiked for people who could program and manage computer systems and telecom networks, information technology (IT) industries so new that there wasn’t yet an education infrastructure to supply them with trained workers. IT organizations had to grow their own first generation of programmer analysts.

Supply and demand intersected this story at a Bell telephone company that trained this “English major” (EM) first as a network manager and then as a computer programmer analyst. The EM learned a new language—assembly code—a cryptic second-generation language for programming third-generation computers. Those huge, amazing machines could do (or seem to do) multiple things at the same time. They read and wrote large volumes of data on magnetic tape drives as big as refrigerators and disk drives the size of washing machines. He wrote his first programs on cardboard punched cards—one card per instruction.

Within five years, the English Major thought things had come nearly full circle. Now he typed code on a TV-like display terminal, writing in a third-generation “procedural” language—COBOL—that was essentially structured English. Another computer program translated “high-level” COBOL into the assembly code he’d first learned to write, so that the computer could execute it. That was a programming-language revolution—from machine code to English—and a system/IT revolution—software-compiled translation from English into machine code. These were early fruits of “the Revolution” that was changing—everything.

Over the next seven years, the English Major surfed the revolutionary waves across various emerging technologies—timeshare networks (computer time, not condos), parallel programming with multi-tasking, and minicomputers. He ended the ’70s with a sabbatical year and an extended meditation retreat at an ashram in India. “On the cushion,” he experienced new insights—that “everything is a verb”—a process; that every system is an attempt to control some process; that every process (or system) is actually a sub-process—and that includes computer programming, himself, and his mind. In fact, his mind seemed to have a lot in common with software.

That sabbatical was excellent priming for two simultaneous appointments—one full-time as a lead analyst at Harvard University’s Office of Information Technology and one part-time as course designer and instructor at a post-graduate systems training school.

Mornings, he would try to teach liberal arts graduates (EM newbies) to think like computers in structured, logical, and relational terms.

The rest of the day, across the river at Harvard, he researched and prototyped new fruits of the Revolution: microcomputer systems, relational databases, structured development methodologies, fourth-generation (non-procedural) programming languages, and expert systems—a step toward AI. He worked on a “virtual machine” in a networked system environment that extended off campus into something called the Advanced Research Projects Agency Network (ARPANET), which turned out to be a precursor to our Internet.

At night, as he planned the next morning’s lectures, the EM would reflect and ruminate upon the explosive progress of the new technologies that were inundating the culture—and his own mind. He saw human software process lagging far behind technology. He tried to render all this in relevant terms and concepts he could use to teach. One student observed, “You’re not just teaching us to program computers; you’re teaching us how to think—about everything!” He saw the graphs of revolutionary change accelerating, converging, and veering toward the vertical, and told his students (in 1983) that he could almost imagine how things might change until sometime in the ’90s. Beyond that, in the new millennium, he said, “It’ll be science fiction.” He was right about that, and here we are “in that future”—supercomputers, pocket computers, smartphones, and the World Wide Web.

Bursting with ideas, the English Major set out as a systems-samurai consultant for ten years to apply some of what he’d learned at Harvard and in teaching. He used his fourth-generation programming and relational-database expertise to lead RAD (rapid application development) projects. The projects included systems that assisted AT&T through divestiture of its Bell System companies; a metadata-driven COBOL code generator; a computer-assisted budgeting process for a Federal Reserve Bank; and a system for translating and migrating fourth-generation programming code between Unix and IBM VM operating environments. The unifying theme in his work was visualizing software development as a process and automating—or at least computer-assisting—that process.

Software process and software process improvement were his sole focus in the final decade of the English Major’s IT career. He let technology—the “what” of IT—rocket past and beyond him while he focused on the “how.” He hadn’t given up on computer-generated software but realized its ultimate fulfillment lay with rapidly evolving AI.

Meanwhile, he saw an IT industry beset and burdened with development and support of an ever-growing inventory of human-programmed systems. He became intrigued with software process as a means to “program the programmers and software managers.”  

The English Major became an expert in the Software Engineering Institute’s Capability Maturity Model (CMM) and the Six Sigma Black Belt process for quantitative process improvement. He spearheaded one IT department within his global-financial-services employer in its rapid evolution to CMM Level 5, the highest capability-maturity rating—the first American group in the finance sector to earn that distinction. His “reward” was the disbandment of that elite department and his own early retirement—and that’s a story for another post.

Dispirited and unwilling to return to corporate IT trench warfare, the English Major pursued a second career in writing and video production, where he still employs some “software” process best practices to improve productivity and quality assurance in digital media management and production. A process is a process. Making a movie—even developing a screenplay—shares much common process with developing a software product; they are all Intellectual Properties (IP), and they all benefit from some of the same best practices.

He’s never stopped thinking and journaling about systems and process, not the technology so much as the social science, psychology, and philosophy—and the diverse opportunities and potential benefits of “process thinking” both inside and outside the IT industry. He still reflects and ruminates on systems and process at night, and still dreams about adventures and misadventures in his IT career.

Now, the English Major has embarked on a third career—“coming home” to write about what he’s learned—as a blogger, copywriter, and author. He intends to focus on the human side of the software process more than the technology. The English Major plans to keep it simple and hopes to entertain, inform, and even inspire readers within and outside IT.