(Marie-) Arzel was gracious enough to give me the green light to cross-post something from her chef-oriented blog. Power to the People…with Passion!
Btw: If you take the time to read the “bitter” post, it’s worth your while to check out the more upbeat, and very practical, stuff that keeps it company. The fact that she can rebound from a dented dream to craft occasionally dreamy, occasionally hard-nosed prose for others is only part of what makes her amazing.
At work, the conferencing software’s reach definitely exceeds its grasp. This was ably demonstrated on Monday’s “Area Staff Meeting,” originating in the Chicago office and streamed out to the provincials in the region. What you need to understand about our office’s “big conference room” is that: 1.) “Big” is a relative designation, and 2.) It was designed when a mere handful of old-timers were movin’ on up to that dee-luxe office suite in the sky-hi-hi.
Cut to 2011, and we’re spilling out into the adjacent breakroom, with the latecomers bringing chairs in tow. Which means that, to someone in the back, what comes out of the conference-call speakers is more than slightly reminiscent of the “Wha-wha-wha-wha-wha…” of Charlie Brown’s teacher. I suppose that could be a boon for anyone needing to sneak in some writing under the cover of note-taking. Except that’s far too productive. See, I figure that when a company goes to such great lengths to waste your time, the only responsible response is to take ownership of such wastage. Maybe—dare I suggest?—even profit from it.
Naturally, I mean having the foresight to set up a betting pool, the winner being whoever most accurately predicts:
- Number of minutes between the official start time of the conference and its actual start
- Number of inside jokes that only the “host” office understands
- Number of times the video connection freezes or freaks out
- Number of times one or both ends of the voice connection drop
- Number of phone calls taken or hushed among the Powers That Be
- Number of PowerPoint slides that contain the word “vision,” “opportunity,” or “strategic”
- Number of remote workers dialing into the call who forget to mute their end of the connection
- Whether or not corporate I/T will push out a Windows update that requires a reboot in the middle of the presentation
(Belatedly, it also occurs to me that the above could be trivially adapted to a college drinking game. Uh-oh. Needless to say, I won’t be mentioning that to my co-workers: Given how our “keeper”—and I mean that in a good way—no longer bothers to lock the liquor cabinet, that could be exceedingly bad.)
An NPR segment—to which Dennis & I were tuned in on the way to see The Ladykins on Sunday—ended with a clip from Lady Gaga’s “Born This Way” single. Serious as the segment’s topic had been, I (naturally) couldn’t help but smirk, thinking of Weird Al’s send-up.
Exposing yourself to parody is, doubtless, a mark of character—but it’s also a bizarre badge of honor in the music industry…at least to my way of thinking. But Dennis made a more sage point when he wondered aloud, “How many of the people he’s lampooned are here and gone, and he’s still around?”
Ouch. For someone with pretensions to being a “content creator,” that’s more than a little sobering. (That despite the poetry/song filk that’s whiled away any number of my Frivolous Friday evenings.) But in the unlikely event that writing superstardom awaits your faithful blogger, that’s one problem worth having.
I’m finding comedy works best to keep me from dwelling on how much longer I’m going to be on an elliptical machine, treadmill or what-have-you. So earlier this week it was an old standby, Office Space. (If you haven’t seen it, suffice it to say that it’s sort of a cult classic for programmers.) Coincidentally, this was also the same week that someone decided to riff on one of the movie’s plot-points and steal a fellow programmer’s red Swingline stapler. Twice.
I polished off Office Space and turned back to Monty Python and the Holy Grail, riffs from which are unavoidable in the SCA. That’d be like trying to play golf without at least one wink-wink-nudge-nudge reference to Caddyshack.
Bad enough that Dennis & I have already trained each other to phrase “or”-type questions (as in, “Do you want four cheese or meat-lover’s supreme?”) without expecting the answer to be an obligatory “Yes.” Or “True” or “1” if someone’s feeling exceptionally nerdy. But then I made the mistake of remembering the phrase, “Darmok and Jalad at Tanagra.”
(If you’re not a Star Trek: The Next Generation maven, here’s the schtick: The Federation has bumped into (yet another) alien race that (surprise!) just happens to be recognizably humanoid. Moreover, the Universal Translator can even babblefish—yes, I just used that as a verb—their language into English words. Problem is, it still doesn’t make sense, because the Tamarians exclusively communicate allegorically—meaning through references to stories from their history. Think of it as tribal knowledge on steroids.)
At first I thought, well, we’re not that bad. But then I realized—particularly after being chagrined at how much of the “Brave, Brave Sir Robin” song I’ve forgotten—that any nerdery is a continuum. Meep! Ummm…how many restroom stops until Tanagra?
Maybe it’s that I’ve been reading too much non-history, non-fiction lately. Or maybe the topics are just too…shall we say…inbred. But I’ve bumped into enough mentions of a game called “Ultimatum” that it’s stuck with me. The word “game” is a misnomer, at least in the sense that Ultimatum is nothing you’ll find keeping Monopoly company on the closet shelf. It’s actually played by those who’ve volunteered for psychological studies in universities and other institutions that study human interactions in factor-controlled circumstances.
The basic premise starts with two people. Person A is given a fixed dollar amount (usually ten bones in the cited examples) to be split with the other person. There is no negotiation—Person A makes a take-it-or-leave-it offer for Person B. The catch is that if Person B refuses, each person receives zero.
The classical economics they teach you in high school and college would predict that even if Person A offered Person B one cent and kept the remaining $9.99 for her/himself, Person B would still have a penny more than s/he had before, and would therefore accept. Because something is always better than nothing, riiiight???
As it turns out, capital-R reality doesn’t exist to fulfill the premises of classical economic thought any more than it does, say, story problems in Math. Because the ultimate result was that a low-ball offer basically meant that Person B had very little to lose, either. And, maybe it’s just because the Puritans gained such an early toe-hold in the American psyche, but the impulse to punish high-handed greed is fairly strong, too. In practice, 50-50, 60-40, and even 70-30 splits had a fairly high likelihood of being accepted. But once a threshold of “unfairness” was crossed…not so much.
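For the code-minded, the whole setup fits in a few lines. This is just a sketch of the game’s payoff logic, not a model of the actual studies; the 30% “fairness threshold” below is my own assumed parameter, loosely inspired by the 70-30 splits mentioned above, not an empirically calibrated figure.

```python
# Sketch of one round of the Ultimatum game.
# `threshold` is an assumed fairness cutoff: the fraction of the
# pot below which Person B would rather torpedo the deal.

def ultimatum_round(pot, offer, threshold=0.3):
    """Person A keeps (pot - offer) and offers `offer` to Person B.
    Person B accepts only if the offer clears their fairness
    threshold; otherwise both walk away with nothing."""
    if offer >= threshold * pot:
        return (pot - offer, offer)  # (A's payoff, B's payoff)
    return (0, 0)                    # spite beats a penny

# Classical economics predicts any positive offer gets accepted.
# The lab results say otherwise:
print(ultimatum_round(10.00, 5.00))  # 50-50 split: accepted
print(ultimatum_round(10.00, 0.01))  # one-cent offer: both get zero
```

The interesting part is exactly the gap between the two print lines: the theory says Person B should pocket the penny, while the refusal branch is what actually fires once the offer drops far enough.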
The metaphor to the current state of the U.S. economy (and political state) seems all-too-obvious…
- As banks sit on hundreds of billions of dollars of bailout-backed credit
- As corporations hoard even more than that in profits, waiting for someone else to create the jobs…and demand for their own product
- As pay is not so much a fraction as it is a logarithmic base of productivity
- As the cost of a college degree rises in tandem with offshoring and union-busting
- As pernicious unemployment and foreclosure rates undermine consumer confidence…and spending
- As we expect an entrepreneurial “creative class” to spontaneously emerge from generations taught to standardized tests
- As gerrymandering and astro-turfing polarize the electoral landscape
- As the concentration of wealth into a shrinking pool of bank accounts further tilts the political and legal table
Remind me again, what’s the point in earning good grades, putting in your 40 hours, paying your taxes, financing your upward mobility, investing for your long-term financial security, voting on schedule, etc.? At some point, the Social Contract has to be a win-win, rather than the game of Ultimatum that it’s rapidly becoming.
If, en masse, the American worker/consumer walks away from the deal, it might actually be good on some levels. Among them: decreasing personal debt, a mom-n-pop entrepreneurial boom, and maybe—just maybe—an increased focus on quality of life. But apart from that…boom. The pity is that those who play the role of Person A in this “game” will not walk away with nothing. At best, they’ll be less rich. Once again demonstrating how freakishly carefully-controlled lab results can mutate in the wild.
Slashdot today ran a piece about the U.S. Government paying its own programmers half the going rate for contract programmers. The comments, at least early-on when I read them, tended to focus on the premise that contract programmers are paid extra to, well, go away on short notice. (I’ve worked as a “temp”—high-tech flunkie as well as office minion—and, frankly, I have no idea where that notion comes from, at least not if a temp. agency is involved.)
Me, I’d tend to place the discrepancy at the intersection of hiring freezes and the spend-it-or-have-your-budget-slashed-next-year school of fiscal “management” that I’ve seen in the private sector as well.
But speculation, however plausibly grounded in past experience, is not the point. Combine nerdy quirkiness and stupefying levels of through-the-looking-glass bureaucratic “logic,” and the reasons could well fall outside the pale of our workaday norms. The most likely of those, to my way of thinking, include:
1.) Well, duh: People from the outside are always smarter
2.) Legendary public sector “job security” includes cubicle in lead-lined bunker and cryogenic suspension in the event of thermonuclear Armageddon
3.) Pay comparison doesn’t take into account standard government-issue solid gold laptops
4.) Coders willing to take lower pay to develop “secret government technology” cachet irresistible to fellow geeks of the preferred gender
5.) Pay differences easily offset by illegal kickbacks from soda and energy drink vendors
6.) Former college interns didn’t notice the “indentured servitude” clause in their NDAs
7.) Government I/T departments are the digital tar-pits where old COBOL and VB6 programmers go to die
8.) Uncle Sam’s coders are rented out as cheap off-planet labor for our secret extraterrestrial allies—and neural implants don’t come cheap, you know
9.) Daily flogging and haranguing by Grover Norquist & Tea Party to destroy self-worth
10.) Once-in-a-lifetime chance to hack Andrews Air Force Base and take Air Force One out for a joyride
Those Who Know Best asked me to train our Client Services folks on “my” application. Cross-pollination, to be sure—just more in the sense of folks in lab coats and latex gloves brushing pollen off a carefully selected plant and brushing it onto an equally carefully selected other plant.
But those were the extent of the specifications, leaving me to fill in the details. Which, naturally, involved bribes with food, wine, chocolate, and randomly sorting the competing teams into their Hogwarts houses. That was to make up for the pre-class quiz that they were really good sports about. The first half assembled in the big conference room yesterday for the actual hands-on session.
No worries…I was ready with easily two hours of material to cover, during which Hufflepuff, Ravenclaw, Slytherin and Gryffindor would take turns at the console doing actual client-type stuff on a test system. In my experience, that lends itself to questions far more than having features demonstrated to you.
What actually happened was that we quickly realized that the way they support the clients on their application is not at all how I support mine. Most notably, when there’s a problem, I’m generally sticking my head straight into the database itself. Client Services, on the other hand, relies on the interface. Partly because those tools have been built for them all these years, and partly because some don’t have the software, or a knowledge of SQL (Structured Query Language), much less any idea of how the data fits together. Some, particularly the most senior folks, do, and I had made the shaky assumption that those skills were acquired by the usual on-the-job organizational osmosis.
Wrong assumption, obviously. Which, for anyone presenting, just might trigger a freak-out, because the agenda has suddenly evaporated. Which normally means pulling the plug on the whole thing or completely free-wheeling. Both are valuable meeting skills. But then the questions started flying thick as, for lack of a fresher phrase, two worlds collided.
And you know what? It was straight awesome. The balance of the two hours zipped by as I was grilled and in turn tried to get into their heads. Sure, occasionally we’d dip into the software to illustrate something. But for the most part it was meta-information: What the overall client relationships are like, some of the frustrations of working in a distributed development environment (instead of the one-stop-geek that is me), what the process is like on the client side. Those kinds of things.
I’d do it all over again…and I may just have that chance when I work with the second crew a week from tomorrow. I can only look forward to the instructive chaos that will bring.
Company’s coming for at least part of the weekend. Hope yours is a good one.
We had an interesting bit of “training” over the lunch hour today. One of the deep-thinkers wired himself into our large conference room via two-way webcam and unloaded a couple decades of experience on us, which included the pendulum-swings between centralized and distributed computing fads, and also the dead-wrong predictions/assumptions committed by even the most forward-thinking of the technorati.
For me, the money-quote was the prediction of an “information economy.” Our guest re-cast that instead as an “attention economy,” on the premise that information is only valuable if someone reads/views/hears (and, I would add, acts upon) it. Our colleague also theorized about our obsession with glowing rectangles (phones, tablets), and the apparent necessity of maxing out our attention bandwidth when it’s not satisfied with the work and people and general doings around us.
Those two notions (attention economy and voluntary information saturation) kind of meshed into the notion that, in terms of classical economics, we’re voluntarily debasing our own currency. (Most especially when those brain-CPUs are in paparazzi or “Farmville” spaces.) I suppose it wouldn’t be a big deal if Moore’s Law and the general premises of computing applied to the think-meat between our ears. Presumably then we could evolve to a state where our internal process monitors looked something like:
30% - Curing Disease
30% - Ending Poverty & Injustice
30% - Saving the Planet
0.0001% - How long are those eggs in the ‘fridge okay after their expiration date?
9.9999% - OOOOH—SPARKLY BALL OF TIN FOIL!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Me, I’m not holding my breath. But in the absence of unprecedented rates of brain evolution (or neural augmentation), our techniques for managing divided attentions will have to evolve to take up the slack. And, sadly, I’m not holding out any more hope for that, either. Not after seeing how stubbornly mainstream corporate culture invests in tired carrot-and-stick paradigms, years of disconnect between worker productivity and pay notwithstanding. Sigh.
But such cognitive fragmentation is something we need to start acknowledging in our work lives—and devising coping strategies for its corrosiveness. Particularly in my profession, where one is expected to toggle between blinders-on, deep-dive focus and collaborative brain-pooling in such an immediate and binary fashion. Anything less is living in denial. And, in the long run, the cost of living in that zip code is higher than anywhere else on earth.