Gripes and Grumbles

Dancing As Fast As I Can…

Well, I’m doing an absolutely miserable job of blogging these days, aren’t I? I’ll be honest, I’m feeling pretty discouraged about the whole damn thing right now. Maybe I went too long without doing it while the server was out of commission, or maybe chores and life and work have expanded to fill in the spaces blogging used to occupy. Whatever the reason, there just doesn’t seem to be enough time in the day for everything I need and want to do, and I’m once again struggling with a huge load of anxiety because I can’t get on top of it all. Even when I do find the time to pay attention to this little hobby — usually late at night, after Anne’s gone to bed — I can’t focus and I end up flailing away on the same paragraph for 20 minutes, unable to articulate whatever the hell it is I’m trying to say, and then I give up in disgust and self-loathing, remembering how the words used to flow so effortlessly and at such volume, I feared I’d never be able to get them all down. Now I fear the spigot has been shut off and I can’t find a wrench to re-open it. I hate feeling like this… constantly busy but with nothing to show for it, everything melting day to day into an undefined blur. Feeling like I never manage to finish or accomplish anything. Hell, I have four friends waiting on replies to emails they sent me days (weeks) ago, and I can’t even manage to do that. And we won’t even speak of my long-dormant ambitions to write things other than blog entries.

Gaaah.

Anyhow, if anybody is still bothering to follow this blog, my apologies for letting you down in the content department. I have a couple draft entries in the works that I hope to finish and post soon, and of course I have lots of ideas for things I’d like to do here. Whether or not I ever actually do them…

For right now, for this afternoon, the best I can offer you is this momentary diversion:

That Artoo… the little bastard can do anything, can’t he?


Ever Have One of Those Days?

I seem to be extra surly today, owing to too much work for the allotted number of hours in my day, and a threatening headache that’s lurking just on the fringes of my perception. Apologies to anyone who’s found out the hard way.


It’s Bloody Cold, and I’ve Had Enough of It

January is a hard month in Utah. That’s when The Inversions come. The time when the world loses all its color and turns gray and filthy and indistinct. When the horizon seems to shrivel down and attach itself to buildings and trees and lampposts, like leathery skin with no flesh beneath it adhering to the bones of an ancient, starving man. In January, when The Inversions come, the world becomes small and hard… and very, very cold.

The Inversions. No, they’re not ethereal, soul-sucking monsters straight out of a Harry Potter novel, but they’re pretty damn close in my estimation. I used to tolerate them fairly well. But that was BD, before diagnosis. Things are different now, and January is much harder for me than it used to be. But I’ll get to that.

For my out-of-state readers who may be wondering what in the hell I’m on about today, I ought to explain that “the Inversions” — formally known as temperature inversions — are an annual phenomenon brought on by a quirk of the local climate and geography where I live. Essentially what happens is that, during the winter months, the air near the ground becomes stagnant and cools off, while the air higher up in the atmosphere remains warm, which is of course the opposite of how things normally are. Normally, wind currents would mix the two temperature zones up, but remember that the bottom layer is stagnant; there are no winds to speak of during this time of year, much like the doldrums sailors experience near the Equator. And so the cold air stays in place for days or even weeks at a stretch. And it gets very, very cold during these periods… damn cold. As in “your tauntaun will freeze before you reach the first marker” cold.

But wait, it gets worse.

Utah is a vast place, but believe it or not, most of it is uninhabited. Some 80% of this state’s population is crowded into a narrow strip of land called the Wasatch Front, which runs roughly 80 miles from Brigham City on the northern end to Santaquin in the south, with the state’s three largest cities — Salt Lake, Ogden and Provo — and their sprawling suburbs right in the middle. The Front is bounded on two sides by mountain ranges, so all these cities essentially lie at the bottom of a gigantic bowl. (Well, it’s shaped more like a trough, but the bowl image is a bit more illustrative for my purposes here.) Now picture this bowl filled with over two million people who are all driving cars and consuming electricity and trying to stay warm. Naturally, these activities all generate air pollution. And that layer of warm air up in the sky during an inversion is like a lid sitting on top of the bowl, holding down not only the temperature, but also all that airborne pollution generated inside the bowl. Exhaust from cars and power plants, smoke from fireplaces, god knows what from refineries and smelters and factories… it all lingers here in the valley during an inversion, growing more and more concentrated day by day until a storm front finally comes through and the savior winds punch a hole in that giant invisible Tupperware seal and drive all the frigid, mucky air away.

The Inversions have been a fact of life around here as long as I can remember, but they’ve been especially bad this year. An article in the Salt Lake Tribune last week noted that four of the five places with the worst current air quality in the entire country are right here in Utah, and three of those four locations are along the Wasatch Front. Doctors are warning of increasing danger to even healthy adults, in addition to the elderly and children they’re usually concerned about, and there’s a growing chorus of voices demanding that our politicians do something about it. But I don’t need newspapers to tell me what I see with my own eyes every time I look out the windows at work. From my offices on the 13th Floor, the Wasatch Mountains on the east side of the valley ought to appear close enough to touch. But for the past week, the mountains have been utterly invisible behind a grey scrim, and even the spires of Salt Lake’s Cathedral of the Madeleine, only a couple blocks away from my building, are mere shadows in the mist.

As worrisome as it is to be breathing filth, though, it’s really the cold that’s troubling me. It never used to, particularly. Oh, I didn’t like the cold, but I tolerated it quite well. I remember a time when I felt perfectly comfortable wearing only a t-shirt and a leather jacket. No more, though. I mentioned a while back that something has changed in my body over the past year and I no longer “run hot” the way I used to; I don’t know if it’s something to do with diabetes, a side effect of the medications I’m taking, or the result of losing a lot of weight and/or lowering my blood pressure. Whatever it is, these days I’m wearing long johns, layered shirts, and a cardigan underneath a goose-down parka, and I still feel chilly. Even when I’m indoors. Granted, it probably doesn’t help that my desk at work is located in a bump-out that sticks out the side of the building and is surrounded on three sides by glass; I would guess all those windows radiate heat into the cold air outside pretty efficiently, making it difficult to keep my area warm. Or it could just be my own perception. But whatever the explanation, I notice the cold settling over me as I sit at my desk, flowing across my arms and the tops of my thighs, and sinking into my fingers so the joints stiffen up and begin to ache. Lately I’ve been imagining myself as the unfortunate chap in the image above… immobilized beneath a rime of frost, waiting for a spring that seems as if it’s never going to come.

I hate it. I hate every miserable moment of it, feeling like I’ve grown weaker in some fundamental way, even though I’m in fact healthier than I was a year ago at this same time.

I finally understand why my dad has long fantasized about going to Hawaii during the winter months. I’ve been dreaming lately about heading south myself… along with all the other senior citizens who wear their sweaters year-round. And I hate that too. For someone who’s been fretting about getting old for a long time anyhow, this new development does not help the ego…


Aurora

Not long after 9/11, a friend of mine asked me how I could still be remotely enthusiastic about the then-upcoming Spider-Man movie, or superhero stories in general. He was certain the entire genre was doomed — or at least its current cycle of popularity was — because they cut too close to the bone in their frequent depictions of apocalyptic events so similar to the ones we’d just witnessed in real life. Surely, he thought, audiences would no longer have the stomach for fantasies of this sort when it had just been so forcibly demonstrated to everyone that there really aren’t any mutants or god-like aliens or obsessive rich guys in tights who will save us when the towers start to fall.

I countered that people might want those escapist fantasies now more than ever… that superhero stories give us a way to imagine a different outcome to real-life horrors that are nearly impossible to wrap our minds around. To believe, if only for a couple of hours, that we aren’t alone in our moments of greatest danger, that help might still be coming when all the normal institutions and authorities seem powerless to do anything… that maybe we ourselves could make a difference under the right circumstances. I argued that going to a superhero movie in the wake of a catastrophe was a healthy kind of wish fulfillment, a momentary respite from the crushing knowledge that, in the real world, bad things happen and people die, and there’s not a damn thing any of us can do about it.

So what a brutal irony that the latest mass shooting by a whacko lone gunman should take place at a premiere screening of the latest superhero movie. But not just any superhero movie… the latest Batman movie. Batman — a superhero whose back story begins with a very personal incident of urban gun violence, and who, more than any other major character in this idiom, concerns himself with protecting innocent citizens from lunatics who revel in anarchy and chaos for their own sake. While other superheroes are saving the world or even the universe from vast armies or immense cosmic forces, Batman is in the streets, fighting it out on the micro level of individual human lives. Talk about striking close to the bone.

I wasn’t planning to write about last week’s events in Aurora, Colorado, because I figured everyone else would say all there is to say before I got around to it, and pretty much the same things get said every time one of these incidents occurs anyway. (And isn’t it incredibly sad that these things happen often enough that we can anticipate what will be said in the aftermath?) But I find I keep replaying the words of Christopher Nolan, the director of The Dark Knight Rises, in a statement he made following the shootings: “I believe movies are one of the great American art forms and the shared experience of watching a story unfold on screen is an important and joyful pastime. The movie theatre is my home, and the idea that someone would violate that innocent and hopeful place in such an unbearably savage way is devastating to me.”

I can’t say what percentage of my life has been spent in movie theaters (although it might be interesting to know, if there were some way of calculating it). I can tell you, however, that many of my most vivid and pleasurable memories revolve around them. I remember exactly where I saw most of my personal landmark films that came out during my childhood and teen years. My first two jobs were in theaters, first at a neighborhood single-screen movie house where I ran antiquated changeover-style projectors with carbon-arc light sources, then at a modern multiplex where I started tearing tickets and worked my way back into the booth. I went to a movie on my very first car date. (I took a girl named Sheryl to see A View to a Kill… real romantic, eh? She liked the Duran Duran theme song, at least.) My first date with Anne was the night we saw Teenage Mutant Ninja Turtles while she was home from her summer job at the Grand Canyon. Granted, we didn’t go out again for three years afterward, but technically speaking, it was our first date. And when the two of us travel now, it’s not unusual for us to seek out old or interesting theaters in our destination cities and take in a movie while we’re on our vacation, as when we visited the Castro Theatre during our last trip to San Francisco in ’08.

The point is, movie theaters have been a central part of my life for a very long time. To me, the Aurora shootings are as repugnant and, yes, blasphemous as if someone had opened fire in a church on Easter Sunday. (The fact that it happened in a Cinemark theater makes it all the more personal and violating for me, since that’s the chain I used to work for in my multiplex days. I can all too easily imagine what it would have been like, as a naive young usher whose definition of “crisis” was usually no more serious than finding a mop when someone dropped a 44-ounce Coke, to try and deal with a houseful of wounded and panicky patrons.)

And now of course the question is what will happen in response to this heinous attack. Gun-control advocates are calling for tighter restrictions on “assault” weapons (as if there’s any type of gun that doesn’t assault someone when you shoot it at them), while gun lovers are asking why there wasn’t somebody in that theater with a concealed-carry permit and an equalizer under their shirt. The same discussion we have after every mass shooting, in other words, and the results will be the same: the two sides will bicker for a while, repeating the same old arguments over and over again, spinning in tighter and smaller circles until we finally get distracted by something else, and then it’ll all spin out and go away until the next time.

For the record, I’m fairly indifferent to guns. Several of my conservative friends seem to have it in their heads that all liberals want to cross out the Second Amendment and do away with all guns, but this one doesn’t. I much prefer the First and Fourth Amendments, personally, and I cannot imagine myself ever owning any sort of firearm. But I really don’t give a shit if other people own them. The issue just isn’t anything that’s important to me in any meaningful way.

That said, however, I don’t get why anyone thinks they need a military-style rifle like the AR-15 (which, I understand, is a civilian version of the good old M-16 my uncle carried in Vietnam, only without the gizmo that allows for full-on automatic fire), or why it’s so unreasonable to place restrictions on the types of gun and ammunition private citizens can get their hands on, or the quantities. We restrict all sorts of chemical materials and pharmaceuticals because they pose a danger to society when they’re misused, right? So what’s the difference?

Also, I find some of the comments being made about concealed-carry in the Aurora situation downright laughable. When people say “somebody could have made a difference” in Aurora, what they’re really thinking is “I could’ve made a difference.” It’s a superhero fantasy of a different sort — they imagine themselves as John McClane, saving the day and winning the girl. But they forget one salient detail about Bruce Willis’ signature character: he wasn’t just some guy, he was a cop. And in fact, the only real-life instance I know of when somebody with a concealed gun succeeded in stopping one of these whacko shooters was that incident here in Salt Lake’s Trolley Square mall a few years ago, and that concealed-carrier was — surprise! — an off-duty cop. Honestly, I just don’t trust some regular old yahoo with a handgun in his shorts not to shoot me while he’s trying to peg the bad guy. I mean, think about it: a dark movie theater filled with screaming, panicky people trying to escape, with your vision further obscured by the smoke or gas or whatever it was, and the movie still running in the background… do you really think Joe Schmoe, who’s probably taken at most a couple hours of gun safety at the community college, really has the skills to get the job done without causing more collateral damage? Sorry, I’ll buy Norse gods in New Mexico and men of steel from another planet over that one.

But none of that matters, because we know from past experience the gun laws aren’t really going to change as a result of Aurora. My worry, going forward, is that the movie-going experience is going to be forever tainted because of this asshole. Not because I personally am going to be nervous or looking over my shoulder all during the movie, although I’m sure some people will be. No, my concern is that the exhibition industry is going to go bananas and turn theaters into security checkpoints, with metal detectors and armed guards, just like airports and high schools. You want to talk about liberty slipping away, how about the liberty to go to a freaking movie without having to wait in a security line to prove you’re harmless? The truth is, these mass shootings can happen anywhere people gather in numbers greater than two. Today a movie theater, tomorrow a restaurant, or even — why not? — a church. So do we put metal detectors at the entrance to every public space that ever witnesses a violent crime? And even if we don’t go that far, what about smaller, seemingly minor steps that nevertheless lessen the whole experience of going out? Already some theaters are banning the wearing of costumes to premieres, a time-honored, harmless, and fun activity, as if ballistic body armor really looks anything like a Batman suit… or even a Star Wars stormtrooper outfit. And I’m willing to bet that policy will never get revisited, even if 20 years pass without any further problems in a theater. Just like the TSA is never going to be reined in, even though anyone with a lick of sense knows that taking off your shoes at the airport does nothing to make you safer. And we’ll put up with it, we “free and brave” Americans, because we’re scared and we’ll put up with anything if we’re told it’s for our safety.

I hate the 21st century.

Photo credit: AP Photo/The Denver Post, Aaron Ontiveroz, appropriated by me from here.


If You Don’t Explore…

A couple days ago, our esteemed colleague Jaquandor posted a rumination on the decline of science education in this country, hinging his thoughts around a lengthy passage from the book Space Chronicles: Facing the Ultimate Frontier by Neil deGrasse Tyson. If that name doesn’t ring a bell for you, it’s possible his face might: Tyson is an astrophysicist who has hosted a number of PBS series in recent years. He’s also a tireless advocate of space exploration, and, in the view of many, the true heir to Carl Sagan in terms of being able to explain complex and exotic scientific ideas to a popular audience. (It is probably not coincidental that Tyson has been tapped to host a new version of Sagan’s landmark TV series Cosmos.) In any event, it’s worth reading Jaquandor’s entire post, and the full passage he quotes from Space Chronicles, but I’d like to reprint the segment of that passage I found particularly resonant:

There was a day when Americans would construct the tallest buildings, the longest suspension bridges, the longest tunnels, the biggest dams. You might say, “Well, those are just bragging rights.” Yes, they were bragging rights. But more important, they embodied a mission statement about working on the frontier – the technological frontier, the engineering frontier, the intellectual frontier – about going places that had not been visited the day before. When that stops, your infrastructure crumbles.


There’s a lot of talk about China these days. So let’s talk more about it. We keep hearing about ancient Chinese remedies and ancient Chinese inventions. But when do you hear about modern Chinese inventions? Here are some of the things that the Chinese achieved between the late sixth and late fifteenth centuries AD: They discovered the solar wind and magnetic declination. They invented matches, chess, and playing cards. They figured out that you can diagnose diabetes by analyzing urine. They invented the first mechanical clock, movable type, paper money, and the segmented-arch bridge. They basically invented the compass and showed that magnetic north is not the same as geographic north – a good thing to know when you’re trying to navigate. They invented phosphorescent paint, gunpowder, flares, and fireworks. They even invented grenades. They were hugely active in international trade over that period, discovering new lands and new peoples.


And then, in the late 1400s, China turned insular. It stopped looking beyond its shores. It stopped exploring beyond its then-current state of knowledge. And the entire enterprise of creativity stopped. That’s why you don’t hear people saying, “Here’s a modern Chinese answer to that problem.” Instead they’re talking about ancient Chinese remedies. There’s a cost when you stop innovating and stop investing and stop exploring. That cost is severe. And it worries me deeply, because if you don’t explore, you recede into irrelevance as other nations figure out the value of exploration.

This is the same basic thing I’ve tried to say so many times myself in my own ham-fisted way as I’ve written about the end of the shuttle program and James Cameron’s dive into the real-world abyss and the general indifference and apathy I perceive in so many of the people I encounter… especially younger people. I believe our species became something more than the rest of the hominids the day one of our kind, a hundred thousand years ago, looked to the horizon and wondered what was over there… and then decided to find out instead of just sticking around the familiar hunting grounds. Our country became what it is, in large part, because Americans embodied that same spirit: People wanted to see what was here to be found, and after we’d seen it and settled it and reshaped it (for better or worse), they then wanted to make it better through invention and discovery. We went to the moon for the same reason, to see what was there with our own eyes (as well as, admittedly, to score the bragging rights before the Russians did, but there were plenty of idealists involved in the Space Race, no matter that they got their funding from Cold War politicians). But somehow, in a shockingly short span of time, Americans seem to have lost interest in doing Big Things; we no longer want to spend the money or take the risks, at least not as a collective society. (It remains to be seen whether private enterprise and a handful of wealthy eccentrics can fill the gap.) We’ve redefined “innovation” to mean smaller cellphones and clever new ways of wasting time on them. A significant percentage of Americans now think science is a threat to their religion, or to their profit margins, or simply to their comfortable ideas about the world. We bicker endlessly about the best way to legislate other people’s morality while the highways crumble and our electric grid collapses. 

And nobody cares because there’s always a big sale on somewhere, and it’s more fun to go shopping for more cheaply made crap we don’t need than to actually think about anything substantive.

In my view, all of this is another way of “turning insular,” to use Tyson’s phrasing.  We may not be literally isolating ourselves as the Chinese did in the 15th century, but have no doubt, America is turning away from the horizon and shrinking into itself in a very real way. And, like Tyson, I find this deeply worrying. I am not a particularly nationalistic type; I find the flag-waving, “we’re-number-one” stuff distasteful as hell. Not to mention frequently inaccurate. But I grew up believing my country was in the forefront of certain things — engineering feats, technological advancements, space exploration, general scientific research — and while I may wish this country was more like Europe in certain respects (notably a sensible universal healthcare system and more interest in quality of life than working oneself to death), by Crom, I think the US of A ought to remain in the forefront of those things. And we’re not doing it. We’re not doing it because we glorify ignorance and wealth (especially when the two are combined), and we crave fame more than accomplishment, and we fear anything and everything we don’t understand, and quite frankly, we’re not doing it because pro-science people like myself can’t seem to convince enough of our fellow Americans that church is church and school is school, and, while each has its value, they concern themselves with different things and it’s better that they not be intertwined. I only hope we manage to pull our heads out of our collective rear end before this country completely degenerates into a banana republic watched over at night by the lights from the Chinese moon base…


So, That Previous Entry…

It was a little too much, wasn’t it? That’s what I’m thinking in retrospect, anyhow. My initial purpose in writing it was simply to vent about a situation that annoys the hell out of me every single year, i.e., the necessity to rope off the front of my property for an entire week because the small-time parades of my childhood have turned into a Big Damn Deal, and I’ve come to really dislike Big Damn Deals as I’ve gotten older. But once I got into the thick of it and tapped into my “Wonder Years voiceover voice” and started trying to spin out some deeper interpretation of what this event was all about, well, I think maybe I fell into my own bellybutton. Sorry about that, kids. I should’ve stuck more to the basic point.

I know I shouldn’t be feeling so abashed over this. It’s really not a bad entry, and it’s also not like I’ve never rambled on a little too much, or gotten a little grandiose in my unfounded claims, or published an entry that was only half-baked, before. Hell, I’ve been blogging almost a decade now; they can’t all be gems, can they? But lately, I’ve been having so much trouble finding the time for this silly hobby — you may have noticed how infrequent my posts have become — that I guess I just want everything to be a home run to make up for the lack of production, you know? I read so many wonderful, insightful, sharp, powerful things out there on the ‘webs, and I want my own stuff to be like that. But very often, perhaps even most of the time, I know I fall short. And it bothers me. Deeply.

There are other frustrations as well. This used to be so easy, and so fun. I could dash off a thousand words on a moment’s notice about nothing at all, and feel satisfied that it was good. Or at least amusing. At least amusing to me. But now… now when I do manage to start writing something here, the words come so slowly and with such effort… it’s like I’ve run out of things to say, or worse, run out of whatever special thing I had inside that allowed me to say them. My mojo, for lack of a better word. And I fear that it might be a permanent loss. I fear it’s a sign I’m getting old, that a window is closing.

As pathetic as I’m afraid this is going to sound, I have to admit it: Simple Tricks and Nonsense is the last remaining vestige of the dreams I used to have of being a genuine creative writer, and to feel like I’m now losing even this… well, “frustrating” isn’t a big enough word to cover it. Neither is “shattering” or “terrifying.” What are you left with when the thing you’ve used to define yourself, the one idea that you’ve clung to in your deepest heart-of-hearts ever since you were 15 years old, finally slips away? I really don’t want to find out…

***

Incidentally, I wasn’t exaggerating about the inconsiderate jackasses leaving behind their garbage after the parade. A broken plastic lawn chair has been lying on the property line between my front yard and the senior-citizens’ rec center next door for two weeks now. The groundskeeping crew for the senior center won’t dispose of it, because, apparently, that’s not in their job description. And obviously the owner of said chair just assumes somebody else will take care of it for them. God forbid they should take responsibility for their own crap. And you wonder why I get so pissed off when the placemarkers start going up for that simple little small-town Fourth of July parade?


Get Off My Lawn! (Literally!)

When I was a boy, I thought living on the route of my hometown’s Fourth of July parade was just great. (Why, yes, I did like Frosted Flakes as a boy. Why do you ask?) But things were different then.

For one thing, the parades were held in the morning, and on the actual holiday, rather than on the evening of the day before as they are now. In those halcyon days of the mid-1970s, Riverton was just a sunbaked farm town where the local good ol’ boys whiled away their mornings over cups of joe and slow-burning butts at the counter of the local cafe, and there were as many tractors and combines running up and down the main drag as pickups and cars. Back then, our Independence Day revels began at dawn with the sounding of the yellow, barrel-shaped air-raid siren that used to be crouched on top of a telephone pole behind the town hall. Which just happened to be kitty-corner across the street from my house. If you’ve never heard one of those things, take it from me, there isn’t a living creature on this planet that could sleep through their unholy Banshee’s wail. I remember sitting straight up in bed with my heart hammering away inside my rib cage, year after year, and my dad answering the Banshee with an eruption of profanity that would have left human-shaped shadows singed into the walls, atomic-bomb style, if anyone had been unfortunate enough to be standing at the foot of his bed. After a couple minutes of this clamor, the siren would fall silent, and then, while our ears were still ringing, along came the old sound truck. This was a green 1950s-vintage panel truck with four huge, horn-shaped PA speakers mounted on the roof. The driver — I’m sure my parents could tell me the guy’s name, as he was undoubtedly one of those good ol’ boys from the cafe — would be yammering away over the speakers, exhorting everyone in town to get up and come on down to the city park for an old-fashioned pancake breakfast. My dad usually had some very specific ideas about what that guy could do with his pancakes; I don’t recall my parents or me ever going to that community cookout. The three-person Bennion clan always made our own holiday breakfasts.

Then it was time to get ready for the parade. Dad would put our lawn chairs out in front while most of the townsfolk were still wandering toward the park in search of pancakes, but I don’t recall there ever being any particular sense of urgency about it. Nobody would think of squatting on a lawn that didn’t belong to them, at least not without asking permission, or at least not until the parade was underway and everything became fair game. Around nine o’clock or so in the morning, the normally busy road in front of my house would become eerily still. And about 45 minutes later (the parade has always started about a mile from my house and it takes a while for the slow-moving procession to reach the Bennion Compound), the floats and marching bands and horseback companies and fire engines would begin to stream past. Teenaged beauty queens beamed at their neighbors, salt-water taffy and little boxes of Chiclets and Bazooka Joe rained down on the children lining the street, and the same antique cars and novelty acts we saw every bloody year would roll past, and the spectators would wave and clap and smile as if it were the first time. These parades of my hazy, sepia-toned memories comprised our friends, our neighbors, people we knew… they were family, often in a literal sense — it was a small town, after all — but always in a metaphorical one. Back then, the parade was a ritual that seemed to actually mean something; it wasn’t just a way to occupy the kiddies with gathering free candy for an hour (although that was certainly an aspect of it). The parade reinforced a sense of belonging to something: a place, a community, a town. And when it ended, there were old-fashioned, homespun activities all day in the park, cheesy midway games and hamburger grills and plastic wading pools filled with iced watermelon and friendly horseshoe-pitching competitions, all of it leading up to the big finale, the fireworks that would fill the sky just after sundown. 
Rude awakening aside, Riverton’s Fourth of July used to be a pretty low-key, and yet thoroughly satisfying, affair. It was cornball, yes, but it was also organic and homegrown, and it was good.

That’s how it used to be.

Today, I still live in the same old house, and the parade still passes right in front of it, but practically everything else has changed. Riverton is now just another anonymous suburb, with a population several times the size of what it was during my childhood. And our small-town Independence Day is now such a Big Damn Deal that it has to spread itself across two days instead of one. Now, instead of fun and games provided by the Lions Club and the local church wards and the familiar good ol’ boys, there’s a traveling carnival every year at the park, and concession stands selling national-chain fast food, and the fireworks are electronically synchronized and spectacular. Everything about the Fourth is bigger and more professional now, more sophisticated… and somehow it’s less than it was, too. It feels… commercial. Store-bought. It isn’t ours anymore, it’s just something we ordered on Amazon. As for living on the parade route… well, that’s turned into a royal pain in the tuchus. The fun little small-town event that used to bring us closer together has metastasized into an overblown, stress-filled competition in which inconsiderate jackasses will do whatever they can to ensure themselves a seat, because there are now so damn many people living in this town and everyone wants to bring their kids to the parade for that free candy, but the route is still only a mile long, and seats are a precious commodity. People start staking their claims with chairs and coolers and yellow caution tape days before the parade — this year, they made their appearance a full week ahead of time — and they just leave them there all up and down the road, unwatched eyesores, to mark their territory. The competition doesn’t end there, though; I’ve personally witnessed soccer moms jump out of their SUVs, toss aside someone else’s chairs, and set up their own in the same spot. The whole sad, sorry spectacle makes my stomach turn. It’s just a damn parade, people.

I don’t remember when this whole thing became such a BDD. It’s come on slowly, over the space of a couple decades, like that tired old saw about the frog in the pot of water that’s gradually heating up. I only know that for at least the past decade, I have been obligated to set out my own chairs at the first sign that the land-grab is beginning, or risk having squatters we don’t know and didn’t invite plant their crap in my park strip for seven days. Because they would, without a second’s thought. It isn’t that I mind sharing my frontage with others — hell, given my work and commute schedule, I don’t even get home until the stupid parade is half over, so somebody may as well use the space — but I do mind the way people don’t even bother to ask. They just swoop in and drop their junk and expect you to put up with their placeholders sitting on your property for a week, and then they and their rambunctious little carpet monkeys show up for the party you didn’t want to throw, and they get huffy as hell if you ask them to make room for you and your own invited guests, or request that they not make a hellacious mess with their Subway wrappers and Super Big Gulps and juice boxes. And inevitably when it’s all over, they leave behind a pile of garbage that I have to pick up and put in my bin, because these disrespectful freaking slobs apparently don’t see anything wrong with expecting strangers to clean up after them.

And I guess that’s the difference… in the ’70s, most everybody in town knew each other, or at least knew of each other. There weren’t that many people here, and we interacted with each other pretty regularly, so you couldn’t really get away with being an ass. Today we’re all mostly strangers, isolated in our cul-de-sacs and our hermetically sealed vehicles, and our hermetically sealed lives that mostly happen far away from the places where we cook and sleep. Nobody really cares anymore about inconveniencing somebody else, because they’re not likely to bump into you at the grocery store, and even if they do, they won’t recognize you. With a population count of nearly 40,000, how could it be otherwise?

The ironic thing is that the damn parade isn’t even any good anymore. It’s degenerated into little more than a long line of politicians in convertibles and jacked-up 4x4s with the names of businesses on their sides, and wave after wave of military and law-enforcement vehicles. It’s almost enough to make me want to stay at the office and put in some overtime…


What If You Went to the Bottom of the Sea and Nobody Cared?

One of the more depressing aspects of living in the current epoch, at least for me, is a nagging sense that the days of the Great Adventure are over. What do I mean by this? Consider: throughout much of the 20th century, larger-than-life men and women were constantly pushing the boundaries of how far, how high, and how fast human beings could go, either making or contributing to extraordinary scientific discoveries along the way, and all with the full attention and support of the general public. Viewing the popular movies and newsreels of decades past, and reading the contemporary pulp fiction (which I believe is often more representative of a particular milieu than the “good” stuff), you can really feel the shared sense of excitement ordinary joes must have vicariously experienced as daring aviators flew solo across the Atlantic for the first time, then circumnavigated the globe by plane, then broke the sound barrier and ventured to the edge of outer space; as intrepid explorers uncovered the tomb of Tutankhamun and located the legendary city of Machu Picchu high in the mountains of South America; as hardy adventurers reached the poles and summited Mount Everest; and ultimately, as astronauts first stepped onto the surface of another planetary body. The word “progress” meant something unambiguously positive then, and it must’ve seemed to folks living in those heady times as if the human race was really going… well, somewhere. I personally came along a little too late to share in that zeitgeist firsthand, but even in my own youth during the 1970s and ’80s, I recall the public imagination being captured by the early space shuttle launches, by the first untethered spacewalk by an astronaut with a jetpack, and by Dr. Robert Ballard’s discovery of the most famous shipwreck in history, RMS Titanic, lying in the silent darkness two-and-a-half miles below the surface of the ocean.

Nowadays, though… things are different. Here in the second decade of the 21st century, every square foot of the Earth’s surface has been mapped and photographed from orbit. Ancient cities lost for centuries in desert sands and steaming jungles can be pinpointed from air-conditioned rooms in anonymous suburban office parks using thermal imaging satellites. Any place on the globe can be reached by air in a matter of hours. African safaris and Everest hikes are vacation destinations for those who can afford them. And even distant worlds are accessible to the human race as never before, via our robot proxies and the information-sharing power of the Internet. And that’s all good, it really is. Many of those early adventurer/explorers I romanticize met with pitiful and/or horrific deaths because they had to be there in person, and the folks back home never got more than just a glimpse of the sights they saw and things they learned. Today, technology has made discovery much safer, and it’s made it truly democratic as well — everyone can view the latest photos from the Hubble telescope or the surviving Mars rover, or zoom in on some section of the globe at the click of a mouse. People can even participate if they like, through projects like SETI@home. But the trade-off, unfortunately, and the irony as well, is that just at the moment when the average citizen can become more involved in this sort of thing than ever before, not many people seem to care anymore. Exploration and discovery seem to have become, at least as far as I can tell, a niche enthusiasm that attracts a relative few, rather than a society-wide concern.

Why else would there have been so little apparent interest three weeks ago when James Cameron — yes, that James Cameron, the writer/director of Titanic, Avatar, and, somewhat prophetically, The Abyss — joined the ranks of the great explorers by riding a revolutionary new submersible to the bottom of Challenger Deep, the very deepest point in all of Earth’s oceans? To my mind, this was a Big Damn Deal. The sort of thing that strangers on trains should’ve been talking about for days afterwards, worthy of front-page articles and magazine covers. Instead, it seems to have been a mere blip on the cultural radar, duly noted and then shoved aside with the turn of another 24-hour news cycle. There are follow-up stories out there, but you have to seek them out if you’re interested. And my inner cynic can’t help but wonder with a sour grumble just how many of the mouth-breathers walking around out there actually are interested. Neither he nor I like the odds much.

To be fair to the mouth-breathers, though, a big chunk of the blame for the indifference that surrounded this story must be thrown at the media. There wasn’t much news about Cameron’s plans beforehand — I myself only heard about the expedition by chance a couple weeks prior, via the blog Boing Boing, if I remember correctly — and, as I said, the coverage of the actual dive has been perfunctory at best. I guess a good old-fashioned adventure is just not that important at the moment, not when there’s an endless race for the Republican presidential nomination to focus on, and hey, did you hear Snooki’s pregnant, and of course Facebook just bought Instagram, whatever the hell that is. If people who don’t follow certain types of blogs aren’t hearing about expeditions like Cameron’s, why should they care?

I also wonder if perhaps part of the problem is James Cameron himself. My mother’s reaction when I told her about the expedition was something to the effect of, “Why him?” And I imagine that’s not an unusual reaction. He’s a filmmaker, after all, not any sort of scientist (although the National Geographic Society has named him an explorer-in-residence, and he’s made over 70 deep submersible dives in the last couple decades, which I think qualifies him for this). That “king of the world” thing at the 1998 Oscars still sticks in some people’s craws, and he has a reputation for being a royal son-of-a-bitch to work with. But hey, let’s be honest: I think a certain degree of arrogance is probably a requirement for doing something like this. You have to believe that the thing can be done, and you have to believe you’re the one who can do it, and both require a sizable belief in oneself. In this case, Cameron wasn’t the first human to journey into the Challenger Deep — two men did it in 1960 with the help of the U.S. Navy and a submersible “bathyscaphe” called the Trieste — but he is the first to do it in 52 years, and the first to do it solo. And the conditions he knew he’d be facing were pretty daunting, even with a half-century of technological advancement since the Trieste.

Cameron’s submarine, the DEEPSEA CHALLENGER, dropped seven miles straight down into the Pacific Ocean, the downward journey taking close to three hours while his six-foot-plus body was folded into a steel sphere only 43 inches in diameter. The pressure outside grew to an astonishing 16,285 pounds per square inch — barely less than the pilot sphere’s rated capacity of 16,500 psi — pressure so intense that the sub actually shrank in height by a couple of inches. Meanwhile, the temperature inside Cameron’s sphere fell from uncomfortably warm near the surface (because of the electronics and Cameron’s own body heat in such a confined space) to meat-locker cold at the bottom of the sea. And of course it was pitch black at the bottom. He was all alone in utter darkness farther below sea-level than Mount Everest rises above it, trusting that the engineers who designed and built DEEPSEA CHALLENGER hadn’t overlooked anything. In other words, this situation was very much like a flight into space… and as much as I admire astronauts for their drive and guts, I admire James Cameron for his.
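(For the numerically inclined: that pressure figure is easy to sanity-check with the basic hydrostatic formula, P = ρgh. This little sketch uses my own assumed round numbers for seawater density, depth, and unit conversions — not the expedition’s actual data:)

```python
# Back-of-the-envelope check on the pressure at the bottom of Challenger Deep.
# Hydrostatic pressure: P = rho * g * h, plus one atmosphere at the surface.
# Assumes a constant seawater density, which slightly understates the real
# value, since seawater compresses and grows denser with depth.

RHO_SEAWATER = 1025.0   # kg/m^3, typical surface seawater density (assumed)
G = 9.81                # m/s^2, gravitational acceleration
DEPTH_M = 10_908        # m, approximate depth of Challenger Deep (assumed)
PA_PER_PSI = 6894.76    # pascals per pound-per-square-inch
ATM_PSI = 14.7          # surface atmospheric pressure, psi

pressure_pa = RHO_SEAWATER * G * DEPTH_M
pressure_psi = pressure_pa / PA_PER_PSI + ATM_PSI

print(f"Estimated pressure at the bottom: {pressure_psi:,.0f} psi")
```

The constant-density estimate lands a few hundred psi under the reported 16,285, which is about what you’d expect given the compression effect the simple formula ignores.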

The Sunday he went down, March 25, I was following along on Twitter, a service I normally find rather silly, but that day it was the only place I could find any news. I was on the edge of my seat as each new update came in from the expedition, ticking off the latest depth he’d reached, the time elapsed since he’d submerged, etc. And when Cameron’s own tweet flashed across the Internet — “Just arrived at the ocean’s deepest pt. Hitting bottom never felt so good. Can’t wait to share what I’m seeing w/ you” — I exhaled a breath I didn’t know I’d been holding, and thought of the words of Charlie Duke, the CAPCOM on the Apollo 11 mission, when Neil Armstrong radioed back that the Eagle had landed: “Roger, Tranquility. We copy you on the ground. You got a bunch of guys about to turn blue. We’re breathing again. Thanks a lot.” (Sidenote: How bizarre is it to think that a man was able to send a “tweet,” surely one of the most frivolous means of communication ever invented, from the bottom of the ocean? We really are living in the future, aren’t we?)

I don’t know… maybe a moment like that doesn’t do anything for you. Maybe this really is just one of my esoteric and slightly backward interests, like old movies, something that the vast majority of the population no longer has any use for. Another example of how I should’ve been born a generation or two back. These days, there are a lot of people out there who feel we shouldn’t bother trying to put human beings into space or other hostile environments; it’s too expensive, they say, and too dangerous to justify what we get back, and anyhow we can learn all we need to know with cheap, efficient robot probes. I don’t know if these people are in the majority. They certainly seem to have the loudest voices sometimes. And that just makes me sad, and frustrated. Because the world of the early 21st century feels too bloody tame to me. I’m so grateful that every once in a while, somebody like James Cameron comes along and does something to demonstrate that there are still frontiers to be crossed, and that it’s much more interesting to cross them in person… if only somebody is willing to go.

[Image: the DEEPSEA CHALLENGER submersible]


Nobody Is Safer

[Image: nobody-is-safer]

Over the past week, the British magazine The Economist has been hosting an online debate between security consultant (and highly vocal TSA critic) Bruce Schneier and former TSA administrator (and current TSA apologist) Kip Hawley over whether, in fact, post-9/11 airport security procedures have done more harm than good. My own views line up nearly one-to-one with Schneier’s: I think the rigamarole you have to go through to get on a plane these days is needlessly demeaning, intrusive nonsense designed to make it look like the government is doing something to make traveling safer, but which ultimately accomplishes little except inconveniencing and intimidating travelers. (For one thing, all the procedures are designed to stop whatever the last would-be terrorist attempted to do; logically, that just means the next attempt will be something new that the TSA’s not screening for.) I could go on at length about this, and about how incredible I find it that a people who genuflect to the concept of individual liberty are so willing to simply “hand over their papers” (so to speak) when somebody in uniform demands them, as long as they think they’re doing it in the name of their own safety. But instead I think I’ll just quote the final two paragraphs of Schneier’s closing remarks:

The goal of terrorism is not to crash planes, or even to kill people; the goal of terrorism is to cause terror. Liquid bombs, PETN, planes as missiles: these are all tactics designed to cause terror by killing innocents. But terrorists can only do so much. They cannot take away our freedoms. They cannot reduce our liberties. They cannot, by themselves, cause that much terror. It’s our reaction to terrorism that determines whether or not their actions are ultimately successful. That we allow governments to do these things to us — to effectively do the terrorists’ job for them — is the greatest harm of all.

Return airport security checkpoints to pre-9/11 levels. Get rid of everything that isn’t needed to protect against random amateur terrorists and won’t work against professional al-Qaeda plots. Take the savings thus earned and invest them in investigation, intelligence, and emergency response: security outside the airport, security that does not require us to play guessing games about plots. Recognise that 100% safety is impossible, and also that terrorism is not an “existential threat” to our way of life. Respond to terrorism not with fear but with indomitability. Refuse to be terrorized.

The whole of the debate is worth skimming, although I remained totally unconvinced by Hawley’s arguments, which seem to basically consist of “hey, nothing’s happened, so we must be doing something right!” and “we’ve had lots of successes, we just can’t tell you about them.” I found Schneier’s comment that airports have become effectively “rights-free zones” where TSA “officers” can do pretty much anything they want to you and your belongings in the name of “security” especially trenchant… and chilling. Just lately, though, I’ve been seeing some signs that the tide may be turning, that people may be regaining a bit of sanity on this subject, or perhaps they’re just getting tired of minimum-wage rent-a-cops feeling up their grandmas and confiscating their baby formula. Either way, I fervently hope we’re eventually going to ratchet things down to something that more closely resembles the way it was when I first started flying.

It’d be lovely to be able to go to the airport for a hotdog and an afternoon of people-watching again…


John Carter: Dead on Arrival?

As I mentioned in the previous post, my passion for the movies — or at least for going to the movies — has faded somewhat in recent years. I think the biggest problem is simply the reality of a busy semi-grown-up life. My schedule on weekdays makes going out inconvenient, and the weekends tend to get eaten up with all the mundane crap I can’t manage to complete during the week. Basically, it’s just damn hard to carve out a couple of hours to sit in the dark without feeling anxious because I think I ought to be doing something else. In addition, the general theatrical experience has really deteriorated since my multiplex days, largely due to the breakdown of good manners (Text-messaging! Grrr!) as well as various exhibition-industry developments, such as those abysmal pre-show reels of commercials and fluffy “behind-the-scenes” segments that don’t tell you a damn thing except how great everyone was to work with. And then there’s the not-inconsiderable problem that Hollywood just doesn’t seem to be making much I want to see these days; I’ve apparently aged beyond the industry’s target demographic.

The end result of all these converging factors is that I rarely get too excited anymore about upcoming movies. The last one for which I remember feeling much of a build-up was Indiana Jones and the Kingdom of the Crystal Skull, and even then my eagerness was somewhat tempered compared to other movies in years past. I guess I’m finally beyond the running-countdown-clock, have-to-see-it-on-the-first-day, standing-in-line-for-hours, midnight-screening thing.

But every once in a while, something will grab my interest enough to trigger some vestige of the old anticipation reflex, and in recent months that film has been John Carter, the long-awaited cinematic adaptation of some of the best-loved pulp-adventure fiction of the early 20th century, namely the “Barsoom” novels of Edgar Rice Burroughs. I dearly loved those books as a boy, and I’ve gone from initially dubious to cautiously optimistic that the film’s director and co-writer, Andrew Stanton of Pixar fame, might have actually made a movie version that’s at least somewhat faithful to the source material. Certainly the look of the film is right, based on what I’ve seen in the trailers, and I’m hoping that the tone will be as well. What I’d like to see is old-fashioned, swashbuckling fun and romance, the sort of thing where the hero has a twinkle in his eye, rather than the self-important Dark ‘n’ Angsty Very-Important-Epic(tm) that every genre film these days aspires to be. That tone was appropriate for The Lord of the Rings, but not for anything created by ERB.

Unfortunately, my own feelings aside, John Carter is not attracting the kind of early buzz the corporate beancounters in Hollywood like to see. Last week, a much-linked article made the rounds of the nerd-o-sphere, predicting that JC is going to be a tremendous flop. The kind of flop that costs people their careers, maybe even the kind of flop that brings down studios. The first line of the article went so far as to compare it to Ishtar, the reviled 1987 Warren Beatty-Dustin Hoffman vehicle that became the poster-child for overblown vanity projects practically overnight.

To put it succinctly, this article pissed me off.
