Credidero #7: Community

If you were to create a word cloud of every document, article, letter, and email I’ve written during the past five years as President and CEO of a nonprofit organization serving the tree care industry, I suspect that after the obvious mission-related words — tree, forest, research, endowment, education, arborist, etc. — the word that would show up most frequently would be “community.” I use it all the time, referring to the Tour des Trees as our primary community engagement event, discussing how our work helps the global tree care community, noting that our work focuses on the importance of urban and community forests, and promoting research designed to benefit whole communities of trees and related organisms (including humans), rather than individual specimens or species.

If you ran that same word cloud for the four years before I began leading the organization, you most likely would not see “community” ranked so highly in our communications. We used to refer to the Tour des Trees as our primary fundraising event, and we discussed how our work benefited the tree care industry, and how our efforts advanced arboriculture, with much of our research focused on individual plants, rather than their collectives. This change in language was not an organizational shift driven by some strategic planning decision, nor was it a modification to what we do and how we do it directed by our Board or emergent outside forces. It was frankly just me shaping the narrative about the organization I lead, and how I want it to be perceived.

Calling the Tour des Trees a “fundraising event,” for example, misses the critical component of how we interact with people as we roll on our way throughout the week, providing education and outreach to help people understand our work and how it benefits them. Saying that we work for the “tree care industry” seems somehow crass and antiseptic to me, implying that the businesses are more important than the people who collectively engage in the hands-on work of caring for trees. “Urban forests” can be confusing to folks in its evocation of big city park spaces, even though street trees, yard trees and trees along utility rights of way in suburbs, exurbs, and rural spaces are also part of our mission’s purview. And thinking first of communities of trees, rather than individual plants, helps us better understand and communicate the exciting, emergent science exploring the ways that trees have evolved as communal organisms, sharing information and nutrients through root-based symbiotic networks.

I’d be fibbing if I said that I had purposefully made these and other related linguistic changes as part of an intentional, organized shift in tone. It just happened as I went along, and it honestly didn’t actively occur to me that I had done it in so many ways and places until I started thinking about this month’s Credidero article. But the changes are clearly there, evidence of the fact that it’s somehow deeply important to me, personally and professionally, that my organization acts and is perceived as part of something bigger and more connected than our relatively small physical, financial and personnel structure might otherwise dictate. I do believe that words have power, and that if you say something often enough, and loudly enough, people begin to perceive it as true, and then it actually becomes true, even if nothing has really changed except the word(s) we use to describe ourselves and our activities.

So why is “community” such an important and transformative word in my personal worldview? As I normally do in these articles when thinking about questions like that one, I looked into the word’s etymology: it comes to us English-speakers via the Old French comuneté, which in turn came from the Latin communitas, which ultimately boils down to something “shared in common.” But there’s a deeper layer in the Latin root that’s preserved to this day in cultural anthropology, where communitas refers to (per Wiki) “an unstructured state in which all members of a community are equal allowing them to share a common experience, usually through a rite of passage.”

The interesting corollary here, of course, is that those who do not or cannot participate in that rite of passage may neither partake of nor receive the benefits of communitas. Peter Gabriel’s “Not One Of Us” has long been one of my favorite songs, both musically (Gabriel, Robert Fripp, John Giblin and Jerry Marotta put in some sublime performances here) and lyrically, with one line standing out to me as a key bit of deep wisdom, writ large in its simplicity: “How can we be in, if there is no outside?” That deepest root of the word “community” captures that sense of exclusion: there’s a positive sense of belonging for those who have crossed the threshold for inclusion, while those who haven’t done so are (to again quote Mister Gabriel) “not one of us.”

So are many (most?) communities perhaps defined not so much by who they include, but rather by who they exclude? I suspect that may be the case. When I first took over the tree nonprofit, for example, I had a couple of early encounters and experiences where folks communicated to me, explicitly and implicitly, that they saw the organization not as a cooperative symbiote, but rather as a predatory parasite, on the collective body of tree care professionals and their employers. I was also made to feel uncomfortable in a few situations by my lack of hands-on experience in professional tree care, including the fact that I had no certification, training, or credentialing as an arborist or an urban forester. I had not passed through the “rite of passage” that would have allowed me to partake of the tree peoples’ communitas, and so in the eyes of some members of that community I was (and probably still remain) on the outside, not the inside. So my push over the past four years for my organization to be an integral part of a larger professional community may be, if I’m honest and self-reflective, as much about making me feel included as it is about advancing the organization.

When I look bigger and broader beyond my tree-related job, I certainly still see a lot of that “inside/outside” paradigm when it comes to the ways in which we collectively organize ourselves into communities, locally, regionally, nationally, and globally, oftentimes along increasingly “tribal” political lines, e.g. Blue States vs Red States, Republicans vs Democrats, Wealthy vs Poor, Christian vs Muslim vs Jew, Liberal vs Conservative, Citizen vs Immigrant, Brexit vs Remain, etc. Not only do we self-sort online and in our reading and viewing habits, but more and more people are choosing to live, work, date, marry, and socialize only within circles of self-mirroring “insiders,” ever more deeply affirming our sense that the “others” are not like us, are not part of our communities, and may in some ways be less important, less interesting, less deserving, or even less human than we are.

That’s certainly the narrative being spun by our President right now through social media, spoken statements, and policy initiatives, as he seems adamantly opposed to “an unstructured state in which all members of a community are equal.” Which is dismaying, given the allegedly self-evident truths we define and hold in our Nation’s organizational documents, ostensibly designed to bind us as a community under the leadership of a duly-elected Executive, who is supposed to represent us all. That said, of course, we know that the infrastructure of our great national experiment was flawed from its inception in the ways that it branded some people as less than fully human, and some people as not qualified to participate in the democratic process, due to their skin color or their gender. I’d obviously like to think that we’re past those problems, some 250 years on, but the daily headlines we’re bombarded with indicate otherwise. Insiders will always need outsiders . . . and communities may often only feel good about themselves by feeling bad toward those they exclude. I suppose several thousand years of history show that may well be a core part of what we are as human beings (I explored that theme more in the Inhumanity Credidero article), and that aspiring to create positive communities of inclusion may be one of the nobler acts that we can pursue.

I’m stating the obvious in noting that the ways we can and do build community, for better or for worse, have radically changed over the past 25 years or so with the emergence of the world wide web and the transformations made possible by it. If you’d asked me to describe what “community” meant to me before 1993, when I first got online, I’d likely have focused on neighborhoods, or churches, or fraternal organizations or such like. I’d say that within less than a year of my first forays into the internet’s kooky series of tubes, though, I was already thinking of and using the word “community” to refer to folks I romped and stomped with online, most of whom I’d never met, nor ever would meet, “in real life.”

I wasn’t alone, as the word “community” has become ever more widely and casually used over the years to describe clusters of physically remote individuals interacting collectively online, via an ever-evolving spectrum of technological applications, from ARPANET to the World Wide Web, from bulletin boards to LISTSERVs, from mailing lists to MMORPGs, from blogs to tweets, and from Cyber-Yugoslavia to Six Degrees to Friendster to Orkut to Xanga to Myspace to LinkedIn to Facebook to Twitter to Instagram to whatever the next killer community-building app might be. I actually wrote a piece about this topic ten years or so ago for the Chapel + Cultural Center’s newsletter, and at the time I used the following themes and rubrics to frame what community meant to me:

  • An organized group of individuals;
  • Resident in a specific locality;
  • Interdependent and interacting within a particular environment;
  • Defined by social, religious, occupational, ethnic or other discrete considerations;
  • Sharing common interests;
  • Of common cultural or historical heritage;
  • Sharing governance, laws and values;
  • Perceived or perceiving itself as distinct in some way from the larger society in which it exists.

And I think I stand by that today, noting that a “specific locality” or “a particular environment” may be defined by virtual boundaries, rather than physical or geographical ones. But then other elements embedded within those defining traits raise more difficult questions and considerations, including (but not limited to):

  • What, exactly, is an individual in a world where identity is mutable? Is a lurker who never comments a member of a community? Is a sockpuppet a member of a community? Are anonymous posters members of a community? If a person plays in an online role-playing game as three different characters, is he one or three members of the community?
  • How are culture and historical heritage defined in a world where a six-month-old post or product is considered ancient? Do technical platforms (e.g. WordPress vs. Twitter vs. Instagram, etc.) define culture? Does history outside of the online community count toward defining said community?
  • What constitutes shared governance online? Who elects or appoints those who govern, however loosely, and does it matter whether they are paid or not for their service to the group? What are their powers? Are those powers fairly and equitably enforced, and what are the ramifications and consequences when they are not? Is a virtual dictatorship a community?

I opined then, and I still believe, that there is a fundamental flaw with online communities in that virtual gatherings cannot fully replicate physical gatherings, as their impacts are limited to but two senses: sight and sound. While these two senses are clearly those most closely associated with “higher” intellectual function, learning and spirituality, the physical act of gathering or meeting in the flesh is much richer, as it combines those cerebral perceptive elements with the deeper, more primal, brain stem responses that we have to taste, touch and smell stimuli. While I’m sure that developers and designers and scientists are working to figure out ways to bring the other three senses into meaningful play in the digital world, a quarter century after I first got online, there’s been no meaningful public progress on that front, and I am not sure that I expect it in the next quarter century either.

Image resolution and visual interactivity get better and better (especially on the virtual reality front), while sound quality actually seems to get worse and worse over time, when we compare ear buds and “loudness war” mixes to the warm analog glory days of tube amps and box speakers — but that’s it, really. And as long as we are existing digitally in only two senses, exchanging messages online removes any ability to experience the physical reality of actually touching another person, be it through a hand-shake, a kiss, a squeeze of the arm or a pat on the back. The nuances of facial expression and inflection are lost in e-mails and texts, often leading to confusion or alarm where none was required or intended. There is no ability to taste and feel the texture of the food we discuss in a chat room. It still seems to me that the physical act of community building is a visceral one that appeals to, and perhaps requires, all of our senses, not just those that can be compressed into two dimensions on our computer screens.

I still believe that two-dimensional communities are, ultimately, destined to disappoint us sooner or later for precisely that reason. I have certainly spent countless interesting hours within them — but if you plotted my engagement as a curve over time, it grows smaller by the year. While people often compare the dawn of the Internet era to the dawn of the printing press era, it’s important to note that the earlier cataclysmic shift in the way that information was preserved and presented (from spoken word to widely-available printed material) did not result in the elimination of the physical gatherings upon which all of our innate senses of community have been defined and built for centuries, as seems to be happening in the Internet era. Communication happens more readily now, for sure, and communities may be built almost instantaneously, but they’re not likely to have all of the lasting resonances that their traditional in-person counterparts might offer.

I note, of course, that my feelings on this topic are no doubt influenced by the fact that my adulthood straddles the pre-Internet and post-Internet divide. I was raised without computers and cell phones and instantaneous access to pretty much anybody I wanted to connect with, anywhere in the world, so my communities couldn’t be accessed or engaged while sitting alone in my bedroom. I don’t know how many people have been born since 1993, but many (most?) of them, having been fully raised in the digital world, may not be wired (no pun intended) to feel that distinction. And when I continue to make that distinction, they likely see me in the ways that I once would have perceived a grouchy old man shaking his fist and shouting “Get off my lawn, you kids!”

Generational issues aside, I do think that some of the uglier aspects of online communities — bullying, hateful echo chambers, exploitation of weaker members, cruelty hidden behind anonymity — are blights on our culture and our souls, and are having direct cause-effect impacts on the nastiness of our modern social and political discourse. If Twitter and Facebook and other social media sites were shut down tomorrow, a lot of online communities would cease to exist, sure, but the impact of that global loss of connection would not necessarily be a net negative one. But the genie’s out of the bottle on that front, I suppose, as barring a full-scale catastrophic failure of the global communication network, communities (ugly and beautiful alike) will just emerge in new virtual spaces, rather than those billions of people returning en masse to traditional, in-person community building.

But some of them might. And I might be one of them. I’ve written here before about being “longtime online” and often a very early adopter of new platforms and technologies as they’ve emerged. But somewhere in the past decade or so, I stopped making leaps forward in the ways that I communicate with people and engage with communities online. The next thrilling app just sort of stopped being any more thrilling than the one I was already using, so inertia won out. I bailed on Facebook around 2012, and have used Twitter almost exclusively to communicate online (outside of this blog) between then and last month, when I decided to let that go too.

Beyond social media, there are several online forum-based communities in which I was very active over the years (Xnet2, Upstate Wasted/Ether [defunct], The Collider Board [defunct], The Fall Online Forum, etc.), and those have mostly fallen by the wayside as well. I’ve retained some very meaningful communications with some good friends from those spaces via email and occasional in-person meetings, but it’s one-on-one connection between us for the most part, and not dialog and group play unfolding in public before the rest of the community. And, again, I think I’m finding it easy to walk away from those public communities, for the most part, because the personal depth of the connections I’ve made gets shallower as the years go on, and even some of the long-term connections just sort of run their courses and stagnate, because there’s really no organic way for the relationships to grow or advance in any meaningful way.

Maybe again this is just a me-getting-older issue, but I get more richness of experience within my communities that exist in real space, and real time, than I used to, and I get less from my online connections. A desire to move more toward that probably played some psychological part in how hard I pushed the word “community” in my professional life, trying to build one there, not only through online means, but also through the scores of conferences that I’ve attended over the years, with tree care scientists and practitioners from around the world. That is a good community. I believe that I have improved my organization’s standing within it. And that feels good.

Part of the cost of doing that, though, was really failing to become part of any meaningful real-world community where I actually lived in Chicago, and also being separated from the little community that means the most to me: my family. A big part of my decision to retire this year was the desire to get that priority inequity better aligned, and I think that as we look forward to our next move as a family, whenever and wherever it is, I’ll be more inclined to put the effort in to make new community connections there, rather than just hanging out on the computer chatting about arcane subjects with what Marcia fondly refers to as my “imaginary friends.”

One of my personal goals for the Credidero (reminder: it means “I will have believed”) project was to spend a month or so considering and researching a given topic, and then exploring how I felt about it, not just what I thought about it, to see if there were some new insights or perspectives for me, perhaps as articles of faith, or different lenses through which to view my world going forward. Somewhat ironically, this month’s “community” topic has been the hardest for me to consider and write, almost entirely because I’ve already spent so much time thinking about it and writing about it over the years that I already have a stronger set of well-formed beliefs on the topic than I have had on any of the others thus far.

How I act on those beliefs, though, I think is evolving, hopefully in ways that connect me more meaningfully with more local or in-person communities, rather than spending so much time alone (in real life) while sort of together (in virtual space). I imagine that retirement, with all the newly available time it entails, will be a much richer experience that way. Less thinking and writing about community all by myself, and more experiencing community with others.

And on that note, I think I’m going to go sit out by the pool and see if there’s anybody to talk to . . .

A community of tree people and cyclists. More fun in person than online!

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this seventh article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Four: “Complexity”

All Articles In This Series:

Credidero: A Writing Project

Credidero #1: Hostility

Credidero #2: Curiosity

Credidero #3: Security

Credidero #4: Absurdity

Credidero #5: Inhumanity

Credidero #6: Creativity

Credidero #7: Community

Credidero #8: Complexity

Credidero #9: Eternity

Credidero #10: Authority

Credidero #11: Mortality

Credidero #12: Possibility

Credidero: An Epilogue

 

Credidero #6: Creativity

When I was in First Grade, our class was introduced to haiku, the classic Japanese form of short poetry structured in three lines, of five, seven and five syllables each. After our teacher explained the concept, read us some examples, and showed how to count syllables (do they still do the thing where you put your hand under your chin while enunciating words?), we were told to take out a sheet of paper (this stuff, remember?) and pen our first haiku. I thought for a few minutes, dutifully jotted one down, counted to make sure the syllables worked out, and handed in my work.

Some time later that day, I was summoned to speak with our teacher, and there were a couple of other grownups behind her desk, though I did not know who they were. Uh oh! Trouble! The teacher asked me where I got the poem I had submitted. I told her I wrote it. She nodded and asked if I wrote it by copying it out of a library book. I said no, I just thought it up and wrote it down. She kept pressing me on other possible sources for the little poem, and I kept telling her that I made it up, following her instructions. She then asked me to define a specific word within the poem, and to explain a specific phrase. I answered her as best I could.

I do not recall or have that entire poem intact anymore, alas, but I do remember that the issues that led my teacher to call in the guidance police and interrogate me on whether I had actually written it or not were my use of the word “ere,” the elegance of my alliterative phrase “gray-green grass,” and the fact that I inadvertently did exactly what a haiku was really supposed to do — juxtaposing images around a seasonal reference — while most of my classmates were writing straight literal odes to their cats or toys or mothers. Clearly, no little country cracker from South Cackalacky should have been able to do that, right?

The thing that finally convinced the teacher and the school-yard Stasi agents that I had not somehow plagiarized the poem was the fact that when I was asked to read it aloud, I pronounced the word “ere” as “err-eh,” (I had only seen it in my favorite poetry book, but had never heard it spoken), so my middle syllable count was actually off by one.  So apparently, I really hadn’t been carrying around a little Bashō masterpiece in my Speed Racer lunchbox just waiting to spring it on someone for my own gain at an opportune moment. I was dismissed and off to recess I went, a bit concerned and confused.

Some time soon thereafter, I was promoted directly from First Grade to Third Grade,  branded smart beyond my limited years, at least in part because of that little poem. I learned three key things from this confluence of events:

  1. Being creative can cause trouble
  2. Being creative can open doors to opportunity
  3. People correlate artistic creativity with full spectrum intelligence

I also picked the Beloved Royals to win the 1978 World Series. Wrong again.

I have never stopped writing since then, for pleasure and for pay. My first real job was as the “Teen Editor” of the Mitchel News when I was 13 years old, per the photo at left. I was supposed to be a beat reporter, doing interviews with new kids and documenting the things in which the young people on the military base where I lived were presumed to be interested. But after a column or two like that, I got bored, and I started doing record reviews of Jethro Tull and Steely Dan albums instead, or handicapping sports seasons (Remember the NASL? I went out on a limb and predicted that my beloved Washington Diplomats would win the 1978 title. I was wrong), or writing creepy poetry about some of the weird old buildings and spaces on the base. I also gave the way-popular movie Grease a thumbs-down, one-star, bomb review, eliciting howls of rage from most every girl I knew on base, and the guys who wanted to impress them. That might have been the bridge too far.

I was fired from the job after about a year, because my creative voice was not the creative voice that the newspaper’s publisher desired me to have, or at least to create, for his readers. So I learned a lesson there, too: creative writing and “technical writing” (for lack of a better term) were not the same thing in the eyes of publishers and readers, and there was a difference in the things that I wrote for myself and the things that I wrote for others. (Well, I also learned, much to my chagrin, that my cushy desk job writing stuff was a whole lot easier than cutting grass, delivering newspapers, clearing brush, or washing dishes, all of which I had to do at one time or another to earn my scratch after I was chucked out of the press room, alas.)

That distinction between creative and “technical” writing is true to this day, of course, although then and now, there’s always been one thing that sort of sticks in my craw when I actively ponder it, and it’s the fact that I do not perceive either form of writing as requiring any more or less creativity than the other, though only one gets called “creative.” One type might require research, one type might not. One type might require the creation of characters, one type might require creatively describing real characters. One type might require me to write in dialog or dialect, creating a voice for other people to speak, one type might require me to speak in a voice other than my own to appeal to the whims and tastes of the people or organizations who pay me for my efforts. But they all require creativity.

I enjoy both types of writing, in different ways, and in different times. When I sit down with a blank page or a blank screen before me, and I get up some time later leaving a big blob of words behind, I feel that I have created something. It may be a newsletter column, it may be a poem, it may be a story, or it may be a Credidero article. All equal in my mind in terms of their worth (though I know from experience that in the eyes of those who pay for my blank paper and computer screen, my “technical” writing is of far greater value), all requiring the same basic sets of linguistic skills, all involving creativity. I have to start with an idea, I have to formulate structures around the idea, I have to put the idea into a meaningful context, I have to deploy resources to ensure the idea is valid, I have to find the right words to share the idea with others. When that’s done, I have made something, from nothing. Voila, creation!

Interestingly enough, this concept that humans can be creative and practice creativity is actually a shockingly modern one in Western Culture. In Plato’s Republic, the wise old Greek philosopher is asked “Will we say, of a painter, that he makes something?” and he replies “Certainly not, he merely imitates.” For centuries, the widely-held philosophical belief was that only God could create something from nothing — creatio ex nihilo — while we mere mortals were stuck with imitating, or discovering, or making, or producing, oftentimes guided by inhuman forces, like Muses, or Daemons, but certainly not creators in our own right. Poetry was one of the first arts to be granted license to be seen as an act of creation, with other written texts, and other arts genres following in the 19th Century. It was only in the early 20th Century that Western people began to speak of “creativity” in the natural sciences and technical fields. So whether they realize it or not, the creative folks today who (somewhat pretentiously, to these ears) call themselves “makers” instead of “artists” are actually invoking a 2,000-year-old paradigm that (again, to these ears) devalues the fruits of their labors, rather than (as likely intended) anchoring them as somehow more “real” or pragmatic than the stuff that the artsy-fartsy types produce.

I find the long and wide-spread cultural reluctance to embrace and value creativity as a deeply personal, deeply human, deeply individual endeavor or trait to be fascinatingly at odds with my own beliefs and experiences on this front. When I interview for jobs or freelance assignments, or even in just regular conversations (because I’m like that), and I am asked what I consider to be my greatest skill, I always default to “communications” — I am a story-teller, at bottom line, and I can make people believe in my organizations’ work and mission, by talking to people, and by writing to people. If you hire me, I’ll do the same for your organization!

I create narratives on the job, sometimes from scratch, sometimes from consolidation of extant texts, but there’s a deep creative element to both of those activities. I also tell stories on my own time, fictional ones, poetic ones, accounts of events yet to come, documentation of events gone by. A concert review is a story. A fundraising appeal is a story. A speech is a story. An explanation of a research finding is a story. I am a story-teller. And I would not be a story-teller if I did not possess a finely tuned sense of creativity, and a desire and skill to create something that did not exist until I turned my mind to it. I can’t imagine my mind, in fact, if it was not anchored in that desire to create stories. I have done lots of creative things in various creative fields over the years (a bit more on that later), but my stories and my writing are as intrinsically me, myself and I as anything else I do.

I generally think of this as a good thing, though research may indicate otherwise. A study involving more than one million subjects by Sweden’s Karolinska Institute, reported by the Journal of Psychiatric Research in 2012, found that, as a group, those in the creative professions were “no more likely to suffer from psychiatric disorders than other people, although they were more likely to have a close relative with a disorder, including anorexia and, to some extent, autism.” Phew! But wait . . . within that surveyed cohort of creative types, the researchers did find that writers (!) had a “higher risk of anxiety and bipolar disorders, schizophrenia, unipolar depression, and substance abuse, and were almost twice as likely as the general population to kill themselves” (!!) and that dancers and photographers were also more likely to have bipolar disorder.

Well, uhhh . . . yeah. That. Kettle, pot. Pot, kettle. We writers are apparently the most damaged creatives, the Vincent van Goghs and Roky Ericksons and Francisco Goyas and Nick Drakes of the world notwithstanding. For the record, if I look back at my family tree just for a couple of generations, and across a couple of close cousin branches, every single one of those disorders appears, often multiple times. So is the creative drive that I think of as a gift or a blessing actually just a side symptom of a spectrum of mental illnesses passed on to me as part of my innate genetic code?

Maybe. The suffering artist stereotype probably didn’t emerge without evidence, after all, and when I think about the periods in my life when I was most floridly, obsessively creative (not just in writing), they probably do correlate closely with the periods when I felt the most vulnerable or damaged. Being driven can be a good thing. And being driven can be a bad thing. Beautiful stories can emerge from dark spaces, and dark narratives can emerge from a happy place. Keeping busy and being productive can be cheap forms of self-administered therapy, or they can be positive manifestations of a mind well, truly, and happily engaged.

This way madness lies.

I think a big part of managing my own creativity is being self-aware enough, through long experience, to know which of these conditions are in play at any given time, and to create accordingly. In the 1990s, to cite but one time-place-topic example, I wrote a novel, called Eponymous. It was about a dissolute musician-writer named Collie Hay, beset with a variety of substance abuse issues and mental health problems, written in the first person, explaining why and how the writer wrote, and knitting a narrative about how creation and destruction often go hand in hand. The epigraph of the book, the very first words you read when you open it, says “For the Demons . . . Fly away! Be free!” The very last page of the book, in the author’s bio, says “J. Eric Smith is not Collie Hay. Honest.” But, uhhh, y’know. Of course, I’d say that. I’m a writer, and not to be trusted.

One review by a trusted colleague who was also a damaged writer type noted “Eponymous is a hall of mirrors. J. Eric Smith, the author, who’s been an upstate New York rock critic, has written a book about an upstate New York rock critic who is himself writing a book. The book-within-a-book device is hard to pull off, but when it works (see Tristram Shandy and Adaptation) — and it works well here — it’s lots of fun.” And it starkly lays bare the correlations, at least in my case and in my mind, between written creativity and dysfunction, without even really bothering to explain them, since they just seem(ed) to me to go hand in hand, as a given. It also felt good to write, and some demons did, in fact, flitter off as result of having written it. Poof! Therapy!

(Side Note #1: If you want to read Eponymous — and 20+ years on from creating it, I’m not sure I’d really recommend it to you — find a cheap, used print version of it, not the Kindle version. It was adapted as an e-book without my permission or supervision, and a lot of the internal formatting [there’s poems, lyrics, other stuff of that ilk] got messed up and is very difficult to read on an e-book reader. I don’t make a penny from it either way, so it’s your call if you want to score it, but I just don’t want you to have to struggle with a nasty mess of on-screen gobbledygook if you do wade into the thing).

While I’ve focused my personal observations here on writing, I should note that I have been and remain creative in other ways too. I’d claim photography as the visual art form in which I have the greatest interest and skill, and I’ve been a songwriter, singer, musician and lyricist over the years as well, though mainly in my younger days. Until precisely 1993, in fact, I would have cited music as my primary creative skill, and the one in which I was most willing, able and likely to achieve success (critical, if not commercial) over the years.

How can I date that end point so accurately? That was the year when we got our first home computer and I got online for the first time. My home recording studio began gathering dust soon thereafter. For a long time after that when people would ask about how my music was going, I’d say “I’m in remission as a musician these days,” so once again, even way back then, I was lightly associating creativity with illness, even if I laughed it off in so doing. But we all know that every joke has a bit of truth behind it.

(Side Note #2: If I could snap my fingers and have any job in the world, hey presto, right now, I would want to be the principal, non-performing lyricist for a commercially and critically successful musical act. The Bernie Taupins, Robert Hunters, Peter Sinfields, and Keith Reids of the world have got good gigs! I have had the pleasure of having my poems recorded as lyrics by a few musicians, and it’s deeply satisfying, let me tell you. That’s likely to be one of the areas I’m going to focus creative attention on when I retire from my last day job. Maybe the group or singer that hires me would let me provide the photographs for the album covers and promotional materials too. A fellow can dream, right? Even a broken creative fellow?)

So creativity has touched and shaped my life in a variety of ways that fall under the “artistic” umbrella, but the amount of time I spend on those pursuits pales in comparison to the amount of time I spend at my job. Sure, writing and speaking and story-telling are cornerstones of that and, as noted above, I feel that those facets of my professional work are every bit as anchored in my core, latent sense of creativity as are my most absurd and fantastic pieces of fiction and poetry. But just as the concept of creativity evolved and was adapted in the early 20th Century to include the sciences and technical endeavors, the latter part of the Century saw the definitions expanding further into organizational, operational, and business dynamics, and the ways that groups use creativity to build and sustain corporate culture.

Look at tech behemoths like Apple, Microsoft, Amazon, Google and Netflix, just off the cuff. Sure, there were certainly some blindingly creative people within their corporate structures who made the technical products and business services they provide, but they would not be what they are today without other visionaries imagining, designing, and implementing completely new business models and marketing approaches unlike anything seen or experienced before them. The brains behind those new business models were certainly engaged in forms of creativity, making new and valuable things (many of them concepts, not stuff), filling market spaces that nobody knew existed before they dreamed them up and monetized them. If you had told me when I was a teenager that by my 55th birthday I’d be able to listen to any song in the world while watching a movie, doing my banking, typing to a pen pal in Australia, and playing a video game, on my phone, at the beach, all at the same time, I’d have said you were watching way too much Jetsons for your own good. That’s just silly, Astro.

The transformative nature of the tech sector means that much of the recent and current research into and writing about creativity in the workplace focuses on organizations and individuals within the Silicon Valley sphere of companies, because the cause-effect-impact relationships there are easy to identify, evaluate and explain. But the work that many of us do in non-technical sectors can involve just as much creativity, and can have just as transformative an impact within our organizations, or within the smaller external spheres in which we operate. I’m confident that 100 years from now, the types of activities that are granted “creative” status by default will expand to include countless more fields and activities, many of which are unknowable or inconceivable today, even in the creative minds of the most brilliant futurists.

But maybe we shouldn’t wait 100 years to afford “creative” status to certain endeavors that aren’t seen as “earning” it today. We’re all creative, each in our own ways, every time we produce something that wasn’t there before we cast our hands above the waters and said (to ourselves) “Let there be a thing,” whether anybody else knew we did it or not, whether it had any use or value at all to anybody, whether it could be experienced in the world of the senses, or only within the spheres of our minds. We may create alone or with others. We may create to heal ourselves or hurt ourselves, others likewise. It may feel good, or it may feel bad. We may intend to create, or we may create by happy accident. It’s all the same: “Let there be a thing,” and there will be, and sometimes it might even be really, really good.

Creatio ex nihilo was long the sole province of God, or the Gods, or Muses, or Daemons, or other inhuman forces swirling in the vapors around us. I believe that by claiming creativity as our own human right, in all the things we do, and celebrating its fruits, we don’t denigrate the God(s) that inspire us, but instead become ever more like them.

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this sixth article complete, I roll the die again . . .

. . . and next month I will consider Topic Number One: “Community”

Deftly using every single one of my creative skills here in coherently explaining the recorded canon of the great UK band Wire. It made sense at the time . . .


Credidero #5: Inhumanity

When I was framing my Credidero writing project for 2019, I spent a fair amount of time and thought developing the list of 12 one-word topics that I intended to ponder and write about over the course of the year. I clearly remember that the very last edit that I made to the final list was turning the possible topic “Humanity” into the actual topic “Inhumanity.” At the time I did that, the edit was largely to eliminate what I thought could have just produced a fuzzy wuzzy glurgey article, and to better balance what I considered innately “positive” reflections among the list of 12 with what I thought could be innately “negative” reflections, and thereby perhaps more intellectually difficult to parse. In other words, it seemed to me at the time that it would be easier for me to fall into writing piffle and tripe about the concept of “humanity” (“La la la, let’s all get along, people are great, yay!”) than it would be to consider its opposite, which could be far more fraught with dark feelings and unpleasant realities and difficult realizations. Those seemed to be the spaces that would be more challenging for me to explore, and since this project was intended to be a personal challenge, that final edit stuck.

My sense of that potential challenge has proven accurate for me over the past month, and this is the first of five scheduled articles to date where I’ve missed my self-imposed monthly deadline (by just a few days, though), as I knocked the topic around in my brain housing group a bit longer than I did with earlier installments before finally sitting down to organize my mental noise and write. One of the key difficulties for me has been that this topic is ultimately defined by its negation: you can’t actually consider or define “inhumanity” without considering and defining “humanity” first: basically and etymologically speaking, “inhumanity” is simply the absence of whatever its opposite is. Then, adding complexity, “humanity” itself carries two simple/common uses, one of a noun form (“the human race; human beings collectively”), and one of an adjective form (“humaneness; goodness; benevolence”).

But here’s the rub: by most objective measures, on an aggregate, macro, long-term, global basis, humanity (noun) does not practice humanity (adjective) very effectively, at all. Our history is shaped, defined and recorded not through long eras of benevolence and kindness and care, but rather through endless, constant, unrelenting war, subjugation (of our own species and others), depredation of resources, conquest and assimilation of assets, and a host of other day-to-day and century-to-century activities that skew far from the concepts of compassion, tolerance, goodness, pity, piety, charity and care that are embodied in the common adjectival use of the word “humanity.”

It’s almost like humanity (noun) hired some supernatural marketing agent to spin the emergence of the English word humanity (adjective) in the 14th Century just to make us look good and gloss over the horrors imminent whenever we rolled up into your forest or savanna or jungle or oasis. “Oh hey, look, it’s the Humans! Word is, they are kind and benevolent! Humanity, Huttah!” (Said the Lemur King, before we turned him into a shawl, and then killed all the other humans who didn’t have any Lemur Wear).

I kept knocking this conundrum around, not really getting anywhere with it, until it occurred to me that maybe I needed to parse the word “inhumanity” a bit differently. In its common modern form and usage, we think of it in these two parts: “in + humanity,” i.e. the absence of humanity (adjective), and we generally take the combined word form to be a bad thing, even though “humanity” (noun) is pretty awful, if we’re frank about our shortcomings. Perhaps a better way to consider the subject word, though, is to parse it thusly: “inhuman + ity,” i.e. simply the state of being not human. Plants exist in a state of inhumanity, when defined that way, and there’s no value judgment assigned to saying that. They are not human. Fact. Likewise all of the other gazillions of species of animals, protists, fungi, bacteria, viruses, and unseen and unknown entities (ghosts? angels? demons? gods?) that share our planet with us. Not human, and all existing in a state of inhuman-ity accordingly.

The collective set of inhuman-ity engages in many of the same practices that humanity (noun) does: its species and members eat each other, cause diseases, go places they shouldn’t, do harm, deplete things, despoil things, over-populate, over-hunt, over-grow, overthrow, fight, bite, copulate, propagate, locomote, respirate, expire. But their collective abilities to break things on a truly planetary basis, and their willful, ongoing, compulsive drives to engage in the mass exterminations of creatures of their own kind in pursuit of meaningless shiny things or mythical mental constructions or physical homogeneity all pale in comparison to the wide-spread horrors which Homo sapiens is capable of, and seemingly revels in.

Should some miracle suddenly and completely remove the biomass of human beings from our planet today, the other living and unliving things upon it would likely, in time, return to some slowly evolving state of punctuated equilibrium that would allow life to continue in perpetuity here, occasionally blasted and reorganized by comets or asteroids or other stellar detritus, until the time, billions of years from now, when our star explodes and incinerates the little rock we all call our home for the final time.

But I believe it would challenge the most optimistic human futurists to consider the trend lines of what our own species has wreaked upon our planet since we emerged from East Africa, and imagine a (mostly) peaceful progression where humanity (noun) practices care, compassion, kindness, and goodness in ways that promote planetary well-being, human dignity, respect for all species, equality, justice, brotherhood, and the peaceful distribution of assets to all who need them for billions and billions of years.

The accepted scientific consensus, in fact, is quite the opposite of that: the actions that a relatively small number of human beings in positions of political and economic power are taking, right now, largely for the short-term material gain of their own cliques and cabals, are forging an irreversible glide path toward long-term global human suffering that will be orders of magnitude greater than any experienced in the history of our species, and that will more than offset the benefits of centuries of scientific gains, e.g. disease mitigation, crop yield and nutritional density, etc. It’s bad, and it’s going to get worse, fast, and it’s going to take down a huge percentage of the collective set of inhuman-ity on its way. Our current government, to cite but one illustrative example, doesn’t even want its scientists to publicize climate forecasts beyond the year 2040, because the list of probable futures is so dire beyond that point. But they’re coming, whether they write about them or not.

Where does any classical sense of humanity (adjective) as a state of goodness and grace fit within that reality, and how does one reasonably envision the human species a thousand years or more years from now as anything but, at likely best, a small rump tribe of survivors holding on meanly after the vast majority of our species has perished, horribly?

That may be the inevitable progress of our unique species, and it is inherently human accordingly, but it’s certainly not inherently humane. So the linkage between those two uses of the word “humanity” grows more and more difficult for me the longer I think about them, because it really seems to me that, on many planes, humanity is at its most humane when it is being its most in-human, and humanity is at its most inhumane when it is being its most human. Oh, the humanity! Look how inhumane it is!

I suspect alien interplanetary observers would come to the same conclusion, and then might want to hire that supernatural 14th Century marketing firm themselves before they head onto their next assignations: “Oh, hey, look, it’s the Aliens! Word is, they are decent and good and fair, despite their different colored skin and hair! Aliens, huttah!” I mean, we English speakers are just really full of ourselves and bursting with linguistic bullshit when we use the very word with which we name ourselves as a synonym for all the goodness in the world, right? It boggles this already boggled mind.

Let’s pause and take a deep breath at this point. I certainly appreciate that this is a high-minded rant that I’m embarking on here, and I certainly do not wish to imply that I am any better (or worse) (or different) than the rest of humanity, by any stretch of the imagination. While I may not be one of the 100 Human Beings Most Responsible for Destroying Our Planet in the Name of Profit (if you click no other link here, click that one, please), there’s still plastic in my trash can, and dead animal parts in my refrigerator, and hatreds in my heart, so I would not set myself up as any sort of paragon of the ways in which human beings can, actually, be and do good. I’m doing my part to destroy the planet too, whether I want to or not, because I am human, and that is what we do, collectively.

But even in the face of our unrelenting, unstoppable destructive natures as a collective, there are individuals around us who do good, and act benevolent, and show kindness, and practice care, representing that classical 14th Century marketing sense of the word “humanity” in their everyday lives and activities. There might even be some small groups of people who can do that on a consistent basis over time, but I think that it is an inherent flaw in our species that when you put too many of us together, by choice or by circumstance, we become inhumane to all those beyond the scopes and spheres where our individual perception allows us to see individual goodness shining more brightly than the collective awfulness to which we inevitably succumb. Jesus’ teachings were sublime and profound, to cite but one of many examples. But most churches that attempt to teach and (worse) enforce them today are horrible and cruel, with the largest ones usually being the most egregious, inhumane offenders.

How many humans do you have to put together into pockets of humanity before our innate lack of humaneness emerges? I would suspect the number aligns with the size of the average hunter-gatherer communities before we learned to write and record stories about ourselves, and then justify why one community’s stories were better or more correct than its neighbors’ were, and I suspect that in our twilight years as a species, after we’ve mostly destroyed the planet, we’ll rediscover that number again as we cluster in small groups, optimistically in our ancestral savannas, but more likely in the wreckage of our great cities, after they have spread to cover the inhabitable surface of the Earth, and implode upon themselves.

I found close resonances in my emergent thinking on this topic in the writings of Sixteenth Century French philosopher Michel de Montaigne, most especially in his essay “On Cruelty,” where he wrote:

Those natures that are sanguinary towards beasts discover a natural proneness to cruelty. After they had accustomed themselves at Rome to spectacles of the slaughter of animals, they proceeded to those of the slaughter of men, of gladiators. Nature has herself, I fear, imprinted in man a kind of instinct to inhumanity.

Along similar lines, St Anselm of Canterbury has served as a sort of a quiet mascot for me in this series, since I indirectly captured the word “credidero” from his writings, and I’ve been researching some of his major works in parallel with this project. In De Casu Diaboli (“On The Devil’s Fall,” circa 1085), Anselm argued that there are two forms of good — “justice” and “benefit” — and two forms of evil — “injustice” and “harm.” All rational beings (note that Anselm is including the inhuman angelic caste in this discourse) seek benefit and shun harm on their own account, but independent choice permits them to abandon justice. Some angels chose their own happiness in preference to justice and were punished by God for their injustice with less happiness. We know them now as devils. The angels who upheld justice before their own happiness, on the other hand, were rewarded by God with such happiness that they are now incapable of sin, there being no happiness left for them to seek in opposition to the bounds of justice. Poor humanity (noun), meanwhile, retained the theoretical capacity to choose justice over benefit, but, because of our collective fall from grace, we are incapable of doing so in practice except by God’s divine grace, via the satisfaction theory of atonement.

At bottom line, then, Anselm ultimately found humanity collectively damaged, closer in temperament to devils than angels, and salvageable only by the intervention of the humane, though in-human, God of Abraham, and his Son, who became human, so that other humans could kill him. As humans do.

These two quotes eventually carried me back to the very first thing I typed on this ever-growing page over a month ago, and likely the very first thought that you as a reader had, when presented with the word “inhumanity:” the oft-stated concept of “man’s inhumanity to man,” which has become something of a cliche through over-use. Do you know where the phrase comes from? I didn’t, though I guessed it was likely Shakespeare, since so many eloquent turns of phrase of that ubiquity in our language come from his works.

My guess was wrong, though: it was first documented in English in Man Was Made to Mourn: A Dirge (1784) by Robert Burns:

Many and sharp the num’rous ills
Inwoven with our frame!
More pointed still we make ourselves
Regret, remorse, and shame!
And man, whose heav’n-erected face
The smiles of love adorn, –
Man’s inhumanity to man
Makes countless thousands mourn!

Burns, too, accepts at face value the inherent awfulness “inwoven with our frame,” and notes that through our active choices and actions, we’re more than capable of becoming ever more awful yet.

Once again, we return to the linguistic twist: humanity is at its most humane when it is being its most in-human, and humanity is at its most inhumane when it is being its most human. Can we extrapolate a statement of action from that conundrum thusly: the only way we will save humanity is to reject humanity, and the only way we will be humane is by being inhuman?

It seems and sounds absurd to frame an argument in those terms on the surface, and yet . . . human beings readily embrace the inhuman when they pray to God and ask his Son to come live in their hearts, guaranteeing their eternal salvation, and human beings embrace the inhuman when they accept a Gaia concept of the planet as a single living entity upon which we are analogous to a particularly noxious strain of mold on an orange, and human beings embrace the inhuman when we look to the cosmos around us in the hopes that we are not alone, and that whoever or whatever is out there might yet save us.

I could rattle off dozens of other examples of the ways in which humans embrace the inhuman in the hopes of becoming more humane, and all of them carry more than a whiff of deus ex machina about them, as they all involve elements from outside a system being injected into a system to sustain a system. But you know what? That feels right to me. That feels like the one viable path out of an endless do-loop of inhumane humanity, and I suspect that’s why all cultures, throughout our history, have created stories and religions and narratives that seek to guide humanity’s future through the examples of non-human actors, be they other living things on our planet, or mystical beings beyond it.

I doubt that any one of them is any better or any worse than any other, so long as they focus individuals and small groups (remember, we get horrible en masse, always) on goodness at a scale perceivable to the perceiver, and receivable by a receiver. Maybe this explains why I feel compelled to speak out loud to animals when I meet them on my perambulations, as just a small personal act of embracing the inhuman around me, perhaps creating feelings and moods and resonances that might then make me a better human being to the other human beings with whom I interact. Maybe Marcia and Katelin embrace the inhuman in similar ways through their yoga practice. Maybe you embrace the inhuman in a church, or a temple, or a mosque, or in a forest meadow, or atop a mountain, or beneath the eyepiece of a massive telescope.

Maybe we all become better, more humane humans, the more we embrace the inhuman-ity around us. It’s a squishy proposition, sure, but my gut tells me it’s the right one . . .

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this fifth article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Seven: “Creativity.”

“Nice flowers, Burns. Gimme ’em, or else I’ll sock you one . . . “

All Articles In This Series:

Credidero: A Writing Project

Credidero #1: Hostility

Credidero #2: Curiosity

Credidero #3: Security

Credidero #4: Absurdity

Credidero #5: Inhumanity

Credidero #6: Creativity

Credidero #7: Community

Credidero #8: Complexity

Credidero #9: Eternity

Credidero #10: Authority

Credidero #11: Mortality

Credidero #12: Possibility

Credidero: An Epilogue

Credidero #4: Absurdity

My father was born and raised in Albemarle, a North Carolina Piedmont mill and rail town near the Uwharrie Mountains. He left there after college to embark on a long and successful Marine Corps career, living and traveling around the world, but his parents stayed on in the same house on Melchor Drive until they died, Papas before Grannies, both passing when I was in my twenties.

While I never lived in Albemarle, I had two decades’ worth of grandparent visits there, with many fond memories still held dear of those mostly gentle days. Until I developed teenage cynicism and ennui, one of my favorite things about going to Albemarle was hunkering down in a comfy chair to read my grandmother’s copy of The Golden Treasury of Poetry, edited by Louis Untermeyer. I have that battered copy of the book to this day, as my aunt gave it to me after my grandmother died, knowing that no one else had ever read or loved it as much as I did.

(Amusing [to me] side note: The book was given to my grandmother by her friend, whom everyone called “Miz Doby,” in June 1966. I opened it today and looked at the frontispiece inscription and smiled to realize that I still do not know what Miz Doby’s first name was, since she just signed it “E. Doby.” They were both elementary school teachers, so presumably the book was originally intended for my grandmother’s students, before I laid claim to it).

As is often the case with big hard-covers that are regularly handled by children, the spine of the book is cracked, there are stains throughout it, and it’s clear to see where the most-loved, most-read pages were, as they’ve been bent back, breaking the glue that held the pages to the spine. If I just set the Untermeyer book on its spine and let it fall open as it will, it drops to pages 208 and 209, containing Lewis Carroll’s “Jabberwocky” and “Humpty Dumpty’s Recitation.” If I flip to other broken-open pages, I see these poems:

  • “The Owl and the Pussy-Cat” and “Calico Pie” by Edward Lear
  • “The Tale of Custard the Dragon” by Ogden Nash
  • “Old Mother Hubbard” by Sarah Catherine Martin
  • “The Butterfly’s Ball” by William Roscoe
  • “How To Know The Wild Animals” by Carolyn Wells
  • “Poor Old Lady, She Swallowed a Fly” by Unknown

Some of these poets and some of the poems are better known than the others, but they all do share one prominent recurring similarity: they are all nonsense verses, rhythmically engaging to the ear, deeply earnest in laying out terrific tales without any meaningful anchors in the real world whatsoever. They and others like them could readily be described as “absurdities,” which my desktop dictionary defines as “things that are extremely unreasonable, so as to be foolish or not taken seriously.”

I can still recite “Jabberwocky” by heart half a century on, and my early love of the absurd has pervasively infused both the inputs into my intellectual development, and the outputs of my own creative work, throughout my entire life, and likely through however many years I have remaining before me. Indulge me three examples on the output side, please: these are short poems that I wrote when I was in my 30s or 40s, clearly related to, and likely inspired by, the doggerel, wordplay, and rhythmic whimsy of those gentler children’s poems in the Untermeyer collection:

“Tales of Brave Ulysses S. Vanderbilt, Jr.”

I don’t know how to make this damn thing go
James Monroe won it in the hammer throw
Won it very long ago
Won it in the hammer throw

Time goes by while we’re learning how to fly
William Bligh dreamed of sour rhubarb pie
Dreamed it with his inner eye
Dreamed of sour rhubarb pie

On the sea, Bligh and Monroe sail with me
One degree south of Nashville, Tennessee
South of Rome and Galilee
South of Nashville, Tennessee

Home at last, feeling like an age has past
Thomas Nast drew us through his looking glass
Drew us as we crossed the pass
Drew us through his looking glass

I don’t know how to make this damn thing go
Even so, sell it quick to Holy Joe
Sell it painted red Bordeaux
Sell it quick to Holy Joe

Sell it with a piping crow
Sell it for a load of dough
Sell it at the minstrel show
Sell it, man, and then let’s go

“Field Agents”

“Let him out, he’s coming now, he’s alone,”
(I can not tolerate the taste of this megaphone).
Deep in the coop, the fox, he sees that some hens have flown,
his cover’s blown, (tympanic bone, Rosetta stone).

And then the hawk drops down from his perch on high,
(spearing the fox through, he lets out a little cry),
Justice is quick here, we stand and we watch him die,
I dunno why (fluorescent dye, blueberry pie).

We pull the poor poultry out from the killing floor
(some of the pups get sick there in the feath’ry gore),
out on the lawn, we stack them up and note the score:
it’s twenty-four (esprit de corps, espectador).

Back in the barn, now, safe in our little stalls
(I watch those damn bugs climbing around the walls),
We sleep and eat hay, waiting ’til duty calls,
as the time crawls (Niagara Falls, no one recalls).

“Natural History”

The ammonites farmed with diazinon
to kill eurypterids beneath the soil.
Which perished there in darkness ‘neath the lawn,
but rose in eighty million years as oil,
which dinosaurs refined for natural gas
to cook their giant land sloths on steel spits.
As sloths were butchered, forests made of grass
rose from the plains to hide the black tar pits,
where trilobites would swim to lay their eggs.
Their larvae flew and bit the mastodons,
while tiny primates scampered round their legs,
feeding on the fresh diazinon.
At night, the primates fidget as they dream
of interstellar rockets powered by steam.

What do these, or the many other poems like them that I have written over the years, mean? Damned if I know. But damned if I also don’t think that they provide better insights into my own psyche and mental processes than the more lucid prose I write professionally and for pleasure. My brain’s a messy thing, and there’s a lot of stuff going on inside it that doesn’t make a bit of sense, but which nevertheless consumes a fair amount of internal attention and firepower. These absurd little nuggets spill out of my brain easily and frequently, and I enjoy extracting and preserving them. They seem to reflect a particular lens through which I often view the world: it’s astigmatic, has fingerprints on it, is lightly coated with something greasy and opaque that can be rubbed around but not removed, and there are spider cracks latticed throughout its wobbly concave surfaces.

So many of my tastes in the various arts align closely and clearly with this warped view of the world, as though my internal center of absurdity vibrates in recognition and appreciation when presented with similarly incongruous external stimuli. Examples: I have been drawn to surrealist paintings since early childhood, I regularly read books in which language and mood are far more important than linear plot or narrative, and I once did a little feature on the films that move me most, titled: My Favorite Movies That Don’t Make Any Sense At All.

I must admit that since rolling the online dice three weeks ago to decide which of my Credidero topics I would cover this month, I have had to repeatedly tamp down the very strong urge, prompted by the word “absurdity,” to merrily write 3,000+ words of absolutely meaningless gibberish wordplay and call it “done,” rather than actually considering what “absurdity” really means, and processing what I really think and believe about it. And that initial, innate reaction to just be absurd, as I do, has made this a more challenging topic for me to write about than ones that have come before it. Whenever I thought about how to frame the narrative, I always found myself in some sort of “eyeball looking at itself” scenario, an impossible infinite do-loop of self-reflection where I know the mirror and the object reflected within it are both irregularly warped and pointed in different directions, and I don’t (and can’t) quite know what the true image is.

I must also admit that this isn’t the first time I’ve reflected on such matters, even without the formal structure of a public writing project. I have long found that the easiest way to break out of a wobbly self-reflective do-loop has been to create and export a new loop, so I can look at it from the outside, not the inside. When I read the poems reproduced above today (and there are a lot like them in my collection), they strike me as relics of just that type of act or urge: I wrote them as absurdities, I see them as absurdities now, I embrace those absurdities, I know that I created those absurdities, I know that the act of creating them was absurd, and that any attempt to explain them would be equally absurd.

But at least those bits of absurdity now reside outside of me, self-contained and complete, where I can see them more clearly, rather than having them whirring on blurry spindles within me, occasionally shooting off sparks that ignite other bits of weird kindling lodged along the exposed and frayed wiring of a gazillion neurons packed inside my skull. They mean nothing to me objectively, but they mean everything to me subjectively, because they’re so closely aligned with the ways that I think, and what I think about, and how I view the world around me — or at least how I view some world around me, even if it’s not the one I actually live in.

Pretty absurd, huh?

When I do try to order my thoughts on this topic in ways that can be meaningfully communicated to others, I’m struck by the fact that many of the poems in Untermeyer’s great poetry collection for young people are just as absurd as mine are, and just as absurd as the playground chants that kids around the world somehow seem to learn by osmosis, or the songs we sing to little ones, or the goofy talking animal imagery of countless children’s films and television shows. Utterly absurd! All of it, and all of them! But they love it, don’t they, and we seem to love giving it to them, don’t we? When we describe the whimsy of those ridiculous art forms as “absurd,” we imbue the word with fun, and frolic, and laughter and light. Look at the smiles! Look at them! Joy!

Then minutes later, we turn from our young ones, and we check our Twitter feeds or pick up news magazines or turn on our televisions and are confronted with words, actions, or events precipitated by political figures with whom we disagree, and we may scowlingly brand their actions or activities as “absurd” with vehemence, and bitterness, and anger, and darkness in our hearts. Absurdity is somehow colored in different hues when it manifests itself in real-world ways outside of the acts of the creative class, or outside of the bubble of childhood. And rightly so, as is most profoundly illustrated in our current political clime, where elected or appointed public figures routinely engage in acts or spew forth words that are (to again quote the dictionary) “extremely unreasonable, so as to be foolish or not taken seriously.” 

It is to our own peril, unfortunately, when we don’t take such manifestations of public, political absurdity seriously. Talking animals don’t kill people. Absurd public policies do. Nonce and portmanteau words don’t break people’s souls. Propaganda and hate speech do. Surrealistic imagery does not poison minds. Unrealistic demagoguery does. Absurd fantasy stories about non-scientific worlds do not destroy the real world. Absurd fantasy policies anchored in non-scientific worldviews do — and there is only one real world within which they function and do harm, no matter how fabulously untethered their sources may be.

People with severe mental illness may act publicly in absurd ways, and we sympathetically view that as a part of their pathology. But what are we to make of people without such pathologies who consciously, actively engage in absurd behaviors specifically designed to remove value and meaning from the lives of others? I’d move them from the absurd pile to the evil pile, frankly. And we’d all be better off were we to rid ourselves of their noxious influences, which is why the fact that 50%+ of our country-folk don’t bother to vote at all is, in itself, utterly absurd.

There’s a vast repository of philosophical thought and writing (from Camus and Kierkegaard, most prominently) dedicated to understanding absurdity and the ways in which it manifests itself in our lives, and how we are supposed to respond to or function in its grip. Not surprisingly, the philosophy of absurdism is built on the same “dark” theoretical frameworks as existentialism and nihilism, where there is a fundamental conflict between our desire to imbue our lives with value and meaning, and our inability to find such objective worth within an irrational universe that has no meaning, but just is. Once again, the nonsense that is charming when fictionalized for children is often appalling when framed as the architecture within which adult humans function. Why try, when in the end we all die, and we will never know why?

It’s easy for me to embrace and understand my own sense of inner absurdity as an adjunct to the whimsical absurdity of youth, but not so easy to reconcile my inner landscape with the often awful external vistas associated with public, political, and philosophical absurdity. Can I love one and hate the other, or is that in itself an absurd mental position? Is there meaning to be found between those poles, or is life just a pointless, endless Sisyphean push up a hill until the rock crushes us for the last time?

I took a stab at framing my thoughts on why we are what we are some years back, and, of course, I framed it as an absurdist piece called “Seawater Sack Guy Speaks.” If pressed about the article and what it says or means, or why I wrote it, I’ll usually frame it as something more akin to the absurd whimsy of youth, ha ha ha, but if I’m honest here, it’s really a bit more than that, and there’s more objective truth about what I believe, or what I will have believed (credidero) within it than there are in most of my absurd writings. It begins thusly . . .

There’s an explanation for why we exist in the form we do, and I know what it is.

We are all about moving little pieces of the ocean from one place to the other. That’s all we are: sacks of seawater that can convert solar energy into locomotive force, so that we can move our little pieces of the ocean around. Unlike most seawater sacks, though, we are conscious of our selves, and this consciousness leads us to question our primary universal role as movers of hydrogen, oxygen, salts and minerals.

Consciousness is an electrochemical process that our particular strain of seawater sacks have evolved. No better or worse or different than a tail, a gall bladder, or an appendix. Because we don’t understand how this electrochemical process works, we use the very same electrochemical process to create mystical, non-biological explanations for its workings.

And it ends with this . . .

I’m not going to be carrying any metaphysical seawater around any metaphysical heaven or hell when my sack breaks down and releases all its atoms, so I figure I should use every bit of the consciousness I’ve evolved, here and now, to enjoy my fleeting, warm, moist moment in the Sun. This is not to say that I’ve a problem with other sacks of seawater whose enjoyment of their own fleeting, warm, moist moments in the Sun involves the belief in something different. If such chemical processes provide them joy or comfort (or at least the chemical processes that cause their seawater to produce such sensations), then such is their right, and who am I to force my chemistry upon them?

I take joy and comfort from just being conscious, and consider that scientifically miraculous enough.

Is that absurd? Yes. Is it a “good” or a “bad” manifestation of absurdity? I think the former, but I know some would say that if I shared it with a child, I’d inflict harm, and some would say that walking around as an adult thinking such thoughts could readily slot me into the pathological spectrum of absurd beliefs and behaviors. And they may be right. I am absurd, I admit it, inside and out — but I am not a philosophical absurdist. I do believe we can glean meaning and value in an unfeeling, unthinking, and unknowing universe. And I do not believe that a fundamental conflict between the quest for meaning and the universe’s indifference to it drives my own inner absurdity.

When I start thinking about these Credidero articles each month, one of the first things I do is to look at the etymology of the word to be considered. “Absurdity” surprised me in its roots: it is a Late Middle English word derived from the Latin absurdum, meaning “out of tune.” That elicited a “huh!” moment from me, as I am also actively, eagerly drawn to “out of tune” music: the first time I ever read about Arnold Schoenberg’s dissonant 12-tone music, I had to hear it; the first time I ever read about the tritone (“The Devil’s Interval”), I had to find a piano so I could play it; my listening library of thousands of songs contains a high percentage of music in which standard, pleasing Western melodic structures are in short supply. I didn’t realize it, but apparently my musical tastes are absurd too. At least I am consistent.

When I considered the concept of internal and external absurdity as a form of musical expression, I was immediately reminded of a wonderful, favorite song by Karl Bartos (ex-Kraftwerk), called “The Tuning of the World.” In it, Bartos writes about wishing that he could believe in God after seeing a haunting Laurie Anderson concert, noting:

I connect to the sound inside my mind
Closer I can’t get to the divine
I wish I could believe in God
Life would be just safe and sound
I’d build my house on solid ground
It’s rather hard to understand
Why some believe and others can’t
Who rules the tuning of the world?

I don’t know the answer to Karl’s final question there, for Karl, but to whoever rules the tuning of my own world, I am thankful that you left things in a state of wonky intonation with a lot of busted keys and clammed notes and buzzing frets, since I honestly like it better that way, absurdly enough.

Note: This article is part of an on-going twelve-part writing project. I’m using a random dice roller to select a monthly topic from a series of twelve pre-selected themes. With this fourth article complete, I roll the dice again . . .

. . . and next month I will consider Topic Number Twelve: “Inhumanity.”

Caution: This book may detune your world.

Credidero #3: Security

We’re in the midst of a household move right now (from ORD to DSM), which means I’m peeking into those types of deep storage boxes that haven’t been opened since the last time we moved, pondering whether to purge them or carry their contents onward.

In one box, I found an old plastic bag containing four truly ratty, soiled and tattered stuffed animals that my mother must have sent to me at some point when she herself was moving: my childhood “friends” Sister, Rabbit, Bear and Clown. Sister was a hairless kitten (now with only one eye, and originally furry), and you can probably guess what Rabbit, Bear and Clown were. (I guess my creativity with names came later in my childhood development than they did). The fact that I still have those stuffed animals (compounded with the fact that I put them back in the box, carefully) is a powerful, lasting testament to the simple, yet profound, role they played as childhood comfort objects, providing me with a sense of security at a time in my development when I had absolutely no real idea as to all of the things there were in life that could cause me harm.

English psychologist Donald Woods Winnicott explored and wrote about the ways in which most children develop security bonds with what he labeled “transitional objects,” which help ease a child as it loses the perceived “subjective omnipotence” of a mother-to-child bond and develops a relationship with an objective reality where the mother, and the child, and objects in the world around them are not a unity. Winnicott further theorized that transitional objects enable children to experience a fantasized bond with their mothers when the latter are apart from them for increasingly long periods of time, and that the Binkies, the Teddies, and all of the other much loved surrogates serve as keys to alleviating anxiety at the very time when children first begin to encounter the complexity and scariness of the real world around them.

Oh, to imagine if security was that simple for us all today as adults! By definition, security is “freedom from, or resilience against, potential harm (or other unwanted coercive change) caused by others,” and the various realms of security that we all contend with or read about regularly — communications security, data security, airport security, food security, home security, national security, homeland security, environmental security, transportation security, to name but a few — make it screamingly clear as to just how many things, people, concepts, and forces out there are either willfully committed to or passively engaged in trying to cause us harm, collectively and individually. We take so many steps, at such great cost, to create warnings, to protect ourselves, and to deter others, where once a good snuggle sufficed to get the job done — at least in our heads, anyway.

But then, on some level, security really is all about what goes on in our heads, given that humans’ abilities to accurately discern, react and respond to risks are notably, provably wonky. We fear sharks, lightning strikes, and plane crashes more than we fear bathtubs, cars, and the things in our medicine cabinets, though more of us are killed by the latter list each year than by the former. Given this fact, there’s an argument to be made that the vast majority of the security steps that we take aren’t actually much different than our childhood transitional objects: we chain and padlock doors at night and feel better doing so, when a rock through a window is still a perfectly easy ingress approach for anyone seriously committed to harming us or our property. We go through all sorts of security rituals throughout the course of the day, and they comfort us, but does anybody really, truly believe that taking our shoes off at the airport makes our flight experiences any safer? Or is that ritual just a big imaginary virtual teddy bear designed primarily to soothe transportation patrons and providers alike?

That element of “first, assuage concern” is deeply embedded in the very etymological history of the word “security,” which entered the English language in the 16th Century via the Latin “securus,” combining se (“without,” “free from”) and cura (“care,” “anxiety”). That’s kind of daunting to consider, especially for a person (like me) who wrestles regularly with anxiety as a constant part of my basic biochemical and psychological composition. If security really means nothing more than “freedom from anxiety,” then ipso facto, I’m almost never secure, or at least not when I’m awake! (And as bad of a sleeper as I am, probably not when I am asleep either).

As I ponder that conundrum, I have to note that the very act of being in the middle of a household move provides strong fuel for feeling less than fully secure: most of our belongings — all the grown-up comfort items with which we surround ourselves — were picked up and taken away on a truck two days ago, and I won’t see them again until next week, hopefully all together still, hopefully intact. Then there’s that transitional period of time of sorting things, placing things, hanging things, moving things, figuring out what goes where, and why, and when, that comes with any move, as we rebuild nests, often hoping to create something that’s at least structurally similar to the nests we’ve left behind. Where will I sit to work at the computer? Where will I eat? Where will we watch TV together? Which cabinet did I put the Ziplock Bags in? (Note: I always feel better knowing where the Ziplocks are . . . they are up there with Duct Tape, WD-40 and Windex when it comes to knowing you’ve got the right tools for whatever jobs need to be done, right now).

I have moved enough over the years (27 times, I think) to know that at some point a few weeks or months in, some little switch in the brain pops from one position to the other, and the new nest acquires that crucial sense of place where I feel that it’s right, and it’s comfortable, and it’s home — with all of the ancillary feelings of security that come along with that distinction to follow. There’s still plenty of things to worry and be anxious about, of course, but at least I’ll know where the sofa and the blankets are so I can bundle up and ponder them comfortably without concern for the very physical infrastructure associated with my housing and possessions. And, of course, Marcia and I will be both there in the new nest most of the time (that’s why we’re moving, after three years of frequent separations), and there’s truly no stronger anchor for security than close, regular proximity to those who love and care for us the most. Honestly, at this stage in my life, my favorite part of most days is getting in bed together and holding hands and talking about whatever and saying “I love you” before we go to sleep. That ritual feels wholly secure no matter where it happens (we travel a lot, so we sleep in a lot of different beds), and that’s the deepest core of my sense of safety and comfort and stability as an adult, regardless of what the next day brings.

Which, of course, it always does. While the new home paradigm will be an improvement, I’ll be working remotely three out of four weeks, and that’s a new situation that will take some time to adapt to, and to develop or learn new security rituals. My physical office has its own sense of place for me, too, as does being with my staff in person, and not just via phone or video conference. The organization itself is and will remain secure in the ways that such things are judged, but my place within it is changing, which is cause for some anxiety, which leads to some feelings of insecurity about how things are going to work for me, and around me. I’m not sure, exactly, what sort of virtual stuffed animal will be required in this case, but I know it’s out there, in some form or another. I’ll know it when I hug it, hopefully.

Then the circle spins outward from home and work, in some cases toward the comforting, in some cases toward the scary. We’re financially secure as a family, thankfully, and we have good health care coverage, and are generally healthy for our ages, so those things don’t trouble or worry me too much, and I know what I need to do if they do ever move to the front burner of security concerns. Having spent my life with the name “John Smith” and all of the confusion that can cause (e.g. after September 11th, I was routinely escorted away from my family by armed airport personnel for “secondary screening,” since apparently terrorists are also not very creative when it comes to fabricating fake identities), I’ve always been close to paranoid when it comes to computer and information and personal ID security, so I actually probably feel better about that stuff than most people do, since I so assiduously work to protect myself in that regard, having already learned those lessons many years ago. My rituals may be nothing more than rituals, but they push away the “cura” and that’s all I ask for or expect, most of the time.

Having a possibly senile sociopath at the head of our Federal government certainly doesn’t provide me with any good sense of comfort when it comes to national security, and I’ve chosen to largely withdraw from the constant bombardment of reminders of that fact that’s become part and parcel of the modern social media experience. I don’t wish to spend my time being yelled at, even when I agree with people, and that’s the lion’s share of virtual discourse in the public sector at this point, so I reject that, depending instead on a small, carefully curated list of trusted sources who can amicably share discomforting facts with me in a measured fashion that helps to sort things that are legitimate threats to our collective well-being from those that are just hateful noise. The Economist and Electoral Vote are good security blankets from that standpoint: proven, dependable, honest, and familiar. Always happy to curl up with them.

I’m just about finished with a book that discusses at graphic length what’s likely to be the greatest existential threat to me, mine, and ours in the decades (hopefully) that remain in my life: The Uninhabitable Earth: Life After Warming by David Wallace-Wells. I heartily commend it to you, and hope that it might be widely read, and eventually be as widely influential as Rachel Carson’s Silent Spring, which clearly laid out in terse prose what was obvious in front of us at the time, but which we did not wish to address — until we did. Virtually every other form and facet of physical, philosophical, emotional, and structural security surrounding us today has the potential to be irrevocably altered and destabilized in the years ahead by the myriad challenges that a rapidly changing planet is going to place before us, as individuals, as nations, as a species, and as one of many potentially fragile life-forms clinging desperately to the only ball of dirt, rock and water we have and know.

There’s no security blanket, ritual, or teddy bear large enough to hug away the highly tangible security threats that could come from environmental change, and yet their very enormity means that the vast majority of us don’t feel any real, palpable anxiety about them, because they are almost beyond our capabilities to comprehend in any meaningful fashion — never mind having the ability to negate or control them. Ironically, if there’s a terminal fallacy embodied in the etymological definition of security as “freedom from anxiety,” that’s probably it: the correlation between that which should cause us anxiety and that which does cause us anxiety is nowhere near as strong as it should be if we, as individuals and collectively, are to actually create meaningful security barriers from that which can credibly do us harm. We’re not anxious enough when we should be, and we’re too anxious when we don’t need to be, and so our comforting rituals and objects are ultimately just props to support our subjective views of an objective world with no shortage of killer threats swirling around us, literally and figuratively.

Maybe that’s what makes us weirdly, beautifully, stupidly human though, as we create art, and fall in love, and build homes, and work jobs, and write poetry, and look at stars, and continue to find meaning, comfort and joy in the face of the unrelenting entropic forces constantly working to grind us up into our constituent chemical elements. Oddly enough, despite my innate anxious disposition, I actually do take deep comfort from the idea that no matter what barriers and borders I build around myself, ultimately I’m a small part of a big thing beyond my comprehending, and the best I can do within it is to chase those moments of beauty, and to find those fear-free spaces, however fleeting they might be, and to love and appreciate what I have, when I have it, with others who love and appreciate me. I don’t, and can’t, always practice what I preach in that regard, but I do try, and it feels good to do so, as perhaps the simplest expression of selfish hedonism available to me.

On one hand, I know that the more I focus on those little things, the less I’m doing to respond to those big things, and that’s perhaps a bad trade-off if I take a long-term, macro, evolutionary view of things. (Though on that front, I’ve already spawned and am medically no longer capable of doing so, so from an evolutionary standpoint, I’m already surplus to the Great God DNA’s purposes at this point anyway). But on the other hand, I know that freedom from anxiety feels like a worthy pursuit, and if more of us experienced such freedom, more often, we’d likely be kinder and gentler and more apt to cooperate and collaborate on the structural issues that shape human experience today, including the big scary beast of global climate change and all of its attendant horrors.

“Think Globally, Act Locally” the bumper stickers exhort us, and maybe that’s a good rubric, even though it only works if everyone follows it, and we know that the vast majority of the rapidly developing world’s citizens, flush with the first fruits of middle class consumer experience, are not going to collectively deny themselves the pleasures that we have already experienced, just because they came to them later. On a macro basis, global security in all of its myriad facets is going to get far worse, for a long, long time, in ways we can’t even conceive of today, before it even begins to get better — if it ever can do so, without us first being wiped from the lithosphere like mold from a grapefruit. No matter what the bumper stickers say, there’s nothing I, myself, can do to change that. Nor can you. Nor can even a Democratic U.S. Federal administration fully committed to the most ambitious Green New Deal imaginable, because China, India, Brazil, Russia and countless other nations will not be practicing parties to it, no matter what their leaders’ signatures say on various international accords. It’s an all-or-nothing game ultimately, and the vast majority of players will perish on its board before we actually figure out the rules.

Which isn’t to say that we shouldn’t try to play that game. We should. We must. If for no other reason than to give ourselves the big security blanket that makes us collectively feel that we are in control of uncontrollable forces. It’s collective madness for us not to, and when we become mad collectively, we foment madness individually, with anomie and ennui and atrophy and atomization dissolving the bonds that tie us and shredding the structures that secure us, tenuously, in the nests of our own making. Recycling our plastic bottles and riding our bikes may not make any more real difference to anything than taking our shoes off as we pass through airport security, but the rituals are important in their own rights, and the security, however ill-founded, they provide to us as individuals is deeply meaningful to our experience as feeling, knowing human animals. Maybe, just maybe, if our brains are less filled with the little security anxieties, we might adapt our perceptions of the objective world a bit, so that we may begin to more accurately gauge and respond to those big security threats.

Ultimately, in our time, that “se cura” model of a life without anxiety has to be a myth, an idealized form of heaven on earth, where soon we will be done with the troubles of the world, even as we still live in that world. My brain may be therapeutically broken in the ways that it processes anxiety, but I don’t believe that even the healthiest brains can truly build such elaborate security measures around them as to completely preclude anxiety either, except perhaps when they are in a state of complete obliteration from chemical or other depressants. Anxiety might even be a form of psychological friction, endemic to the very act of objects/concepts interacting with other objects/concepts and creating heat and energy, without which work cannot be done, physically speaking. Better to harness that heat and deploy it in positive pursuits, rather than denying its very existence, or denigrating those who experience and express it.

Our security rituals and transitional objects might be more meaningful and impactful if they were rooted less in a “se cura” model and more in a “cum minima cura” — with a little anxiety, so we remain mindful but not paralyzed, attuned but not hyper-aware, engaged but not overcome. “Cuminamacurity” isn’t as elegant a word as “Security” in English, but it might be a more meaningful one, and a more realistic one for our collective psyches, as we prepare as a species to face challenges and risks that might be collectively greater than any yet put before us.

The little moments remain precious, the little touches remain important, the little objects remain iconic, the little steps remain productive, and on a personal basis, I will pursue and appreciate them as I always have, and they will anchor me, daily, in their comfortable familiarity and emotional warmth. That said, they should not, must not, render me numb to the realities of the world around me, and the real — not imaginary — threats to me and mine, and you and yours, that await there. We must feel at least “cum minima cura” about those realities, to create the friction and heat needed to prepare us to do more than hug fantasias when we’re required to do so by events beyond our individual control. Perhaps that collective sense of edge and unease will serve as the fulcrum upon which change is finally levered, and perhaps that’s the greatest little step that any of us can truly take toward building a more secure world for the maximum number of its residents, human or otherwise.

As good as it feels to hug our transitional objects, and as often I’m going to continue to do so, I think I’m also going to try to hug my own anxieties every now and again, if for no other reason than to look at them, understand them a bit better, and maybe decide that they might actually be trying to tell me something that I shouldn’t be hugging away at all.

Note: This is part three of a planned twelve-part writing project. I’m using an online random die roller to select a monthly topic from a series of twelve pre-selected themes. With this article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Nine: “Absurdity.”

There’s always a bigger cannonball coming, sooner or later . . .

Credidero #2: Curiosity

The late, great Douglas Adams doesn’t get the same level of credit that some other science fiction writers receive for describing future technologies that actually come to pass (probably because he was too funny to be taken seriously), but there’s no question that his fictional depiction of The Hitchhiker’s Guide to the Galaxy is now every bit as real and transformative as, say, Arthur C. Clarke’s prescient descriptions of communications satellites, or Jules Verne’s submarines, or H.G. Wells’ “land ironclads” (tanks), or John Brunner’s on-demand satellite TV, or Martin Caidin’s cybernetic prostheses, or countless other hard sci-fi speculative predictions.

First revealed to the world via a radio play in 1978, the fictional Hitchhiker’s Guide was described as “the standard repository for all knowledge and wisdom,” filled with crowd-sourced content because “most of the actual work got done by any passing stranger who happened to wander into the empty offices of an afternoon and saw something worth doing.” The Guide could be updated in real time via the Sub-Etha, an “interstellar faster-than-light telecommunications network” that was used for any type of data transmission across the galaxy. Physically, the Guide was described as “a small, thin, flexible lap computer” encased in a “sturdy plastic cover,” with the words “Don’t Panic” inscribed on it “in large, friendly letters”. (All quotes from Adams’ books, via Wikipedia).

I’m certainly not the first person to note that a modern human carrying a smart phone with real-time access to Wikipedia is essentially toting The Hitchhiker’s Guide around, whether it has large friendly letters printed on its case or not. And if that’s not enough to mark Adams as a singular visionary, note that he actually started a web-based, crowd-sourced, real-world version of the Guide called h2g2 in 1999, two years before Wikipedia was launched (which happened in the same year that Adams himself passed away, at the terribly young age of 49). Had he not shuffled off this mortal coil in such an untimely and unexpected fashion, we might today all be using Adams’ h2g2 for all of our search needs, instead of Jimmy Wales’ titanic digital encyclopedia. That said, you can still access (and contribute to) h2g2 if you’re so inclined, and it does provide a healthily irreverent counterpart to Wikipedia’s sometimes stuffy and over-curated content at this point.

It’s worth noting (to me anyway) that we are fast approaching another interesting singularity point between the fictional guide and its primary real-world analog. In So Long And Thanks For All The Fish, the fourth book in the trilogy (yeah), the Guide‘s tally of pages is cited as 5,973,509. As I type this article, the real number of pages on the English version of Wikipedia is posted as 5,817,575. I certainly hope that someone at the Wikimedia Foundation is monitoring this number, and properly celebrates Adams’ estimation of the number of pages that it takes to describe the galaxy and all of the things in it when somebody creates page number 5,973,509. I’m guessing that will happen in 2019. I’ll be keeping an eye on it.

For all of Adams’ prescience, I think there’s one way in which he missed the mark on the ways that sentient beings might deploy the Hitchhiker’s Guide. The book’s protagonists routinely use the Guide to acquire necessary, (mostly) useful information to get them out of, or into, various scrapes and predicaments, but it’s generally consulted in response to such external stimuli, rather than being consulted just for the sake of being consulted. Had Adams written the books today, now knowing what we know about how we know what we know, I suspect there would be lots of scenes where people (human and otherwise) just loll about in their various spacecraft and on their various planets, pointing and asking and clicking and reading and browsing for no other reason than because they can, and because they are innately, inherently, and often flat out insanely curious about all of the things in the universe, all of them.

That’s certainly how I interact with the world of information when I’m sitting at my static desk-top, clicking and clattering away. I can read something, or think of something, and not know some arcane piece of information about said something, and then suddenly find myself in an hours-long slide into data gathering and information processing that typically ends up far from where it began, leaving my head filled with a bunch of new noise, much of which will be forgotten hours after I first apprehend it. And then I’ll do it all again. And again. And again. And I will be happy all the while, even if I’ve not achieved anything meaningful in the process.

The mobility of my information gathering devices means that I do this in the “real world” too, as I encounter non-electronic stimulus: What’s that bird? How tall is that building? Where does this road go?  Who is that park named for? What kind of plane was that? Who wrote that song? What was its lyric again? Who played bass on it? What else did he or she do? Another bird? What was it? We live in a truly glorious age when it comes to assuaging our curiosity in this fashion, as the ability to itch the scratch or scratch the itch of not knowing things is effortless and immediate and (mostly) satisfying, even if much of the information that we pack into our noggins is the intellectual equivalent of a big bag of Cheetos: filling, colorful, possibly addictive, and of no practical, nutritional good whatsoever.

Which raises the question of whether an active sense of curiosity (much less an over-active one), and the time spent assuaging it, is a good thing or a bad thing. Because sometimes we’re curious about things that we really should not be. You know that after the fictional Hitchhiker’s Guide waxed so profoundly about (say) the perils of Vogon poetry, some sizable number of readers would have immediately sought out some of those noxious texts to read them, and suffer in the process, just as people visit various pages of horrors on the real-world internet, all the time. I’ve never heard of anybody really having a seizure from a website promising to deliver one, but I know that they exist, and I know that people look at them, just because they can. (Please don’t go find one now). (And do not think about elephants). (Are you thinking about elephants?) (You are, aren’t you). (That’s better than thinking about seizure robots, anyway).

I suspect that many damaging online pornography addictions are fueled by unhealthy curiosities: if a human body can do this, and I can find it online and look at it, then I wonder if a human body can do that, and if so, where can I see it?  The market for Faces of Death-type collections of carnage imagery predates the internet, but once upon a time they were hard to find, whereas now: search, click, look, regret. When people watch cell phone videos of people being gunned down in their cars, or on the streets, or in their homes, or of bombs being detonated in public spaces, or of the beheading or hanging of political captives, they may say they’re doing it as part of some refined sense of social justice, wanting to share and experience such pain with its victims in more meaningful ways, but I can’t help but think that morbid curiosity of that nature is just a digital form of rubber-necking at an auto accident, ultimately nothing more than the insatiable curiosity to see what something terrible looks like, coupled with an inability to resist it. And I’m pretty sure that’s not a good thing.

Unfortunately, it often seems that the bad outcomes of curiosity anchor a lot of the ways in which we educate and raise our young in modern western cultures. “Curiosity killed the cat” is an adage we learn fairly early on. Later, we might encounter books or television shows about Curious George, a charming simian simpleton whose insatiable curiosity gets him into all sorts of trouble, requiring the Man in the Yellow Hat or other sensible adults to bail him out, so he can curiously investigate the next shiny thing that catches his eye. The classics take similar stances: Pandora’s curiosity about her now-eponymous box unleashed sin, disease and death upon the world, and the Serpent in the Garden of Eden used Eve’s curiosity against her to bring on the Fall of Man.

The Bible even explicitly exhorts us to mind our own business and not ask big questions: “It is not for you to know times or seasons that the Father has fixed by his own authority” (Acts 1:7). That feels like the ultimate “because I said so” answer to every “why?” question that every child puts forth, communicating that some things are just not knowable, no matter how much we want to know them. And maybe that quiets a child, or the childish being cooped up within an adult, for some period of time, but it doesn’t assuage the desire for knowledge, it just makes it feel wrong. Which, in turn, itself seems wrong, since curiosity is by all objective measures a key component in the process of learning, and the acquisition of knowledge, if not wisdom.

Education is a key component of cultural inculcation, and it seems that it would be a whole lot easier to harness the innate curiosity of youth rather than censuring it. Perhaps this pervasive conundrum hinges on adults wanting children to learn certain things, in certain times, in certain ways, rather than openly figuring the world out as it presents itself to them, naturally. Education as a form of control, as it were. And if your curiosity persists in carrying you in directions other than those in which we wish to point you, we now have medications to take the edge off that itch, so that you can concentrate on this here algebraic formula, and not that there way cool bug crawling up the wall in the back of the classroom. You won’t be able to balance a checkbook by knowing its name, now will you? And it might sting you, anyway. Pay attention.

Our pets might actually have it better than our children on this front, since we’re generally content to let them sniff and snuff at whatever captures their fancies, so long as they don’t do it on the furniture, or strain too hard against the leash. While I find the entitled over-pampering of American pets to be mostly absurd, I do think that it’s a good thing that we’ve generally come to understand and accept that our non-human companions, and loads of non-domesticated non-human animals, can be just as curious as we are about the worlds in which they find themselves, investigating their surroundings with agency, and individuality, and intellect, and not just as mindless automatons driven by species-encoded patterns and instincts. The searches for food and water and mates and shelter are certainly compelling, but they’re not the end-all and be-all of animal experience, and it’s a joy to watch any being, of any species, happily exploring its world, and eagerly investigating stimuli beyond its normal experience.

It has taken billions and billions of years for hydrogen, carbon, oxygen and nitrogen to organize themselves in such a way that our species can actively, consciously think about that organization, and how it happened, and what it means, and how it fits in within everything else in the visible and invisible cosmos. Give them another billion years or so, and some of our cetacean, simian, corvid, canine, porcine and feline friends might join us in this pursuit; we’re not likely special in this regard, other than being first to cross the bar of conscious, tool-based scientific inquiry. (On our planet, anyway). Viewed this way, it seems that our innate desire to want to know all the answers, to all the things, might be something of a birthright for our species, and that squandering our little moment in the sun — brief as it’s been in celestial terms, and fleeting as it might be in a solar system filled with planet-killing objects and opportunities — would be a refutation of eons and eons of evolutionary progress, not necessarily with us an end point, but perhaps with us as a conduit to something unknown, but not unknowable.

So I might not be touching the divine when, on a whim, I get online to remind myself who played guitar on the second Toe Fat album from 1971 (Alan Kendall, for the record), but I am actively engaging the part of my brain that’s evolved to crave information and stimulus that has no bearing on my ability to breathe, or sleep, or breed, or eat. Knowing that scrap of information doesn’t make me a better human being by any meaningful measure, but finding it does give me a fleeting chemical pleasure, and that little “ah ha” may trigger other chemical cascades that do make me just a bit sharper than I might have been otherwise, or maybe it will serve as a conversation point years hence that might make other chemicals flow in ways that turn an acquaintance into a friend, or a friend into a follower, or a follower into an explorer. That seems positive, in a little way, and lots of little ways pointed in the same direction can become a big way, to something, again unknown, but knowable.

When I ponder what a personal end of days might look like, I tend to think that losing the desire for these types of inquisitions will be among the key dominoes falling in an ultimately failing physical system, and I’m going to rage, rage against the dying of that light, for as long as I can. For all of the emotional negativity that morbid curiosity might theoretically inflict upon me, were I more prone to explore it, I can’t help but think that the emotional positivity of eager, open, innocent investigation of the world around me will always return a net positive position for the time and energy spent in its pursuit. If I am the sum total of my experiences, then my curiosity, more than anything else, is what makes me me. And your curiosity, more than anything else, is what makes you you. And the glorious variety possible through endless permutations of those equations is what makes so much of life so very enjoyable, in ways that I hope to remain always curious about, until I disperse the carbon, hydrogen, oxygen and nitrogen that compose me, so that other curious entities might form from it.

Curiosity may indeed kill a cat, every now and again, but for each one that goes to the great litter box in the sky as a result of its investigations, thousands of others end up with the ball of yarn, or the catnip mousie, or the comfy, comfy comforter, or the warm pile of laundry, or the tasty gazelle, possibly with a friend who might be another cat, or a duck, or a dog, or a human child, bursting with enthusiasm to know what that cat feels like, and why its tail curls that way, and how come it makes biscuits with its paws, and where its kittens came from.

I’m with those cats, when all’s said and done. Let’s chase this string and see where it leads us . . .

Which state quarters are you missing?? I have to know!!!

Note: This is part two of a planned twelve-part writing project. I’m using an online random die roller to select a monthly topic from a series of twelve pre-selected themes. With this article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Ten: “Security.”
