On Community

Note: Here is my “Leading Thoughts” column from the September 2019 edition of TREE Press, the monthly gazette of TREE Fund. You can read the latest and back editions, and subscribe to future installments, by clicking here. This article was adapted from a much longer piece written earlier this year, and available on my website here, for those who are interested in reading more about my views on “community.”

If you were to create a word cloud of every document, article, letter, and email I’ve written during my four-plus years as President and CEO of TREE Fund, I suspect that after the obvious mission-related words — tree, forest, research, endowment, education, arborist, etc. —  the word that would show up most frequently would be “community.” I use it all the time, referring to the Tour des Trees as our primary community engagement event, discussing how our work helps the global tree care community, noting that our work focuses on the importance of urban and community forests by promoting research designed to benefit whole communities of trees and related organisms (including humans), rather than individual specimens or species.

If you ran that same word cloud for the four years before I arrived at TREE Fund, I suspect you would not see “community” ranked so highly in our communications. We used to refer to the Tour des Trees as our primary fundraising event, and we discussed how our work benefited the tree care industry, and how our efforts advanced arboriculture, with much of our research focused on individual plant response, rather than forests as a whole. This change in language was not necessarily an organizational shift driven by some strategic planning decision, nor was it a modification to what we do and how we do it directed by our Board or emergent outside forces. It was frankly just me shaping the narrative about the organization I lead, and how I want it to be perceived.

Calling the Tour des Trees just a “fundraising event,” for example, misses the critical component of how we interact with people as we roll on our way throughout the week, providing education and outreach to help people understand our work and how it benefits them. Saying that we work only for the “tree care industry” seems somehow antiseptic to me, implying that the businesses are more important than the community of people they employ, who collectively engage in the hands-on work of caring for trees. “Urban and community forests” is a helpful rubric in expressing the full scope of our focus, evoking and including big city park spaces, street trees, yard trees and trees along utility rights of way in suburbs, exurbs, and rural spaces. And thinking more about communities of trees, rather than individual plants, helps us better understand and communicate the exciting, emergent science exploring the ways that trees have evolved as communal organisms, and not just as disconnected individuals.

I think my focus on the word “community” is indicative of its deep importance to me, personally and professionally. My desire over the past four years, and hopefully into the future, is that TREE Fund acts and is perceived as part of something bigger and more connected than our relatively small physical, financial and personnel structure might otherwise dictate. I have been awed, truly, by the immense generosity, enthusiasm, wisdom and diligence of the global tree care community, and it has been an honor for me to be a small member of that great collective body, which works wonders, and makes a difference.

Getting ready to rejoin this great community of tree-loving cyclists again this weekend. You can click the photo if you want to make a last minute Tour des Trees gift to support the cause!

Credidero #8: Complexity

The concepts of “complexity” and “divinity” seem to be inextricably interwoven in much of Western religious and cultural thought. One of the most famous renderings of this philosophical and teleological duality is “The Watchmaker Analogy,” which was explored at length in English clergyman William Paley’s 1802 treatise Natural Theology or Evidences of the Existence and Attributes of the Deity. Paley opened his tome thusly:

In crossing a heath, suppose I pitched my foot against a stone, and were asked how the stone came to be there; I might possibly answer, that, for anything I knew to the contrary, it had lain there forever: nor would it perhaps be very easy to show the absurdity of this answer. But suppose I had found a watch upon the ground, and it should be inquired how the watch happened to be in that place; I should hardly think of the answer I had before given, that for anything I knew, the watch might have always been there . . . Every indication of contrivance, every manifestation of design, which existed in the watch, exists in the works of nature; with the difference, on the side of nature, of being greater or more, and that in a degree which exceeds all computation.

The gist of Paley’s argument boils down to the presumption that when you find a watch, there must be a watchmaker.  So, therefore, when you find a stone, there must also be a stonemaker. And then when you find a perfectly articulated shoulder joint, there must be a perfectly articulated shoulder joint-maker. And then when you find a flaming bag of poop, there must be a flaming bag of poop-maker. Well, okay, actually Reverend Paley didn’t mention that last one. It was just the anchor concept from a humorous collaborative piece I wrote many years ago, in which some colleagues and I envisioned a dialog between Charles Darwin (in Hell) and The LORD about Paley’s Watchmaker Analogy. It piqued my curiosity enough to explore it all those years ago, even if in a satirical form, and it was the very first thing that popped to my mind when I rolled the 12-sided die last month and had “Complexity” selected as the topic of this month’s Credidero article.

Charles Darwin himself also spent a fair amount of time thinking about The Watchmaker Analogy, well before he went to Hell, even. Darwin was aware of and fond of Paley’s work, and scholars have theorized, with clear reason and reasoning, that Darwin’s explanations of natural selection in On The Origin of Species are actually framed and intended as respectful scientific counter-arguments to those made in Natural Theology.  Even Richard Dawkins, the high priest of neo-atheists and father of all memes, evokes Paley in the title of his influential 1986 tome The Blind Watchmaker. The good Reverend’s final book remains in print, and is a cornerstone text in modern “intelligent design” circles. Those are sure some long and limber legs for such a nominally arcane older text.

Given his longstanding popularity and cultural resonance, if you want to frame arguments for or against complexity as a function of a divine creator, Paley’s is as good a starting point as you’re likely to find. Unless, of course, you’re too much of a fundamentalist to see his work as anything more than a derivative text, and you just want to jump straight to the opening lines of the primary text upon which all of Western (and by Western, I mean American) religious culture has been erected: “And the Earth was without form, and void; and darkness was upon the face of the deep. And the spirit of The LORD moved upon the face of the waters. And The LORD said, let there be light: and there was light,” sayeth the Book of Genesis, which many Evangelical types interpreteth as the literal Word of The Lord, their God and Savior. A few verses later, as we mostly all know, The LORD went on to make day and night, and stars and sky, and land and seas, and the sun and the moon, and plants and animals, and mankind and naps, with each day’s creations more complex than the ones that came before.

As the very first appearance of The LORD in The Bible highlights His ability to create complexity where none existed before, that seems to be the professional trait of which He (or his public relations team) is most proud, and He continues to conjure up something from nothing (stepping up complexity every time) throughout the Old Testament, in between all the smiting and the flooding and the worrying about what women are up to with their bodies that He so seems to enjoy in His spare time. Then later, when The LORD’s only begotten Son decides to unveil his own formidable chops as proof of his divinity in the New Testament, He does it by creating wine from water at a wedding party, simply by adding that magically divine special ingredient: complexity. Bro-heem Christ could have just ended his career right there and still been a legend.

The underlying viewpoint that increasing complexity requires some force greater than that which mere humans can muster isn’t restricted to matters of natural science. Case in point: Erich Von Däniken’s 1968 book Chariots of the Gods? Unsolved Mysteries of the Past, and the many sequels and imitators in print and on screen that followed it. The central argument of these tomes is that the design and construction skills behind ancient objects like the Great Pyramids or Nazca Lines or Stonehenge or the Easter Island statues were complex beyond human capabilities at the time, and therefore must have required inhuman assistance, only in this case not from The LORD, but rather from super-intelligent extraterrestrial beings.

I must admit that I ate those books up as a kid, their logic seeming so very obvious and profound to my 10- to 12-year-old mind. But I’d certainly raise my eyebrows and smirk these days at anybody over the age of 16 or so who cited them as part of their mature understanding of the world in which we live, just as I do with people who consider the works of Ayn Rand to be rational adult fare. If we can’t figure out how something complex was built or got done, it seems like intellectual defeatism to simply attribute it to super-powerful unseen entities — either divine or extraterrestrial or John Galt — rather than just working harder to figure it out, and then recognizing that humanity’s ability to create complex objects and artifacts does not necessarily proceed in a linear fashion.

We cannot build a Saturn V rocket today, to cite but one of many examples. That doesn’t mean that those epic machines were built by Jesus, or dropped on the launch pad at Cape Canaveral by Bug-Eyed Monsters. It just means that the industrial base required to build them no longer exists, as our Nation’s economic, political and military needs have evolved. Which I strongly suspect is also the case with pretty much every one of Erich Von Däniken’s examples of the allegedly extra-human skills required to build all of those ancient wonders. Humans, then, now, and in the future, are capable of great complexity in our creations, and if we care enough about something — putting a man on the moon and bringing him home safely before the end of the decade, building a tomb that will last forever, or impressing the ladies on the other side of the island with the chunky heft and knee-melting girth of our massive stone heads — then we’ll work marvels large and small to get ’er done.

So why do so many people default to the notion that immense complexity requires some form of divinity as its motive force? I suspect it is because the natural order of things is to reduce complexity — bodies to ashes to dirt to dust, cities to ruins to iron to rust — so when we see something, anything, pushing valiantly against the never-ending corruption of eternal entropy, we are temperamentally inclined to judge it special, and meaningful, and not just an anomalous series of natural processes organized in a particular way that slows or reduces or offsets entropic forces for some (likely brief) period of time. In the observable universe, entropy always wins in the end, so if we want to believe that acts and examples of increasing complexity are permanent, then a belief in something outside of or beyond that which can be known and understood with our senses and intellects is a darn good way to avoid dwelling on the fact that we and our creative works are going to die and become dirt at some point in the very, very near future, speaking in cosmic time scales.

In looking at casual Christian theology and practice, I sort of like the next order of logical thought that tumbles from this one: if complexity is the realm of God the Watchmaker, then is its opposite, entropy, the particular expertise of Lucifer, His Enemy? The written record on Beelzebub is certainly rife with destruction, and fall, and spoilage, and violation, and war, and madness, and off the cuff, I’d be hard-pressed to think of examples where the Crimson King showed complex, creative chops like those the Father, Son and Holy Spirit trot out at the slightest provocation to bedazzle their admirers. God makes, the Devil breaks, and as long as The LORD stays one step ahead of his nemesis, order prevails.

But if The LORD spent too much time watching over one particularly needy tiny sparrow, would Old Scratch turn the tables on Him (and us) and pull apart the fabric of the known and knowable? I think that when the Beast and the False Prophet and the Dragon are finally cast into the Lake of Fire in the Apostle John’s Book of Revelation, what we’re seeing is actually a metaphorical depiction of the final removal of entropy from the world. I’m guessing that the New Heaven and New Earth and New Jerusalem were seen by Saint John on Patmos as the most fabulously complex constructs imaginable in his time, and that most readers of the Apocalypse since then also envision them in such terms, defined by the norms of whatever time and place that they are pondered. The LORD’s not gonna come live with His people in a humble casita or pre-fab double wide now, is He? Nuh uh. The buildings in that glorious end-of-times city are going to have flying buttresses upon their flying buttresses, and there might even be a Saturn V pad in every yard, plus unlimited pocket watches available upon demand.

I recognize, of course, that I’m being a bit silly here in my analysis of complexity as it’s defined by The Watchmaker Analogy, just as I was being a bit silly when I first wrote The Flaming Bag of Poop-Maker circa 2003-2004. And I guess that’s because whenever I think about that particular argument for the existence of a Supreme Being, it just seems so very obviously and inherently ridiculous to me that responding in kind is the only logical approach to tackling it. There are many arguments for the existence of God, and many of them seem more sound and embraceable to me than Reverend Paley’s. I suppose my opinion might be different if I actually thought that The LORD created the world over seven days, some 6,000 years ago (thank you very much, Archbishop Ussher), but given 4.5 billion years for our planet’s natural forces to do their things, with the universe as a whole having a 9.3 billion year head start on our stellar system, I’m not in the least bit surprised by magnitudes of complexity far beyond all human understanding, since we’ve only been collectively pondering such matters for (at most) about 0.2% of Earth’s natural history.

To be clear, though, this does not in any way mean that I do not marvel regularly at the complexity of creation, even if creation created itself. I’m truly and deeply awed by so many complex natural things, from the little creepy-crawly ones that I rescue when I see them on sidewalks to the immense ones light years and light years away from us that I gaze at in stupefaction during the (increasingly rare) times when I have an unobstructed view of a night sky free from light pollution. I’m amazed by the complexity of my own body (creaky as it is), and by the complexity of my own brain’s machinations (awake and asleep), and by the complexity of the sea of emotions in which I swim, loving this, ignoring that, hating the other. I’m well read, reasonably smart, and actively interested in understanding how things work, and I still can barely perceive the tiniest bits of what natural selection has wrought upon the living things around me, as we all hop atop a ridiculously complex ball of elements and minerals and fluids, all governed by forces strong and weak, gravitational and electromagnetic.

Really and truly, I don’t perceive natural complexity as proof of divinity, I perceive natural complexity as divinity in its own right. The complex and ever-evolving canons of chemistry, physics and biology are the closest things I’d admit to admiring as sacred primary texts. I could spend a lifetime studying them, and understand as much as my brain could possibly absorb, and still I would be awed beyond comprehension by the complexity of the natural order in which I function, for the very short, sweet, warm time that I’m blessed to be a self-regulating blob of motile biochemical materials, animated by a denser blob inside my beautifully complex upper bony structure, within which everything that is really, truly me resides, amazingly and incredibly distinct from all of the universe’s possible not-me’s.

At bottom line, I don’t need to worship a fanciful Watchmaker, because I am perfectly content to worship the Watch itself. And the stone next to the Watch on Reverend Paley’s heath. And the tiny dinosaurs that hop around the stone, cheep cheep cheep! And the moo-cow that might pass us all by, chewing a cud rich with uncountable organic oozes, as I talk to the Cheep Cheeps, wishing I had some sunflower seeds in my pocket for them. And the 4,000 miles of metal and stone between me and the Earth’s center as I look down, and the uncountably, immeasurably vast distances above me as I look up, gazing billions of years into the past, perceiving light from way back then as it arrives in the right now on its way to the yet to come. There’s nothing in the Book of Genesis that can rival that, if we’re going to fairly assess things. Nor in Atlas Shrugged.

And now I’ve swung from a most silly approach to assessing complexity to a most abstractly profound one, likely more than two standard deviations away at both ends of the spectrum from how normal people in normal times in normal places would perceive normal complexity. Whatever “normal complexity” might be, anyway. Perhaps that’s an oxymoron? Perhaps it’s a phrase that doesn’t normally exist because it doesn’t need to? Or perhaps it’s just a simple way to describe the chaotic world in which we live and work, driven by complex forces that we often do not see, recognize or appreciate?

I’m inclined to grab that third explanation/definition when thinking about human complexity in human-driven spaces. There’s lots of stuff that we collectively create swirling around us, and when I ponder that, I’m still most drawn to the most complex examples of it, most of the time. I like the Ramones well enough, to cite a musical example, but I adore the far-more-complex King Crimson. Likewise in my taste for visual arts, where extreme abstraction and deeply technical compositions move me far more than literal still lifes and figure studies. My list of top movies is also rife with multi-layered surrealist complexities, while I tend to forget simpler character-based rom coms hours after I watch them. Books? I’ll take the complex Gormenghast Trilogy and The Islanders and The Flounder over the simpler The Old Man and the Sea and Of Mice and Men and The Call of the Wild any day. Man-made creative complexity is good in my eyes. It resonates with me. It moves me. It inspires wonder in me. It represents the spaces where we become most Watchmakery, to return to Reverend Paley’s paradigm.

There’s one weird exception when it comes to my love for complexity, and that would be work. I’m a deep devotee of the “keep it simple, stupid” paradigm in the office, and if you interviewed anybody who’s ever worked for me over the past 30+ years, they’d likely cite my penchant for process streamlining and organizational simplification, and my loathing for clerical redundancy and structural inefficiency. When it comes to my professional work, less complex is more desirable, almost to a point of fetishism. I suppose this could be explained altruistically, with me taking the position that my time equates to my organizations’ money, so that deploying my own time and the other human resources around me most efficiently represents a truly ethical approach to stretching our resources as far as they may be stretched. But I think the honest reality is that I view work as a thing that has to be completed before I can play with the complex things that move me more deeply, so by taking the fewest moves possible to achieve desired professional outcomes, I can preserve the energy I need to take the most moves possible toward the complexities that most amuse and entertain and inspire me. “Wasting time on the man’s dime, yo!” There’s a professional creed to motivate the masses, for sure.

If simple work is the opposite of complex fun, just as entropy is the opposite of creation, just as the Devil is the opposite of the Watchmaker, then we’ve got to wrap back around to opening arguments and conclude by accepting that work must be the purview of Satan, and play must be the purview of God, and that we model ourselves most clearly in His image when we frolic in fields of phlox and fescue and philosophy and felicity and feeling and friends and family and festivity and fun.

I’m ultimately happy to embrace such a simple truth when staring into the awesome face of such a stupidly, gloriously complex universe as ours!

Step aside, simple ones! Complex Nazca Line Building Alien coming through!

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this eighth article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Eleven: “Eternity”

All Articles In This Series:

Credidero: A Writing Project

Credidero #1: Hostility

Credidero #2: Curiosity

Credidero #3: Security

Credidero #4: Absurdity

Credidero #5: Inhumanity

Credidero #6: Creativity

Credidero #7: Community

Credidero #8: Complexity


Credidero #7: Community

If you were to create a word cloud of every document, article, letter, and email I’ve written during my four years as President and CEO of TREE Fund, I suspect that after the obvious mission-related words — tree, forest, research, endowment, education, arborist, etc. —  the word that would show up most frequently would be “community.” I use it all the time, referring to the Tour des Trees as our primary community engagement event, discussing how our work helps the global tree care community, noting that our work focuses on the importance of urban and community forests, by promoting research designed to benefit whole communities of trees and related organisms (including humans), rather than individual specimens or species.

If you ran that same word cloud for the four years before I arrived at TREE Fund, you most likely would not see “community” ranked so highly in our communications. We used to refer to the Tour des Trees as our primary fundraising event, and we discussed how our work benefited the tree care industry, and how our efforts advanced arboriculture, with much of our research focused on individual plants, rather than their collectives. This change in language was not an organizational shift driven by some strategic planning decision, nor was it a modification to what we do and how we do it directed by our Board or emergent outside forces. It was frankly just me shaping the narrative about the organization I lead, and how I want it to be perceived.

Calling the Tour des Trees a “fundraising event,” for example, misses the critical component of how we interact with people as we roll on our way throughout the week, providing education and outreach to help people understand our work and how it benefits them. Saying that we work for the “tree care industry” seems somehow crass and antiseptic to me, implying that the businesses are more important than the people who collectively engage in the hands-on work of caring for trees. “Urban forests” can be confusing to folks in its evocation of big city park spaces, even though street trees, yard trees and trees along utility rights of way in suburbs, exurbs, and rural spaces are also part of our mission’s purview. And thinking first of communities of trees, rather than individual plants, helps us better understand and communicate the exciting, emergent science exploring the ways that trees have evolved as communal organisms, sharing information and nutrients through root-based symbiotic networks.

I’d be fibbing if I said that I had purposefully made these and other related linguistic changes as part of an intentional, organized shift in tone. It just happened as I went along, and it honestly didn’t actively occur to me that I had done it in so many ways and places until I started thinking about this month’s Credidero article. But the changes are clearly there, evidence of the fact that it’s somehow deeply important to me, personally and professionally, that TREE Fund acts and is perceived as part of something bigger and more connected than our relatively small physical, financial and personnel structure might otherwise dictate. I do believe that words have power: if you say something often enough, and loudly enough, people begin to perceive it as true, and then it actually becomes true, even if nothing has really changed except the word(s) we use to describe ourselves and our activities.

So why is “community” such an important and transformative word in my personal worldview? As I normally do in these articles when thinking about questions like that one, I looked into the word’s etymology: it comes to us English-speakers via the Old French comuneté, which in turn came from the Latin communitas, which ultimately boils down to something “shared in common.” But there’s a deeper layer in the Latin root that’s preserved to this day in cultural anthropology, where communitas refers to (per Wiki) “an unstructured state in which all members of a community are equal allowing them to share a common experience, usually through a rite of passage.”

The interesting corollary here, of course, is that those who do not or cannot participate in that rite of passage may neither partake of nor receive the benefits of communitas. Peter Gabriel’s “Not One Of Us” has long been one of my favorite songs, both musically (Gabriel, Robert Fripp, John Giblin and Jerry Marotta put in some sublime performances here) and lyrically, with one line standing out to me as a key bit of deep wisdom, writ large in its simplicity: “How can we be in, if there is no outside?” That deepest root of the word “community” captures that sense of exclusion: there’s a positive sense of belonging for those who have crossed the threshold for inclusion, while those who haven’t done so are (to again quote Mister Gabriel) “not one of us.”

So are many (most?) communities perhaps defined not so much by who they include, but rather by who they exclude? I suspect that may be the case. When I first arrived at TREE Fund, for example, I had a couple of early encounters and experiences where folks communicated to me, explicitly and implicitly, that they saw TREE Fund not as a cooperative symbiote, but rather as a predatory parasite, on the collective body of tree care professionals and their employers. I was also made to feel uncomfortable in a few situations by my lack of hands-on experience in professional tree care, including the fact that I had no certification, training, or credentialing as an arborist or an urban forester. I had not passed through the “rite of passage” that would have allowed me to partake of the tree peoples’ communitas, and so in the eyes of some members of that community I was (and probably still remain) on the outside, not the inside. So my push over the past four years for TREE Fund to be an integral part of a larger professional community may be, if I’m honest and self-reflective, as much about making me feel included as it is about advancing the organization.

When I look bigger and broader beyond TREE Fund, I certainly still see a lot of that “inside/outside” paradigm when it comes to the ways in which we collectively organize ourselves into communities, locally, regionally, nationally, and globally, oftentimes along increasingly “tribal” political lines, e.g. Blue States vs Red States, Republicans vs Democrats, Wealthy vs Poor, Christian vs Muslim vs Jew, Liberal vs Conservative, Citizen vs Immigrant, Brexit vs Remain, etc. Not only do we self-sort online and in our reading and viewing habits, but more and more people are choosing to live, work, date, marry, and socialize only within circles of self-mirroring “insiders,” ever more deeply affirming our sense that the “others” are not like us, are not part of our communities, and may in some ways be less important, less interesting, less deserving, or even less human than we are.

That’s certainly the narrative being spun by our President right now through social media, spoken statements, and policy initiatives, as he seems adamantly opposed to “an unstructured state in which all members of a community are equal.” Which is dismaying, given the allegedly self-evident truths we define and hold in our Nation’s organizational documents, ostensibly designed to bind us as a community under the leadership of a duly-elected Executive, who is supposed to represent us all. That said, of course, we know that the infrastructure of our great national experiment was flawed from its inception in the ways that it branded some people as less than fully human, and some people as not qualified to participate in the democratic process, due to their skin color or their gender. I’d obviously like to think that we’re past those problems, some 250 years on, but the daily headlines we’re bombarded with indicate otherwise. Insiders will always need outsiders . . . and communities may often only feel good about themselves by feeling bad toward those they exclude. I suppose several thousand years of history show that may well be a core part of what we are as human beings (I explored that theme more in the Inhumanity Credidero article), and that aspiring to create positive communities of inclusion may be one of the nobler acts that we can pursue.

I’m stating the obvious in noting that the ways we can and do build community, for better or for worse, have radically changed over the past 25 years or so with the emergence of the world wide web and the transformations made possible by it. If you’d asked me to describe what “community” meant to me before 1993, when I first got online, I’d likely have focused on neighborhoods, or churches, or fraternal organizations or such like. I’d say that within less than a year of my first forays into the internet’s kooky series of tubes, though, I was already thinking of and using the word “community” to refer to folks I romped and stomped with online, most of whom I’d never met, nor ever would meet, “in real life.”

I wasn’t alone, as the word “community” has become ever more widely and casually used over the years to describe clusters of physically remote individuals interacting collectively online, via an ever-evolving spectrum of technological applications, from ARPANET to the World Wide Web, from bulletin boards to LISTSERVs, from mailing lists to MMORPGs, from blogs to tweets, and from Cyber-Yugoslavia to Six Degrees to Friendster to Orkut to Xanga to Myspace to LinkedIn to Facebook to Twitter to Instagram to whatever the next killer community-building app might be. I actually wrote a piece about this topic ten years or so ago for the Chapel + Cultural Center’s newsletter, and at the time I used the following themes and rubrics to frame what community meant to me:

  • An organized group of individuals;
  • Resident in a specific locality;
  • Interdependent and interacting within a particular environment;
  • Defined by social, religious, occupational, ethnic or other discrete considerations;
  • Sharing common interests;
  • Of common cultural or historical heritage;
  • Sharing governance, laws and values;
  • Perceived or perceiving itself as distinct in some way from the larger society in which it exists.

And I think I stand by that today, noting that a “specific locality” or “a particular environment” may be defined by virtual boundaries, rather than physical or geographical ones. But then other elements embedded within those defining traits raise more difficult questions and considerations, including (but not limited to):

  • What, exactly, is an individual in a world where identity is mutable? Is a lurker who never comments a member of a community? Is a sockpuppet a member of a community? Are anonymous posters members of a community? If a person plays in an online role-playing game as three different characters, is he one or three members of the community?
  • How are culture and historical heritage defined in a world where a six-month-old post or product is considered ancient? Do technical platforms (e.g. WordPress vs. Twitter vs. Instagram, etc.) define culture? Does history outside of the online community count toward defining said community?
  • What constitutes shared governance online? Who elects or appoints those who govern, however loosely, and does it matter whether they are paid or not for their service to the group? What are their powers? Are those powers fairly and equitably enforced, and what are the ramifications and consequences when they are not? Is a virtual dictatorship a community?

I opined then, and I still believe, that there is a fundamental flaw with online communities in that virtual gatherings cannot fully replicate physical gatherings, as their impacts are limited to but two senses: sight and sound. While these two senses are clearly those most closely associated with “higher” intellectual function, learning and spirituality, the physical act of gathering or meeting in the flesh is much richer, as it combines those cerebral perceptive elements with the deeper, more primal, brain stem responses that we have to taste, touch and smell stimuli. While I’m sure that developers and designers and scientists are working to figure out ways to bring the other three senses into meaningful play in the digital world, a quarter century after I first got online, there’s been no meaningful public progress on that front, and I am not sure that I expect it in the next quarter century either.

Image resolution and visual interactivity get better and better (especially on the virtual reality front), while sound quality actually seems to get worse and worse over time, when we compare ear buds and “loudness war” mixes to the warm analog glory days of tube amps and box speakers — but that’s it, really. And as long as we are existing digitally in only two senses, exchanging messages online removes any ability to experience the physical reality of actually touching another person, be it through a hand-shake, a kiss, a squeeze of the arm or a pat on the back. The nuances of facial expression and inflection are lost in e-mails and texts, often leading to confusion or alarm where none was required or intended. There is no ability to taste and feel the texture of the food we discuss in a chat room. It still seems to me that the physical act of community building is a visceral one that appeals to, and perhaps requires, all of our senses, not just those that can be compressed into two dimensions on our computer screens.

I still believe that two-dimensional communities are, ultimately, destined to disappoint us sooner or later for precisely that reason. I have certainly spent countless interesting hours within them — but if you plotted a curve over time, my engagement grows smaller by the year. While people often compare the dawn of the Internet era to the dawn of the printing press era, it’s important to note that the earlier cataclysmic shift in the way that information was preserved and presented (from spoken word to widely-available printed material) did not result in the elimination of the physical gatherings upon which all of our innate senses of community have been defined and built for centuries, as has happened in the Internet era. Communication happens more readily now, for sure, and communities may be built almost instantaneously, but they’re not likely to have all of the lasting resonances that their traditional in-person counterparts might offer.

I note, of course, that my feelings on this topic are no doubt influenced by the fact that my adulthood straddles the pre-Internet and post-Internet divide. I was raised without computers and cell phones and instantaneous access to pretty much anybody I wanted to connect with, anywhere in the world, so my communities couldn’t be accessed or engaged while sitting alone in my bedroom. I don’t know how many people have been born since 1993, but many (most?) of them, having been fully raised in the digital world, may not be wired (no pun intended) to feel that distinction. And when I continue to make that distinction, they likely see me in the ways that I once would have perceived a grouchy old man shaking his fist and shouting “Get off my lawn, you kids!”

Generational issues aside, I do think that some of the uglier aspects of online communities — bullying, hateful echo chambers, exploitation of weaker members, cruelty hidden behind anonymity — are blights on our culture and our souls, and are having direct cause-effect impacts on the nastiness of our modern social and political discourse. If Twitter and Facebook and other social media sites were shut down tomorrow, a lot of online communities would cease to exist, sure, but the impact of that global loss of connection would not necessarily be a net negative one. But the genie’s out of the bottle on that front, I suppose, as barring a full-scale catastrophic failure of the global communication network, communities (ugly and beautiful alike) will just emerge in new virtual spaces, rather than those billions of people returning en masse to traditional, in-person community building.

But some of them might. And I might be one of them. I’ve written here before about being “longtime online” and often a very early adopter of new platforms and technologies as they’ve emerged. But somewhere in the past decade or so, I stopped making leaps forward in the ways that I communicate with people and engage with communities online. The next thrilling app just sort of stopped being any more thrilling than the one I was already using, so inertia won out. I bailed on Facebook around 2012, and have used Twitter almost exclusively to communicate online (outside of this blog) between then and last month, when I decided to let that go too.

Beyond social media, I have had several online forum-based communities in which I was very active over the years (Xnet2, Upstate Wasted/Ether [defunct], The Collider Board [defunct], The Fall Online Forum, etc.), and those have mostly fallen by the wayside as well. I’ve retained some very meaningful communications with some good friends from those spaces via email and occasional in-person meetings, but it’s one-on-one connection between us for the most part, and not dialog and group play unfolding in public before the rest of the community. And, again, I think I’m finding it easy to walk away from those public communities, for the most part, because the personal depth of the connections I’ve made gets shallower as the years go on, and even some of the long-term connections just sort of run their courses and stagnate, because there’s really no organic way for the relationships to grow or advance in any meaningful way.

Maybe again this is just a me-getting-older issue, but I get more richness of experience within my communities that exist in real space, and real time, than I used to, and I get less from my online connections. A desire to move more toward that probably played some psychological part in how hard I pushed the word “community” in my professional life, trying to build one there, not only through online means, but also through the scores of conferences that I’ve attended over the years, with tree care scientists and practitioners from around the world. That is a good community. I believe that I have improved TREE Fund’s standing within it. And that feels good.

Part of the cost of doing that, though, was really failing to become part of any meaningful real-world community where I actually lived in Chicago, and also being separated from the little community that means the most to me: my family. A big part of my decision to retire this year was the desire to get that priority inequity better aligned, and I think that as we look forward to our next move as a family, whenever and wherever it is, I’ll be more inclined to put the effort in to make new community connections there, rather than just hanging out on the computer chatting about arcane subjects with what Marcia fondly refers to as my “imaginary friends.”

One of my personal goals for the Credidero (reminder: it means “I will have believed”) project was to spend a month or so considering and researching a given topic, and then exploring how I felt about it, not just what I thought about it, to see if there were some new insights or perspectives for me, perhaps as articles of faith, or different lenses through which to view my world going forward. Somewhat ironically, this month’s “community” topic has been the hardest for me to consider and write, almost entirely because I’ve already spent so much time thinking about it and writing about it over the years that I already have a stronger set of well-formed beliefs on the topic than I’ve had on any of the others thus far.

How I act on those beliefs, though, I think is evolving, hopefully in ways that connect me more meaningfully with more local or in-person communities, rather than spending so much time alone (in real life) while sort of together (in virtual space). I imagine that retirement, with all the newly available time it entails, will be a much richer experience that way. Less thinking and writing about community all by myself, and more experiencing community with others.

And on that note, I think I’m going to go sit out by the pool and see if there’s anybody to talk to . . .

A community of tree people and cyclists. More fun in person than online!

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this seventh article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Four: “Complexity”


Credidero #6: Creativity

When I was in First Grade, our class was introduced to haiku, the classic Japanese form of short poetry structured in three lines, of five, seven and five syllables each. After our teacher explained the concept, read us some examples, and showed how to count syllables (do they still do the thing where you put your hand under your chin while enunciating words?), we were told to take out a sheet of paper (this stuff, remember?) and pen our first haiku. I thought for a few minutes, dutifully jotted one down, counted to make sure the syllables worked out, and handed in my work.

Some time later that day, I was summoned to speak with our teacher, and there were a couple of other grownups behind her desk, though I did not know who they were. Uh oh! Trouble! The teacher asked me where I got the poem I had submitted. I told her I wrote it. She nodded and asked if I wrote it by copying it out of a library book. I said no, I just thought it up and wrote it down. She kept pressing me on other possible sources for the little poem, and I kept telling her that I made it up, following her instructions. She then asked me to define a specific word within the poem, and to explain a specific phrase. I answered her as best I could.

I do not recall or have that entire poem intact anymore, alas, but I do remember that the issues that led my teacher to call in the guidance police and interrogate me on whether I had actually written it or not were my use of the word “ere,” the elegance of my alliterative phrase “gray-green grass,” and the fact that I inadvertently did exactly what a haiku was really supposed to do — juxtaposing images around a seasonal reference — while most of my classmates were writing straight literal odes to their cats or toys or mothers. Clearly, no little country cracker from South Cackalacky should have been able to do that, right?

The thing that finally convinced the teacher and the school-yard Stasi agents that I had not somehow plagiarized the poem was the fact that when I was asked to read it aloud, I pronounced the word “ere” as “err-eh,” (I had only seen it in my favorite poetry book, but had never heard it spoken), so my middle syllable count was actually off by one.  So apparently, I really hadn’t been carrying around a little Bashō masterpiece in my Speed Racer lunchbox just waiting to spring it on someone for my own gain at an opportune moment. I was dismissed and off to recess I went, a bit concerned and confused.

Some time soon thereafter, I was promoted directly from First Grade to Third Grade, branded smart beyond my limited years, at least in part because of that little poem. I learned three key things from this confluence of events:

  1. Being creative can cause trouble
  2. Being creative can open doors to opportunity
  3. People correlate artistic creativity with full spectrum intelligence

I also picked the Beloved Royals to win the 1978 World Series. Wrong again.

I have never stopped writing since then, for pleasure and for pay. My first real job was as the “Teen Editor” of the Mitchel News when I was 13 years old, per the photo at left. I was supposed to be a beat reporter, doing interviews with new kids and documenting the things in which the young people on the military base where I lived were presumed to be interested. But after a column or two like that, I got bored, and I started doing record reviews of Jethro Tull and Steely Dan albums instead, or handicapping sports seasons (Remember the NASL? I went out on a limb and predicted that my beloved Washington Diplomats would win the 1978 title. I was wrong), or writing creepy poetry about some of the weird old buildings and spaces on the base. I also gave the way-popular movie Grease a thumbs-down, one-star, bomb review, eliciting howls of rage from most every girl I knew on base, and the guys who wanted to impress them. That might have been a bridge too far.

I was fired from the job after about a year, because my creative voice was not the creative voice that the newspaper’s publisher desired me to have, or at least to create, for his readers. So I learned a lesson there, too: creative writing and “technical writing” (for lack of a better term) were not the same thing in the eyes of publishers and readers, and there was a difference in the things that I wrote for myself and the things that I wrote for others. (Well, I also learned, much to my chagrin, that my cushy desk job writing stuff was a whole lot easier than cutting grass, delivering newspapers, clearing brush, or washing dishes, all of which I had to do at one time or another to earn my scratch after I was chucked out at the press room, alas.)

That distinction between creative and “technical” writing is true to this day, of course, although then and now, there’s always been one thing that sort of sticks in my craw when I actively ponder it, and it’s the fact that I do not perceive either form of writing as requiring any more or less creativity than the other, though only one gets called “creative.” One type might require research, one type might not. One type might require the creation of characters, one type might require creatively describing real characters. One type might require me to write in dialog or dialect, creating a voice for other people to speak, one type might require me to speak in a voice other than my own to appeal to the whims and tastes of the people or organizations who pay me for my efforts. But they all require creativity.

I enjoy both types of writing, in different ways, and in different times. When I sit down with a blank page or a blank screen before me, and I get up some time later leaving a big blob of words behind, I feel that I have created something. It may be a newsletter column, it may be a poem, it may be a story, or it may be a Credidero article. All equal in my mind in terms of their worth (though I know from experience that in the eyes of those who pay for my blank paper and computer screen, my “technical” writing is of far greater value), all requiring the same basic sets of linguistic skills, all involving creativity. I have to start with an idea, I have to formulate structures around the idea, I have to put the idea into a meaningful context, I have to deploy resources to ensure the idea is valid, I have to find the right words to share the idea with others. When that’s done, I have made something, from nothing. Voila, creation!

Interestingly enough, this concept that humans can be creative and practice creativity is actually a shockingly modern one in Western Culture. In Plato’s Republic, the wise old Greek philosopher is asked “Will we say, of a painter, that he makes something?” and he replies “Certainly not, he merely imitates.” For centuries, the widely-held philosophical belief was that only God could create something from nothing — creatio ex nihilo — while we mere mortals were stuck with imitating, or discovering, or making, or producing, oftentimes guided by inhuman forces, like Muses, or Daemons, but certainly not creators in our own right. Poetry was one of the first arts to be granted license to be seen as an act of creation, with other written texts, and other arts genres following in the 19th Century. It was only in the early 20th Century that Western people began to speak of “creativity” in the natural sciences and technical fields. So whether they realize it or not, the creative folks today who (somewhat pretentiously, to these ears) call themselves “makers” instead of “artists” are actually invoking a 2,000-year-old paradigm that (again, to these ears) devalues the fruits of their labors, rather than (as likely intended) anchoring them as somehow more “real” or pragmatic than the stuff that the artsy-fartsy types produce.

I find the long and widespread cultural reluctance to embrace and value creativity as a deeply personal, deeply human, deeply individual endeavor or trait to be fascinatingly at odds with my own beliefs and experiences on this front. When I interview for jobs or freelance assignments, or even in just regular conversations (because I’m like that), and I am asked what I consider to be my greatest skill, I always default to “communications” — I am a story-teller, at bottom line, and I can make people believe in my organizations’ work and mission, by talking to people, and by writing to people. If you hire me, I’ll do the same for your organization!

I create narratives on the job, sometimes from scratch, sometimes from consolidation of extant texts, but there’s a deep creative element to both of those activities. I also tell stories on my own time, fictional ones, poetic ones, accounts of events yet to come, documentation of events gone by. A concert review is a story. A fundraising appeal is a story. A speech is a story. An explanation of a research finding is a story. I am a story-teller. And I would not be a story-teller if I did not possess a finely tuned sense of creativity, and a desire and skill to create something that did not exist until I turned my mind to it. I can’t imagine my mind, in fact, if it was not anchored in that desire to create stories. I have done lots of creative things in various creative fields over the years (a bit more on that later), but my stories and my writing are as intrinsically me, myself and I as anything else I do.

I generally think of this as a good thing, though research may indicate otherwise. A study involving more than one million subjects by Sweden’s Karolinska Institute, reported in the Journal of Psychiatric Research in 2012, found that, as a group, those in the creative professions were “no more likely to suffer from psychiatric disorders than other people, although they were more likely to have a close relative with a disorder, including anorexia and, to some extent, autism.” Phew! But wait . . . within that surveyed cohort of creative types, the researchers did find that writers (!) had a “higher risk of anxiety and bipolar disorders, schizophrenia, unipolar depression, and substance abuse, and were almost twice as likely as the general population to kill themselves” (!!) and that dancers and photographers were also more likely to have bipolar disorder.

Well, uhhh . . . yeah. That. Kettle, pot. Pot, kettle. We writers are apparently the most damaged creatives, the Vincent van Goghs and Roky Ericksons and Francisco Goyas and Nick Drakes of the world notwithstanding. For the record, if I look back at my family tree just for a couple of generations, and across a couple of close cousin branches, every single one of those disorders appears, often multiple times. So is the creative drive that I think of as a gift or a blessing actually just a side symptom of a spectrum of mental illnesses passed on to me as part of my innate genetic code?

Maybe. The suffering artist stereotype probably didn’t emerge without evidence, after all, and when I think about the periods in my life when I was most floridly, obsessively creative (not just in writing), they probably do correlate closely with the periods when I felt the most vulnerable or damaged. Being driven can be a good thing. And being driven can be a bad thing. Beautiful stories can emerge from dark spaces, and dark narratives can emerge from a happy place. Keeping busy and being productive can be cheap forms of self-administered therapy, or they can be positive manifestations of a mind well, truly, and happily engaged.

This way madness lies.

I think a big part of managing my own creativity is being self-aware enough, through long experience, to know which of these conditions are in play at any given time, and to create accordingly. In the 1990s, to cite but one time-place-topic example, I wrote a novel, called Eponymous. It was about a dissolute musician-writer named Collie Hay, beset with a variety of substance abuse issues and mental health problems, written in the first person, explaining why and how the writer wrote, and knitting a narrative about how creation and destruction often go hand in hand. The epigraph of the book, the very first words you read when you open it, says “For the Demons . . . Fly away! Be free!” The very last page of the book, in the author’s bio, says “J. Eric Smith is not Collie Hay. Honest.” But, uhhh, y’know. Of course, I’d say that. I’m a writer, and not to be trusted.

One review by a trusted colleague who was also a damaged writer type noted “Eponymous is a hall of mirrors. J. Eric Smith, the author, who’s been an upstate New York rock critic, has written a book about an upstate New York rock critic who is himself writing a book. The book-within-a-book device is hard to pull off, but when it works (see Tristram Shandy and Adaptation) — and it works well here — it’s lots of fun.” And it starkly lays bare the correlations, at least in my case and in my mind, between written creativity and dysfunction, without even really bothering to explain them, since they just seem(ed) to me to go hand in hand, as a given. It also felt good to write, and some demons did, in fact, flitter off as result of having written it. Poof! Therapy!

(Side Note #1: If you want to read Eponymous — and 20+ years on from creating it, I’m not sure I’d really recommend it to you — find a cheap, used print version of it, not the Kindle version. It was adapted as an e-book without my permission or supervision, and a lot of the internal formatting [there’s poems, lyrics, other stuff of that ilk] got messed up and is very difficult to read on an e-book reader. I don’t make a penny from it either way, so it’s your call if you want to score it, but I just don’t want you to have to struggle with a nasty mess of on-screen gobbledygook if you do wade into the thing).

While I’ve focused my personal observations here on writing, I should note that I have been and remain creative in other ways too. I’d claim photography as the visual art form in which I have the greatest interest and skill, and I’ve been a songwriter, singer, musician and lyricist over the years as well, though mainly in my younger days. Until precisely 1993, in fact, I would have cited music as my primary creative skill, and the one in which I was most willing, able and likely to achieve success (critical, if not commercial) over the years.

How can I date that end point so accurately? That was the year when we got our first home computer and I got online for the first time. My home recording studio began gathering dust soon thereafter. For a long time after that, when people would ask how my music was going, I’d say “I’m in remission as a musician these days,” so once again, even way back then, I was lightly associating creativity with illness, even if I laughed it off in so doing. But we all know that every joke has a bit of truth behind it.

(Side Note #2: If I could snap my fingers and have any job in the world, hey presto, right now, I would want to be the principal, non-performing lyricist for a commercially and critically successful musical act. The Bernie Taupins, Robert Hunters, Peter Sinfields, and Keith Reids of the world have got good gigs! I have had the pleasure of having my poems recorded as lyrics by a few musicians, and it’s deeply satisfying, let me tell you. That’s likely to be one of the areas I’m going to focus my creative attention on when I retire from my last day job. Maybe the group or singer that hires me would let me provide the photographs for the album covers and promotional materials too. A fellow can dream, right? Even a broken creative fellow?)

So creativity has touched and shaped my life in a variety of ways that fall under the “artistic” umbrella, but the amount of time I spend on those pursuits pales in comparison to the amount of time I spend at my job. Sure, writing and speaking and story-telling are cornerstones of that, and, as noted above, I feel that those facets of my professional work are every bit as anchored in my core, latent sense of creativity as are my most absurd and fantastic pieces of fiction and poetry. But just as the concept of creativity evolved and was adapted in the early 20th Century to include the sciences and technical endeavors, the latter part of the Century saw the definitions expanding further into organizational, operational, and business dynamics, and the ways that groups use creativity to build and sustain corporate culture.

Look at tech behemoths like Apple, Microsoft, Amazon, Google and Netflix, just off the cuff. Sure, there were certainly some blindingly creative people within their corporate structures who made the technical products and business services they provide, but they would not be what they are today without other visionaries imagining, designing, and implementing completely new business models and marketing approaches unlike anything seen or experienced before them. The brains behind those new business models were certainly engaged in forms of creativity, making new and valuable things (many of them concepts, not stuff), filling market spaces that nobody knew existed before they dreamed them up and monetized them. If you had told me when I was a teenager that by my 55th birthday I’d be able to listen to any song in the world while watching a movie, doing my banking, typing to a pen pal in Australia, and playing a video game, on my phone, at the beach, all at the same time, I’d have said you were watching way too much Jetsons for your own good. That’s just silly, Astro.

The transformative nature of the tech sector means that much of the recent and current research into and writing about creativity in the workplace focuses on organizations and individuals within the Silicon Valley sphere of companies, because the cause-effect-impact relationships there are easy to identify, evaluate and explain. But the work that many of us do in non-technical sectors can involve just as much creativity, and can have just as transformative an impact within our organizations, or within the smaller external spheres in which we operate. I’m confident that 100 years from now, the types of activities that are granted “creative” status by default will expand to include countless more fields and activities, many of which are unknowable or inconceivable today, even in the creative minds of the most brilliant futurists.

But maybe we shouldn’t wait 100 years to afford “creative” status to certain endeavors that aren’t seen as “earning” it today. We’re all creative, each in our own ways, every time we produce something that wasn’t there before we cast our hands above the waters and say (to ourselves) “Let there be a thing,” whether anybody else knows we did it or not, whether it has any use or value at all to anybody, whether it can be experienced in the world of the senses, or only within the spheres of our minds. We may create alone or with others. We may create to heal ourselves or hurt ourselves, others likewise. It may feel good, or it may feel bad. We may intend to create, or we may create by happy accident. It’s all the same: “Let there be a thing,” and there will be, and sometimes it might even be really, really good.

Creatio ex nihilo was long the sole province of God, or the Gods, or Muses, or Daemons, or other inhuman forces swirling in the vapors around us. Maybe by claiming creativity as our own human right, in all the things we do, and celebrating its fruits, we don’t denigrate the God(s) that inspire us, but instead become ever more like them.

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this sixth article complete, I roll the die again . . .

. . . and next month I will consider Topic Number One: “Community”
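(For the mechanically curious, the selection mechanic is simple enough to sketch in a few lines of Python. Consider this a playful, hypothetical stand-in for the online dice roller, under the assumption, mine, that a topic already covered simply gets rolled again:)

    import random

    # The twelve pre-selected themes, keyed by topic number. Only three of the
    # number-to-topic pairings are confirmed in these articles (1 = Community,
    # 7 = Creativity, 12 = Inhumanity); the rest of the mapping is unknown here.
    TOPICS = {1: "Community", 7: "Creativity", 12: "Inhumanity"}

    def roll_for_next_topic(written):
        """Roll a virtual twelve-sided die, re-rolling topics already covered."""
        while True:
            roll = random.randint(1, 12)
            if roll not in written:
                written.add(roll)
                return roll

(With six topics down and six to go, any single roll has only a 50/50 chance of landing on a fresh theme, so the re-rolls do real work from here on out.)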

Deftly using every single one of my creative skills here in coherently explaining the recorded canon of the great UK band Wire. It made sense at the time . . .

All Articles In This Series:

Credidero: A Writing Project

Credidero #1: Hostility

Credidero #2: Curiosity

Credidero #3: Security

Credidero #4: Absurdity

Credidero #5: Inhumanity

Credidero #6: Creativity

Credidero #7: Community

Credidero #8: Complexity

 

Credidero #5: Inhumanity

When I was framing my Credidero writing project for 2019, I spent a fair amount of time and thought developing the list of 12 one-word topics that I intended to ponder and write about over the course of the year. I clearly remember that the very last edit that I made to the final list was turning the possible topic “Humanity” into the actual topic “Inhumanity.” At the time I did that, the edit was largely to eliminate what I thought could have just produced a fuzzy wuzzy glurgey article, and to better balance what I considered innately “positive” reflections among the list of 12 with what I thought could be innately “negative” reflections, and thereby perhaps more intellectually difficult to parse. In other words, it seemed to me at the time that it would be easier for me to fall into writing piffle and tripe about the concept of “humanity” (“La la la, let’s all get along, people are great, yay!”) than it would be to consider its opposite, which could be far more fraught with dark feelings and unpleasant realities and difficult realizations. Those seemed to be the spaces that would be more challenging for me to explore, and since this project was intended to be a personal challenge, that final edit stuck.

My sense of that potential challenge has proven accurate for me over the past month, and this is the first of five scheduled articles to date where I’ve missed my self-imposed monthly deadline (by just a few days, though), as I knocked the topic around in my brain housing group a bit longer than I did with earlier installments before finally sitting down to organize my mental noise and write. One of the key difficulties for me has been that this topic is ultimately defined by its negation: you can’t actually consider or define “inhumanity” without considering and defining “humanity” first. Basically and etymologically speaking, “inhumanity” is simply the absence of whatever its opposite is. Then, adding complexity, “humanity” itself carries two simple/common uses, one a noun form (“the human race; human beings collectively”), and one an adjective form (“humaneness; goodness; benevolence”).

But here’s the rub: by most objective measures, on an aggregate, macro, long-term, global basis, humanity (noun) does not practice humanity (adjective) very effectively, at all. Our history is shaped, defined and recorded not through long eras of benevolence and kindness and care, but rather through endless, constant, unrelenting war, subjugation (of our own species and others), depredation of resources, conquest and assimilation of assets, and a host of other day-to-day and century-to-century activities that skew far from the concepts of compassion, tolerance, goodness, pity, piety, charity and care that are embodied in the common adjectival use of the word “humanity.”

It’s almost like humanity (noun) hired some supernatural marketing agent to spin the emergence of the English word humanity (adjective) in the 14th Century just to make us look good and gloss over the horrors imminent whenever we rolled up into your forest or savanna or jungle or oasis. “Oh hey, look, it’s the Humans! Word is, they are kind and benevolent! Humanity, Huttah!” (Said the Lemur King, before we turned him into a shawl, and then killed all the other humans who didn’t have any Lemur Wear).

I kept knocking this conundrum around, not really getting anywhere with it, until it occurred to me that maybe I needed to parse the word “inhumanity” a bit differently. In its common modern form and usage, we think of it in these two parts: “in + humanity,” i.e. the absence of humanity (adjective), and we generally take the combined word form to be a bad thing, even though “humanity” (noun) is pretty awful, if we’re frank about our shortcomings. Perhaps a better way to consider the subject word, though, is to parse it thusly: “inhuman + ity,” i.e. simply the state of being not human. Plants exist in a state of inhumanity, when defined that way, and there’s no value judgment assigned to saying that. They are not human. Fact. Likewise all of the other gazillions of species of animals, protists, fungi, bacteria, viruses, and unseen and unknown entities (ghosts? angels? demons? gods?) that share our planet with us. Not human, and all existing in a state of inhuman-ity accordingly.

The collective set of inhuman-ity engages in many of the same practices that humanity (noun) does: its species and members eat each other, cause diseases, go places they shouldn’t, do harm, deplete things, despoil things, over-populate, over-hunt, over-grow, overthrow, fight, bite, copulate, propagate, locomote, respirate, expire. But their collective abilities to break things on a truly planetary basis, and their willful, ongoing, compulsive drives to engage in the mass exterminations of creatures of their own kind in pursuit of meaningless shiny things or mythical mental constructions or physical homogeneity, all pale in comparison to the widespread horrors which Homo sapiens is capable of, and seemingly revels in.

Should some miracle suddenly and completely remove the biomass of human beings from our planet today, the other living and unliving things upon it would likely, in time, return to some slowly evolving state of punctuated equilibrium that would allow life to continue in perpetuity here, occasionally blasted and reorganized by comets or asteroids or other stellar detritus, until the time, billions of years from now, when our star explodes and incinerates the little rock we all call our home for the final time.

But I believe it would challenge the most optimistic human futurists to consider the trend lines of what our own species has wreaked upon our planet since we emerged from East Africa, and imagine a (mostly) peaceful progression where humanity (noun) practices care, compassion, kindness, and goodness in ways that promote planetary well-being, human dignity, respect for all species, equality, justice, brotherhood, and the peaceful distribution of assets to all who need them for billions and billions of years.

The accepted scientific consensus, in fact, is quite the opposite of that: the actions that a relatively small number of human beings in positions of political and economic power are taking, right now, largely for the short-term material gain of their own cliques and cabals, are forging an irreversible glide path toward long-term global human suffering that will be orders of magnitude greater than any experienced in the history of our species, and that will more than offset the benefits of centuries of scientific gains, e.g. disease mitigation, crop yield and nutritional density, etc. It’s bad, and it’s going to get worse, fast, and it’s going to take down a huge percentage of the collective set of inhuman-ity on its way. Our current government, to cite but one illustrative example, doesn’t even want its scientists to publicize climate forecasts beyond the year 2040, because the list of probable futures is so dire beyond that point. But those futures are coming, whether anybody writes about them or not.

Where does any classical sense of humanity (adjective) as a state of goodness and grace fit within that reality, and how does one reasonably envision the human species a thousand or more years from now as anything but, at likely best, a small rump tribe of survivors holding on meanly after the vast majority of our species has perished, horribly?

That may be the inevitable progress of our unique species, and it is inherently human accordingly, but it’s certainly not inherently humane. So the linkage between those two uses of the word “humanity” grows more and more difficult for me the longer I think about them, because it really seems to me that, on many planes, humanity is at its most humane when it is being its most in-human, and humanity is at its most inhumane when it is being its most human. Oh, the humanity! Look how inhumane it is!

I suspect alien interplanetary observers would come to the same conclusion, and then might want to hire that supernatural 14th Century marketing firm themselves before they head on to their next assignations: “Oh, hey, look, it’s the Aliens! Word is, they are decent and good and fair, despite their different colored skin and hair! Aliens, huttah!” I mean, we English speakers are just really full of ourselves and bursting with linguistic bullshit when we use the very word with which we name ourselves as a synonym for all the goodness in the world, right? It boggles this already boggled mind.

Let’s pause and take a deep breath at this point. I certainly appreciate that this is a high-minded rant that I’m embarking on here, and I certainly do not wish to imply that I am any better (or worse) (or different) than the rest of humanity, by any stretch of the imagination. While I may not be one of the 100 Human Beings Most Responsible for Destroying Our Planet in the Name of Profit (if you click no other link here, click that one, please), there’s still plastic in my trash can, and dead animal parts in my refrigerator, and hatreds in my heart, so I would not set myself up as any sort of paragon of the ways in which human beings can, actually, be and do good. I’m doing my part to destroy the planet too, whether I want to or not, because I am human, and that is what we do, collectively.

But even in the face of our unrelenting, unstoppable destructive natures as a collective, there are individuals around us who do good, and act benevolently, and show kindness, and practice care, representing that classical 14th Century marketing sense of the word “humanity” in their everyday lives and activities. There might even be some small groups of people who can do that on a consistent basis over time, but I think that it is an inherent flaw in our species that when you put too many of us together, by choice or by circumstance, we become inhumane to all those beyond the scopes and spheres where our individual perception allows us to see individual goodness shining more brightly than the collective awfulness to which we inevitably succumb. Jesus’ teachings were sublime and profound, to cite but one of many examples. But most churches that attempt to teach and (worse) enforce them today are horrible and cruel, with the largest ones usually being the most egregious, inhumane offenders.

How many humans do you have to put together into pockets of humanity before our innate lack of humaneness emerges? I would suspect the number aligns with the size of the average hunter-gatherer community before we learned to write and record stories about ourselves, and then justify why one community’s stories were better or more correct than its neighbors’ were, and I suspect that in our twilight years as a species, after we’ve mostly destroyed the planet, we’ll rediscover that number again as we cluster in small groups, optimistically in our ancestral savannas, but more likely in the wreckage of our great cities, after they have spread to cover the inhabitable surface of the Earth, and imploded upon themselves.

I found close resonances in my emergent thinking on this topic in the writings of Sixteenth Century French philosopher Michel de Montaigne, most especially in his essay “On Cruelty,” where he wrote:

Those natures that are sanguinary towards beasts discover a natural proneness to cruelty. After they had accustomed themselves at Rome to spectacles of the slaughter of animals, they proceeded to those of the slaughter of men, of gladiators. Nature has herself, I fear, imprinted in man a kind of instinct to inhumanity.

Along similar lines, St Anselm of Canterbury has served as a sort of a quiet mascot for me in this series, since I indirectly captured the word “credidero” from his writings, and I’ve been researching some of his major works in parallel with this project. In De Casu Diaboli (“On The Devil’s Fall,” circa 1085), Anselm argued that there are two forms of good — “justice” and “benefit” — and two forms of evil — “injustice” and “harm.” All rational beings (note that Anselm is including the inhuman angelic caste in this discourse) seek benefit and shun harm on their own account, but independent choice permits them to abandon justice. Some angels chose their own happiness in preference to justice and were punished by God for their injustice with less happiness. We know them now as devils. The angels who upheld justice before their own happiness, on the other hand, were rewarded by God with such happiness that they are now incapable of sin, there being no happiness left for them to seek in opposition to the bounds of justice. Poor humanity (noun), meanwhile, retained the theoretical capacity to choose justice over benefit, but, because of our collective fall from grace, we are incapable of doing so in practice except by God’s divine grace, via the satisfaction theory of atonement.

At bottom, then, Anselm ultimately found humanity collectively damaged, closer in temperament to devils than angels, and salvageable only by the intervention of the humane, though in-human, God of Abraham, and his Son, who became human, so that other humans could kill him. As humans do.

These two quotes eventually carried me back to the very first thing I typed on this ever-growing page over a month ago, and likely the very first thought that you as a reader had, when presented with the word “inhumanity:” the oft-stated concept of “man’s inhumanity to man,” which has become something of a cliche through over-use. Do you know where the phrase comes from? I didn’t, though I guessed it was likely Shakespeare, since so many eloquent turns of phrase of that ubiquity in our language come from his works.

My guess was wrong, though: it was first documented in English in Man Was Made to Mourn: A Dirge (1784) by Robert Burns:

Many and sharp the num’rous ills
Inwoven with our frame!
More pointed still we make ourselves
Regret, remorse, and shame!
And man, whose heav’n-erected face
The smiles of love adorn, –
Man’s inhumanity to man
Makes countless thousands mourn!

Burns, too, accepts at face value the inherent awfulness “inwoven with our frame,” and notes that through our active choices and actions, we’re more than capable of becoming ever more awful yet.

Once again, we return to the linguistic twist: humanity is at its most humane when it is being its most in-human, and humanity is at its most inhumane when it is being its most human. Can we extrapolate a statement of action from that conundrum thusly: the only way we will save humanity is to reject humanity, and the only way we will be humane is by being inhuman?

It seems and sounds absurd to frame an argument in those terms on the surface, and yet . . . human beings readily embrace the inhuman when they pray to God and ask his Son to come live in their hearts, guaranteeing their eternal salvation, and human beings embrace the inhuman when they accept a Gaia concept of the planet as a single living entity upon which we are analogous to a particularly noxious strain of mold on an orange, and human beings embrace the inhuman when we look to the cosmos around us in the hopes that we are not alone, and that whoever or whatever is out there might yet save us.

I could rattle off dozens of other examples of the ways in which humans embrace the inhuman in the hopes of becoming more humane, and all of them carry more than a whiff of deus ex machina about them, as they all involve elements from outside a system being injected into a system to sustain a system. But you know what? That feels right to me. That feels like the one viable path out of an endless do-loop of inhumane humanity, and I suspect that’s why all cultures, throughout our history, have created stories and religions and narratives that seek to guide humanity’s future through the examples of non-human actors, be they other living things on our planet, or mystical beings beyond it.

I doubt that any one of them is any better or any worse than any other, so long as they focus individuals and small groups (remember, we get horrible en masse, always) on goodness at a scale perceivable to the perceiver, and receivable by a receiver. Maybe this explains why I feel compelled to speak out loud to animals when I meet them on my perambulations, as just a small personal act of embracing the inhuman around me, perhaps creating feelings and moods and resonances that might then make me a better human being to the other human beings with whom I interact. Maybe Marcia and Katelin embrace the inhuman in similar ways through their yoga practice. Maybe you embrace the inhuman in a church, or a temple, or a mosque, or in a forest meadow, or atop a mountain, or beneath the eyepiece of a massive telescope.

Maybe we all become better, more humane humans, the more we embrace the inhuman-ity around us. It’s a squishy proposition, sure, but my gut tells me it’s the right one . . .

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this fifth article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Seven: “Creativity”

“Nice flowers, Burns. Gimme ’em, or else I’ll sock you one . . . “


 

Credidero #4: Absurdity

My father was born and raised in Albemarle, a North Carolina Piedmont mill and rail town near the Uwharrie Mountains. He left there after college to embark on a long and successful Marine Corps career, living and traveling around the world, but his parents stayed on in the same house on Melchor Drive until they died, Papas before Grannies, both passing when I was in my twenties.

While I never lived in Albemarle, I had two decades’ worth of grandparent visits there, with many fond memories still held dear of those mostly gentle days. Until I developed teenage cynicism and ennui, one of my favorite things about going to Albemarle was hunkering down in a comfy chair to read my grandmother’s copy of The Golden Treasury of Poetry, edited by Louis Untermeyer. I have that battered copy of the book to this day, as my aunt gave it to me after my grandmother died, knowing that no one else had ever read or loved it as much as I did.

(Amusing [to me] side note: The book was given to my grandmother by her friend, whom everyone called “Miz Doby,” in June 1966. I opened it today and looked at the frontispiece inscription and smiled to realize that I still do not know what Miz Doby’s first name was, since she just signed it “E. Doby.” They were both elementary school teachers, so presumably the book was originally intended for my grandmother’s students, before I laid claim to it).

As is often the case with big hard-covers that are regularly handled by children, the spine of the book is cracked, there are stains throughout it, and it’s clear to see where the most-loved, most-read pages were, as they’ve been bent back, breaking the glue that held the pages to the spine. If I just set the Untermeyer book on its spine and let it fall open as it will, it drops to pages 208 and 209, containing Lewis Carroll’s “Jabberwocky” and “Humpty Dumpty’s Recitation.” If I flip to other broken-open pages, I see these poems:

  • “The Owl and the Pussy-Cat” and “Calico Pie” by Edward Lear
  • “The Tale of Custard the Dragon” by Ogden Nash
  • “Old Mother Hubbard” by Sarah Catherine Martin
  • “The Butterfly’s Ball” by William Roscoe
  • “How To Know The Wild Animals” by Carolyn Wells
  • “Poor Old Lady, She Swallowed a Fly” by Unknown

Some of these poets and some of the poems are better known than the others, but they all do share one prominent recurring similarity: they are all nonsense verses, rhythmically engaging to the ear, deeply earnest in laying out terrific tales without any meaningful anchors in the real world whatsoever. They and others like them could readily be described as “absurdities,” which my desktop dictionary defines as “things that are extremely unreasonable, so as to be foolish or not taken seriously.”

I can still recite “Jabberwocky” by heart half a century on, and my early love of the absurd has pervasively infused both the inputs into my intellectual development, and the outputs of my own creative work, throughout my entire life, and likely through however many years I have remaining before me.  Indulge me three examples on the output side, please: these are short poems that I wrote when I was in my 30s or 40s, clearly related to, and likely inspired by, the doggerel, wordplay, and rhythmic whimsy of those gentler children’s poems in the Untermeyer collection:

“Tales of Brave Ulysses S. Vanderbilt, Jr.”

I don’t know how to make this damn thing go
James Monroe won it in the hammer throw
Won it very long ago
Won it in the hammer throw

Time goes by while we’re learning how to fly
William Bligh dreamed of sour rhubarb pie
Dreamed it with his inner eye
Dreamed of sour rhubarb pie

On the sea, Bligh and Monroe sail with me
One degree south of Nashville, Tennessee
South of Rome and Galilee
South of Nashville, Tennessee

Home at last, feeling like an age has past
Thomas Nast drew us through his looking glass
Drew us as we crossed the pass
Drew us through his looking glass

I don’t know how to make this damn thing go
Even so, sell it quick to Holy Joe
Sell it painted red Bordeaux
Sell it quick to Holy Joe

Sell it with a piping crow
Sell it for a load of dough
Sell it at the minstrel show
Sell it, man, and then let’s go

“Field Agents”

“Let him out, he’s coming now, he’s alone,”
(I can not tolerate the taste of this megaphone).
Deep in the coop, the fox, he sees that some hens have flown,
his cover’s blown, (tympanic bone, Rosetta stone).

And then the hawk drops down from his perch on high,
(spearing the fox through, he lets out a little cry),
Justice is quick here, we stand and we watch him die,
I dunno why (fluorescent dye, blueberry pie).

We pull the poor poultry out from the killing floor
(some of the pups get sick there in the feath’ry gore),
out on the lawn, we stack them up and note the score:
it’s twenty-four (esprit de corps, espectador).

Back in the barn, now, safe in our little stalls
(I watch those damn bugs climbing around the walls),
We sleep and eat hay, waiting ’til duty calls,
as the time crawls (Niagara Falls, no one recalls).

“Natural History”

The ammonites farmed with diazinon
to kill eurypterids beneath the soil.
Which perished there in darkness ‘neath the lawn,
but rose in eighty million years as oil,
which dinosaurs refined for natural gas
to cook their giant land sloths on steel spits.
As sloths were butchered, forests made of grass
rose from the plains to hide the black tar pits,
where trilobites would swim to lay their eggs.
Their larvae flew and bit the mastodons,
while tiny primates scampered round their legs,
feeding on the fresh diazinon.
At night, the primates fidget as they dream
of interstellar rockets powered by steam.

What do these, or the many other poems like them that I have written over the years, mean? Damned if I know. But damned if I also don’t think that they provide better insights into my own psyche and mental processes than the more lucid prose I write professionally and for pleasure. My brain’s a messy thing, and there’s a lot of stuff going  on inside it that doesn’t make a bit of sense, but which nevertheless consumes a fair amount of internal attention and firepower. These absurd little nuggets spill out of my brain easily and frequently, and I enjoy extracting and preserving them. They seem to reflect a particular lens through which I often view the world: it’s astigmatic, has finger-prints on it, is lightly coated with something greasy and opaque that can be rubbed around but not removed, and there are spider cracks latticed throughout its wobbly concave surfaces.

So many of my tastes in the various arts align closely and clearly with this warped view of the world, as though my internal center of absurdity vibrates in recognition and appreciation when presented with similarly incongruous external stimuli. Examples: I have been drawn to surrealist paintings since early childhood, I regularly read books in which language and mood are far more important than linear plot or narrative, and I once did a little feature on the films that move me most, titled: My Favorite Movies That Don’t Make Any Sense At All.

I must admit that since rolling the online dice three weeks ago to decide which of my Credidero topics I would cover this month, I have had to repeatedly tamp down the very strong urge, prompted by the word “absurdity,” to merrily write 3,000+ words of absolutely meaningless gibberish wordplay and call it “done,” rather than actually considering what “absurdity” really means, and processing what I really think and believe about it. And that initial, innate reaction to just be absurd, as I do, has made this a more challenging topic for me to write about than ones that have come before it. Whenever I thought about how to frame the narrative, I always found myself in some sort of “eyeball looking at itself” scenario, an impossible infinite do-loop of self-reflection where I know the mirror and the object reflected within it are both irregularly warped and pointed in different directions, and I don’t (and can’t) quite know what the true image is.

I must also admit that this isn’t the first time I’ve reflected on such matters, even without the formal structure of a public writing project. I have long found that the easiest way to break out of a wobbly self-reflective do-loop has been to create and export a new loop, so I can look at it from the outside, not the inside. When I read the poems reproduced above today (and there are a lot like them in my collection), they strike me as relics of just that type of act or urge: I wrote them as absurdities, I see them as absurdities now, I embrace those absurdities, I know that I created those absurdities, I know that the act of creating them was absurd, and that any attempt to explain them would be equally absurd.

But at least those bits of absurdity now reside outside of me, self-contained and complete, where I can see them more clearly, rather than having them whirring on blurry spindles within me, occasionally shooting off sparks that ignite other bits of weird kindling lodged along the exposed and frayed wiring of a gazillion neurons packed inside my skull. They mean nothing to me objectively, but they mean everything to me subjectively, because they’re so closely aligned with the ways that I think, and what I think about, and how I view the world around me — or at least how I view some world around me, even if it’s not the one I actually live in.

Pretty absurd, huh?

When I do try to order my thoughts on this topic in ways that can be meaningfully communicated to others, I’m struck by the fact that many of the poems in Untermeyer’s great poetry collection for young people are just as absurd as mine are, and just as absurd as the playground chants that kids around the world somehow seem to learn by osmosis, or the songs we sing to little ones, or the goofy talking animal imagery of countless children’s films and television shows. Utterly absurd! All of it, and all of them! But they love it, don’t they, and we seem to love giving it to them, don’t we? When we describe the whimsy of those ridiculous art forms as “absurd,” we imbue the word with fun, and frolic, and laughter and light. Look at the smiles! Look at them! Joy!

Then minutes later, we turn from our young ones, and we check our Twitter feeds or pick up news magazines or turn on our televisions and are confronted with words, actions, or events precipitated by political figures with whom we disagree, and we may scowlingly brand their actions or activities as “absurd” with vehemence, and bitterness, and anger, and darkness in our hearts. Absurdity is somehow colored in different hues when it manifests itself in real-world ways outside of the acts of the creative class, or outside of the bubble of childhood. And rightly so, as is most profoundly illustrated in our current political clime, where elected or appointed public figures routinely engage in acts or spew forth words that are (to again quote the dictionary) “extremely unreasonable, so as to be foolish or not taken seriously.” 

It is to our own peril, unfortunately, when we don’t take such manifestations of public, political absurdity seriously. Talking animals don’t kill people. Absurd public policies do. Nonce and portmanteau words don’t break people’s souls. Propaganda and hate speech do.  Surrealistic imagery does not poison minds. Unrealistic demagoguery does. Absurd fantasy stories about non-scientific worlds do not destroy the real world. Absurd fantasy policies anchored in non-scientific worldviews do — and there is only one real world within which they function and do harm, no matter how fabulously untethered their sources may be.

People with severe mental illness may act publicly in absurd ways, and we sympathetically view that as a part of their pathology. But what are we to make of people without such pathologies who consciously, actively engage in absurd behaviors specifically designed to remove value and meaning from the lives of others? I’d move them from the absurd pile to the evil pile, frankly. And we’d all be better off were we to rid ourselves of their noxious influences, which is why the fact that 50%+ of our country-folk don’t bother to vote at all is, in itself, utterly absurd.

There’s a vast repository of philosophical thought and writing (from Camus and  Kierkegaard, most prominently) dedicated to understanding absurdity and the ways in which it manifests itself in our lives, and how we are supposed to respond to or function in its grip. Not surprisingly, the philosophy of absurdism is built on the same “dark” theoretical frameworks as existentialism and nihilism, where there is a fundamental conflict between our desire to imbue our lives with value and meaning, and our inability to find such objective worth within an irrational universe that has no meaning, but just is. Once again, the nonsense that is charming when fictionalized for children is often appalling when framed as the architecture within which adult humans function. Why try, when in the end we all die, and we will never know why?

It’s easy for me to embrace and understand my own sense of inner absurdity as an adjunct to the whimsical absurdity of youth, but not so easy to reconcile my inner landscape with the often awful external vistas associated with public, political, and philosophical absurdity. Can I love one and hate the other, or is that in itself an absurd mental position? Is there meaning to be found between those poles, or is life just a pointless, endless Sisyphean push up a hill until the rock crushes us for the last time?

I took a stab at framing my thoughts on why we are what we are some years back, and, of course, I framed it as an absurdist piece called “Seawater Sack Guy Speaks.” If pressed about the article and what it says or means, or why I wrote it, I’ll usually frame it as something more akin to the absurd whimsy of youth, ha ha ha, but if I’m honest here, it’s really a bit more than that, and there’s more objective truth about what I believe, or what I will have believed (credidero), within it than there is in most of my absurd writings. It begins thusly . . .

There’s an explanation for why we exist in the form we do, and I know what it is.

We are all about moving little pieces of the ocean from one place to the other. That’s all we are: sacks of seawater that can convert solar energy into locomotive force, so that we can move our little pieces of the ocean around. Unlike most seawater sacks, though, we are conscious of our selves, and this consciousness leads us to question our primary universal role as movers of hydrogen, oxygen, salts and minerals.

Consciousness is an electrochemical process that our particular strain of seawater sacks have evolved. No better or worse or different than a tail, a gall bladder, or an appendix. Because we don’t understand how this electrochemical process works, we use the very same electrochemical process to create mystical, non-biological explanations for its workings.

And it ends with this . . .

I’m not going to be carrying any metaphysical seawater around any metaphysical heaven or hell when my sack breaks down and releases all its atoms, so I figure I should use every bit of the consciousness I’ve evolved, here and now, to enjoy my fleeting, warm, moist moment in the Sun. This is not to say that I’ve a problem with other sacks of seawater whose enjoyment of their own fleeting, warm, moist moments in the Sun involves the belief in something different. If such chemical processes provide them joy or comfort (or at least the chemical processes that cause their seawater to produce such sensations), then such is their right, and who am I to force my chemistry upon them?

I take joy and comfort from just being conscious, and consider that scientifically miraculous enough.

Is that absurd? Yes. Is it a “good” or the “bad” manifestation of absurdity? I think the former, but I know some would say that if I shared it with a child, I’d inflict harm, and some would say that walking around as an adult thinking such thoughts could readily slot me into the pathological spectrum of absurd beliefs and behaviors. And they may be right. I am absurd, I admit it, inside and out — but I am not a philosophical absurdist. I do believe we can glean meaning and value in an unfeeling, unthinking, and unknowing universe. And I do not believe that a fundamental conflict between the quest for meaning and the universe’s indifference to it drives my own inner absurdity.

When I start thinking about these Credidero articles each month, one of the first things I do is to look at the etymology of the word to be considered. “Absurdity” surprised me in its roots: it is a Late Middle English word derived from the Latin absurdum, meaning “out of tune.” That elicited a “huh!” moment from me, as I am also actively, eagerly drawn to “out of tune” music: the first time I ever read about Arnold Schoenberg’s dissonant 12-tone music, I had to hear it; the first time I ever read about the tritone (“The Devil’s Interval”), I had to find a piano so I could play it; my listening library of thousands of songs contains a high percentage of music in which standard, pleasing Western melodic structures are in short supply. I didn’t realize it, but apparently my musical tastes are absurd too. At least I am consistent.
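(A quick aside on the arithmetic behind that “out of tune” lineage, for anyone who wants it made concrete: in the equal temperament underlying the modern chromatic scale, each of the twelve semitones in an octave multiplies frequency by the same constant, 2 to the power 1/12. The tritone, at six semitones, therefore sits at a ratio of exactly the square root of 2, roughly 1.414, an irrational number that never reduces to the simple whole-number ratios our ears tend to hear as consonant. A minimal sketch in Python:)

    # Equal temperament splits the octave into 12 equal frequency steps,
    # so each semitone multiplies frequency by 2 ** (1 / 12).
    SEMITONE = 2 ** (1 / 12)

    a4 = 440.0                    # concert-pitch A
    tritone = a4 * SEMITONE ** 6  # six semitones up: the “Devil’s Interval”
    print(round(tritone, 2))      # 622.25 Hz, i.e. A440 times the square root of 2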

When I considered the concept of internal and external absurdity as a form of musical expression, I was immediately reminded of a wonderful, favorite song by Karl Bartos (ex-Kraftwerk), called “The Tuning of the World.” In it, Bartos writes about wishing that he could believe in God after seeing a haunting Laurie Anderson concert, noting:

I connect to the sound inside my mind
Closer I can’t get to the divine
I wish I could believe in God
Life would be just safe and sound
I’d build my house on solid ground
It’s rather hard to understand
Why some believe and others can’t
Who rules the tuning of the world?

I don’t know the answer to Karl’s final question there, for Karl, but to whoever rules the tuning of my own world, I am thankful that you left things in a state of wonky intonation with a lot of busted keys and clammed notes and buzzing frets, since I honestly like it better that way, absurdly enough.

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this fourth article complete, I roll the die again . . .


. . . and next month I will consider Topic Number Twelve: “Inhumanity.”

Caution: This book may detune your world.

 
