Credidero #11: Mortality

I have written ~32,000 words to date as part of my planned 2019 writing project, Credidero, including an introductory article followed by ten pieces in ten months reflecting on ten topics. Some of my regular readers may have gotten through all ten of them. Many probably haven’t. Without checking the data, I suspect that the ones with the highest readership levels were those that covered the more common, basic aspects of shared life experiences, e.g. community, security, or authority. Pretty much everybody will have to think about these topics at some point, some of them fairly regularly, and many of those folks might be inclined to see what someone else thinks about them. On the flip side, I expect the more esoteric topics — say, absurdity, complexity, or inhumanity — would have drawn lower readership levels, simply because not everybody has a need to consider such concepts regularly. So why bother investing any time in my thoughts on them?

If I’m correct in this assessment, then this month’s Credidero article — covering mortality — should be the most widely read of them all, since it’s the only topic of the twelve I’m covering that every single human being who has ever lived, is living, or ever will live has experienced or is going to experience. Some of us will face our own mortality sooner, some later, some suddenly, some after terrible lingering illness, some surrounded by loved ones, some alone, some welcoming the final curtain with a graceful bow, some raging against the dying of the light. We all experience birth (though none of us remembers it, so we can’t reflect on it), and we all experience death. In between those points, the only things that we all will share are breathing, eating, drinking, excreting, sleeping, and aging. When any of those activities stop, we die. Everything else is noise, on some plane. Or vanity, to cite the more eloquent words of the Preacher, the Son of David: “I have seen everything that is done under the sun, and behold, all is vanity and a striving after wind.”

King Solomon went on in his Old Testament Book of Ecclesiastes to lay out a rubric designed to give meaning to our experiences between birth and death, beyond the basics of biological function. His crowning instruction was “Fear God and keep his commandments, for this is the whole duty of man,” but before that, he exhorted us to (among other things) enjoy life with the ones we love, seek wisdom instead of folly, cast our bread upon the waters, share our riches with those less fortunate, etc. But even if we follow all of the Preacher’s instructions, eventually “the silver cord is snapped, or the golden bowl is broken, or the pitcher is shattered at the fountain, or the wheel broken at the cistern, and the dust returns to the earth as it was, and the spirit returns to God who gave it,” and forever “[the dead] have no more share in all that is done under the sun.”

I cite Ecclesiastes here simply because it’s a relevant text in the faith tradition in which I was raised, so I’m most familiar with it, but I just as easily could have picked just about any culture in the world, ever, and found texts related to death and dying, and how to prepare ourselves for that imminent eventuality. The universality of mortality means that death must be among the most discussed and debated topics in human experience, and as each of us wrestles with its inevitability individually, so too do we seek to find shared senses of meaning about it, through practices designed to postpone and/or mitigate our fear of death, through rituals related to the disposition of our bodies, and through spiritual traditions designed to inspire or frighten us as to what we might experience after our final exhalations.

In considering mortality for this month’s article, I kept returning to the fact that there are really two aspects to evaluate: beliefs about what happens before we die, and beliefs about what happens after we die. Most faith-based and spiritual traditions put heavy focus on the latter, presuming that we all possess some unseen living essence that will survive the death of our bodies, and will either be reborn in some form under the sun again, or experience eternal paradise or endless damnation in some non-corporeal world. Typically, such spiritual traditions also provide rules for living our physical lives that are designed to heighten the probabilities of positive outcomes for our posthumous infinities. Some focus on litanies of sins to be avoided, some focus on lists of good deeds to be done. But in either case, all of our experiences, and all of our relationships, and all of our accomplishments in our brief (cosmically speaking) physical lives are ultimately just ticks on a tote board, elements of grades to be assigned in a final judgment, precursors to a metaphysical life that’s considered to be of infinitely more worth and value than our mean slogs through the mud of measurable human experience.

As it turns out for the purposes of this series, I reflected on and wrote about my own beliefs regarding metaphysical life after death in an earlier Credidero article on eternity, so I won’t revisit them in detail this month. Suffice to say, I believe that when I die, I will not experience any lasting metaphysical consciousness or existence in any way that is identifiable as me, or by me. I will leave behind physical remains, of course, and I’ve left instructions that they should be burned, and my ashes be kept or disposed of as my surviving loved ones see fit. While I have always enjoyed visiting graveyards and cemeteries, I don’t wish to have any permanent marker placed with my name upon it when my time comes. While it won’t be my call, if asked now to identify a place where my remains most sensibly belonged for ceremonial reasons, I’d pick Stoney Creek Cemetery in South Carolina (some images and stories about it here). There are plenty of fire ant nests there, and in my sense of the perverse, I would find it apt for them to spread the little bits of me around the marsh over a period of months or years, the better to sustain whatever living things might find my constituent elements useful.

Neither of my parents will be there, though, should Stoney Creek actually be the final resting place of my scattered remains. My father is embalmed and buried at Beaufort National Cemetery (if you visit that Wikipedia link, I took the photo at the top of the page; my dad’s grave is just to the left of and below the huge live oak in the center of the shot), and my mother has directed that she wishes to join him there, an intention that I will honor as the executor of her estate. Neither of them was comfortable with the concept of cremation, and both of them place(d) high value on their remains being together in a dedicated location specifically managed as a memorial resting place for those who served in the armed forces and the spouses who sustained and supported them. So be it. I’ll honor those wishes. And I’ll likely continue to visit the cemetery and keep the graves clean and pause for moments of reflection. As one does. All good.

That being said, I’ve still never emotionally embraced the logic behind preserving a body with chemicals, putting it in an impervious (and expensive) metal box with fine decorations outside and within, then burying it all in the ground — especially in cases where the deceased believed that they were going on to some greater glory where their meat container is as meaningless as a shucked cocoon. Why preserve it? Why look at it before we close the box? Why keep it from the bugs and the plants that could make use of it? It seems most odd to me that we put such expense and effort into disposing of our bodies, beyond taking the simplest and most effective steps to ensure that our remains do not create health hazards or aesthetic displeasure for those who survive us. I suppose in the case of cultures like ancient Egypt, where the Pharaohs believed that they’d need all of their corporeal bits in the afterlife, it made some sense to keep things from decay, or if we expected to lie in state under glass, Lenin-style, for a couple of centuries. But within the precepts of most modern monotheistic religions that clearly describe a living spirit existing independent of its former body, it seems a largely meaningless excess and indulgence that preys on the emotions of the bereaved and plays into the funerary industry’s profits. But I know mine is a minority opinion.

(For a less jaded view on how our modern American funerary culture arose, I highly recommend reading Drew Gilpin Faust’s This Republic of Suffering: Death and the American Civil War. Many of the elements of contemporary death rituals in this country, secular and spiritual, trace their histories to the ways in which our Nation endured and moved on from the carnage of its greatest historic convulsion. I specifically noted above that I do not emotionally embrace the modern practices of managing the death process, because I do intellectually understand how they came to be, and why people want them that way. It just doesn’t feel right to me.)

So as I consider the posthumous elements of my own mortality, those are my basic beliefs regarding both the metaphysical and corporeal elements of what happens after I die. Which leads to the second element of the analytical dichotomy posited above: What are my beliefs about mortality before I die? Not to be facile, but the easiest way for me to answer that question on a macro basis is probably to post a picture of the tattoo on the back of my left calf:

Those words come from a song called “Amethyst Deceivers” by COIL, one of my all-time favorite musical groups, who were (the two core members are both dead) hugely influential to my creative and musical aesthetics. That quote, to me, means that I know death is coming, and that I should be mindful and respectful of that fact, and to the other living things that live with me, and will follow me, or even consume me, when I am gone. I’m a small organism in a big ecosystem, and all of us are doing what we do while we have the chance to do it, none of us any better or more worthy than any other. The sigil above the quote is the black sun, an alchemic symbol that represents the first stage of the magnum opus, illuminating the dissolution of the body, and the namesake of Harry and Caresse Crosby’s incredible Black Sun Press, many original editions from which I had the chance to research and work with in a prior professional engagement.

That sign recurs regularly in the COIL oeuvre (they were artists as well as musicians), including the lyrics of the song “Fire of The Mind” (click the link to hear it), which I’ve suggested should be played at any memorial service held on my behalf. The song’s lyrics are as follows:

Does death come alone or with eager reinforcements?
Does death come alone or with eager reinforcements?
Death is centrifugal
Solar and logical
Decadent and symmetrical
Angels are mathematical
Angels are bestial
Man is the animal
Man is the animal
The blacker the sun
The darker the dawn
Flashes from the axis
Flashes from the axis
On the hummingway to the stars
Holy holy, holy holy, holy oh holy
Holy holy, holy holy, holy
Holy holy, holy holy, holy
Man is the animal
The blacker the suns
The darker the dawn

(As a related side note, for many years, I suggested the Velvet Underground’s “Black Angel’s Death Song” as my final musical elegy, though my feelings about Lou Reed evolved over the years to a point where it seems less fitting for me now than it once did. Still, that song’s lyrics, especially the last ones — Choose to choose, Choose to choose, Choose to go — speak to me, and I know its truly abrasive music would be terribly uncomfortable for the people being forced to listen to it in the stuffy confines of a church or funeral home, which appeals to me. Have I mentioned my sense of the perverse?)

So, is it morbid that I wear that COIL quote and image on my body, and will until my body is no more? I didn’t intend it to be so, and I don’t think that it is. The tattoo celebrates the memory of artists who moved me, it reminds me of my place on the planet, and it exhorts me to be respectful of even the least attractive denizens of our amazing living world, for even they have their places, and their roles. (The song “Amethyst Deceivers” also references crows, rooks, ravens, humans, and the toxic little mushrooms that give the song its title, also all things I like). Man is the animal, indeed. One of many. The COIL quote doesn’t make me think about death, it makes me think about life. It’s not telling me to dwell on the vultures (metaphoric or otherwise) that will consume me, but rather telling me to be in the moment, alive, now, mindful, and to acknowledge the vultures on the occasions when our paths cross, graciously.

For the third time in the Credidero series, I find myself returning to an old article of mine called “Seawater Sack Guy Speaks,” which I originally wrote as light parody or absurd satire, but which, as I get older, somehow moves closer to being a sincere manifesto of sorts, though it’s still a bit more extreme in places than my real views might be. The key quote relevant to the topic of mortality is this one:

I’m not going to be carrying any metaphysical seawater around any metaphysical heaven or hell when my sack breaks down and releases all its atoms, so I figure I should use every bit of the consciousness I’ve evolved, here and now, to enjoy my fleeting, warm, moist moment in the Sun. This is not to say that I’ve a problem with other sacks of seawater whose enjoyment of their own fleeting, warm, moist moments in the Sun involves the belief in something different. If such chemical processes provide them joy or comfort (or at least the chemical processes that cause their seawater to produce such sensations), then such is their right, and who am I to force my chemistry upon them?

I take joy and comfort from just being conscious, and consider that scientifically miraculous enough.

When I actively think about my own mortality, which truly isn’t very often, I usually end up thinking and feeling along the lines of that quote, rather than finding myself consumed with existential terror and despair. (I do recognize that this might change were I given three months to live, or were I a frail 95-year-old). I don’t come out of any occasional reflections on my own mortality feeling like I must do anything and everything to push death as far away as I possibly can, but rather I come out thinking that, well, it could happen tomorrow, so I’d better do something I like doing today, and be happy doing it.

Sometimes that’s an active pursuit, sometimes it’s a passive one. I love adventure travel, as an example, but I can also have a really good day hanging out around the apartment, puttering, occasionally popping in to bother my wife with kisses and nonsense. It may not be an epic and memorable day, but it doesn’t mean it’s a bad one. If all goes well, I’ll be able to do it again tomorrow. And I’m good with that, I really am. As someone who has wrestled with anxiety, depression, addiction, chronic pain and/or neurological health issues through different significant chunks of my life, I have learned to appreciate every day that doesn’t hurt, mentally or physically. Damned if I’m going to create bad days when I don’t need to by dwelling on the inevitability of my death until I make myself unhappy.

I will admit as I was researching the topic of mortality that I felt like I should sort of think about it in ways that made me unhappy, or at least uncomfortable, but I just couldn’t really make myself do that on any meaningful emotional basis. Maybe I’m too shallow or unimaginative, or maybe I’ve just built such strong walls between my intellectual and emotional states that I can’t deploy the former to excite the latter. I found the concepts of mortality salience and its underlying terror management theory to be the most interesting new (to me) things I uncovered during my research, but they remained intellectually stimulating, not emotionally so. The Wikipedia summary of those related articles explains that:

Mortality salience engages the conflict that humans have to face both their instinct to avoid death completely, and their intellectual knowledge that avoiding death is ultimately futile. According to terror management theory, when human beings begin to contemplate their mortality and their vulnerability to death, feelings of terror emerge because of the simple fact that humans want to avoid their inevitable death. Mortality salience comes into effect, because humans contribute all of their actions to either avoiding death or distracting themselves from the contemplation of it. Thus, terror management theory asserts that almost all human activity is driven by the fear of death.

There are boodles of academic and popular writing out there to back up this premise, but it rings hollow to me when I try to apply it to my own life experience. If there’s anything about the concept of mortality that does make my soul quake on occasion, it’s not pondering my own departure, but rather pondering the departures of those close to me. I don’t have a lot of deep personal connections in my life, but the ones I do have are titanic in their import to me. If I were to outlive them all, then the ratio of “hurt” vs. “doesn’t hurt” days would probably change pretty dramatically for me.

Most couples who have been together as long as my wife and I have will pick up inside songs or phrases that speak to the nature of the relationship in casual, affectionate terms. One of ours is a song called “More Than The World” by FREEMAN (the band that Aaron Freeman, a.k.a. Gene Ween, established during a hiatus from his better-known act), which features these lines:

I can’t make it alone
I’m too dumb to be on my own
I’ve never been very strong
I love you more than the world

That would be me speaking to her, not the other way around. And while it’s been a long time since I’ve had to test the theory, the “too dumb to be on my own” line is probably still true, so I’m more frightened by that future than I am by the prospect of my own departure. I do recognize that this works both ways: while I might not spend much time or energy dreading my own flight into nothingness, those who love me likely worry about and fear my departure as much as I fear theirs. That’s the main thing that motivates me to consider longevity in my actions, despite my temperamental proclivities to embody that old joke about the stereotypical redneck’s final words: “Hey, y’all . . . watch this!”

But once again, what else can we do in the face of the ways that mortality will impact us, sooner or later, except live life to the fullest while we still can do so? As trite or pat as that might sound as a concluding sentiment for this article, it’s what I have believed, do believe, and hope to always believe. While death is ultimately just the absence of life, living is not just the absence of dying. There are so many things, big and small, that give me joy, and that I want to do, that it seems short-sighted to dwell on the time when such joys and desires are going to be snuffed out.

I’ve honestly spent more time thinking about death and dying this month as a result of writing this article than I probably have in all of the years combined since the early grieving stages that followed my father’s death in 2002. And once I finish tidying up this article and hitting the “publish” button, I’m going to get right back to happily respecting the vultures and moving my seawater around and loving my wife (and daughter) more than the world, because I can, and it’s good to do so, no matter what tomorrow might bring.

Your plumage looks very nice today, Mister Vulture. Respect!

Note: This article is part of an ongoing twelve-part writing project. I’ve used a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this eleventh article complete, I don’t need to roll the die again, since I know that December will be dedicated to that last remaining topic: “Possibility.”

All Articles In This Series:

Credidero: A Writing Project

Credidero #1: Hostility

Credidero #2: Curiosity

Credidero #3: Security

Credidero #4: Absurdity

Credidero #5: Inhumanity

Credidero #6: Creativity

Credidero #7: Community

Credidero #8: Complexity

Credidero #9: Eternity

Credidero #10: Authority

Credidero #11: Mortality

Credidero #10: Authority

Back in the mid-’90s, when I was writing for an alternative newsweekly, the features team was occasionally given a summer gang project called “How To.” Each of us was tasked with writing a piece explaining, somewhat obviously enough, how to do something at which we were (nominally) experienced and knowledgeable. Being a quirky and contrarian crew, most of us chose to explain how to do things that were of a marginal degree of usefulness to our readers, producing articles that were probably intended to be entertaining (to the authors, anyway, if not the readers) more than they were educational.

Over the course of a few years, I explained How To Write A Record Review, How To Get a Grant, How To Keep a Secret, How To Talk To a Sleeping Rock Star, and How To Be An Expert. The grant-writing one was nominally useful, objectively speaking, if you were a fundraising professional, and the record review one has long been used by a journalism professor in Texas as part of her syllabus, so I suppose that one was legitimately of some value, too. The Sleeping Rock Star one was me making lemonade out of lemons after I was given a “phoner” appointment to interview then-trending singer-songwriter Abra Moore (who was asleep when I called her), and the secrets one was a result of me leading a weird double life where I was a music critic by night and a contracting officer for a highly classified military program by day.

Of those five pieces, How To Be An Expert was the one that hewed most meaningfully to my own real experiences and beliefs, and I have returned to or referenced it regularly over the past 25+ years as a basic operating tenet in my professional life. It stems from some of the best professional advice I was ever given, very early in my post-college career, after a simple conversation with a supervisor/mentor that went like this:

“If you want to succeed here, or in any other job,” he said, “then you have to become an expert.”

I asked the obvious question: “An expert in what, sir?”

“It doesn’t matter. Just make yourself an expert in something, and when you’ve done that, you’ll be indispensable.”

I used the word “expert” in that article, because that’s what my boss said, but I just as easily could have used the word “authority,” because that’s the gist of what he was communicating to me: if people perceive you as an authority on any particular subject, then you are useful to them, and you’ll always have a place in the organization, so long as you maintain your position as the organization’s authority of record on that particular topic, or maybe on a variety of topics, if you’re really good at exploiting this concept.

When I first started contemplating this month’s Credidero article, this “be an expert” narrative sat at the center of my reflections on “authority.” I’ve spent most of my professional career in positions where I’ve been held up as an (or even the) authority on an evolving and branching stream of topics, as my work has taken me through a somewhat dizzying array of professional disciplines. I am self-aware enough, though, to know that in each and every case where I’ve been accepted as an authority on a particular topic, it was very much an act of me claiming that role, more than it was an act of others bestowing it on me — because if you say something long enough, often enough, and confidently enough, then it becomes reality, or at least is perceived as reality, and there’s really no difference between those outcomes.

My skills at self-marketing have always played into this paradigm, on top of the cultural cues and biases that benefit me by virtue of who I am and what I look like: a tall, white, older male with a degree from a “big name” college, who’s a glib speaker and solid writer, and with the ability to quickly process, retain and regurgitate a dazzling stream of facts and opinions. As such, most people are culturally conditioned to accept whatever I write, say, or do, if I offer my words of expertise confidently and with, yes, authority. There have been many times in my career when I have not been the most-trained, or most-knowledgeable, or most-experienced person in a given room or sphere on a specific topic, but people have still turned to me as “the authority,” simply because I’ve carried and presented myself as such more effectively than those around me, using the cultural privileges that are bestowed upon people like me as part and parcel of our society.

Is that fair? No, not really. But I have used it to my advantage anyway, and (more importantly, I think) to the advantage of my employers and their causes. I do not believe that I have ever used perceptions of my own authority for negative or negligent purposes, or to advance a crooked or conflicted agenda, or to denigrate, demean or disempower others who might, in fact, have more expertise than I do. I’m good at sharing credit when it’s due and when I can. That ability to advance the causes of my organizations in an authoritative way that makes people feel like they are invested in and connected to those causes is high among the traits that I believe have most contributed to my professional success over the years.

While I may claim to be an authority or an expert earlier and more forcefully than others might under similar circumstances, I also believe that I have managed those positions in ways where most people are willing to accept and reflect that authority back at me, confident that I will use it wisely, even if it is still nascent. And I say “most people” most purposefully, because I know that there is certainly a subset of my work colleagues over the years who just thought that I was a really good bullshit artist. That’s okay, I guess. I probably was. And probably still am. It’s hard to tell the difference between being a doctor and playing the role of a doctor on television sometimes, as long as you’re not performing brain surgery. I know my limits.

The word “authority” has several subtle definitional aspects to it, and I’ve only been focusing thus far on one of them: “the power to influence others, especially because of one’s commanding manner or one’s recognized knowledge about something.” This form involves being an authority (where I am the subject noun) on a given subject, which is somewhat different from having authority, where the subject noun is a standalone external right, and not me personally. That form of authority is defined as: “the power or right to give orders, make decisions, and enforce obedience.” When it comes to that form, there’s no “be an expert” bullshit or cultural bias at play, because you either have it, or you do not, typically as a result of your position within an organization.

As the CEO of a variety of nonprofits over the years, I’ve had all sorts of authority when it comes to this second definition of the term. I have had the ability to negotiate and sign contracts, take out loans, pay bills, sign checks, hire people, fire people, award grants, buy things, sell things, and a myriad of other rights that are integral and essential to the positions I’ve held. In the nonprofit sector, the ultimate fiduciary responsibility for the corporation resides in the board of directors, who are also tasked with governance and with hiring and supervising their chief executive. After that, it falls on the chief executive to manage the organization within the mission and vision established by the board of directors and ideally embodied in a strategic plan. That means I’ve had a lot of latitude to do what I thought was the right thing to do for each of my organizations, and I had the authority to implement whatever ethical and legal tactics I deemed best to getting the job done effectively and efficiently.

My understanding and living of this form of authority is also highly influenced by some of my early professional training, in this case while still at the Naval Academy, where we learned the differences and distinctions between authority, responsibility, and accountability as part of the Leadership and Management Education and Training (LMET) curriculum. At the simplest level, authority is the ability to make a decision, responsibility is the job we are tasked to do, and accountability is the way in which we answer for the work we’ve done. The balance between these three factors has an immense impact on how effectively one can function in the work environment.

For example, if an employee has a high level of responsibility, but little authority, then he or she will likely be heavily frustrated by having to seek continual approvals elsewhere while trying to achieve necessary tasks. If an employee has both high authority and high responsibility, but no accountability, then it becomes easy for him or her to just coast, knowing that there are no likely repercussions for not fulfilling expectations, and the organization will suffer as a result. On the flip side, if the accountability function is ratcheted up too high, then it becomes difficult for an employee to achieve his or her responsibilities, even with clear authority, because of the constant micro-managing attention to activities that should be free from continual oversight and evaluation. I’ve always used my LMET training in evaluating potential work situations, and then once engaged, I’ve done my best to create the proper balance between those three facets of management, for myself and for those entrusted to my supervision.

I’ve been fortunate in most of my professional roles to have identified or developed nonprofit boards that allowed me to build and maintain appropriate balance between professional authority, responsibility and accountability. But with my pending retirement from the salaried work world in a few weeks, this will change for me, as I will no longer possess authority (nor responsibility, nor accountability) as a function of the position that I hold within an organization, for the first time in well over 35 years. In most typical freelance or consulting roles, I’ll likely have defined responsibilities and accountability, sure, but not much positional authority. Which means that I will have to fall back more heavily on that first form of authority, which I can claim for myself as a function of what I know, what I can do, and how well I can communicate it. I’m okay with that, I think. I’ve proven over the years that I’m pretty good at positioning myself as an expert, and I’m also fairly adept at being accountable to myself when I need to be. (Pro tip: I’ve found that it’s helpful to publicly state intentions on this front, e.g. telling all of my readers here that I was going to write a 12-part series called “Credidero” last January made me more likely to actually do it this year. Ten down, two to go!)

A few other facets of meaning and belief emerged for me as I considered the concept of authority over the past month. The first came when I did my usual research into the etymology and history of the word to be studied for the month. “Authority” has its roots in the Latin auctor, meaning “originator” or “promoter,” and that root also produced the modern English word “author.” I like the concept that developing and claiming authority is an act undertaken by an author, in that we write our own narratives, and then (using another element of the ancient word), we must promote those narratives in order to bring them to meaningful fruition. I do this continually, in so many places and so many ways, here on this website and in my “real world” personal and professional lives. All we are is all we’ve been, so in theory, I should get ever better at this as I age, so long as I don’t ever lose the rampant curiosity that’s often the motive force and lubricant of my learning and communicating processes. We’ll see how that goes.

There was another interesting intermediate evolutionary meaning in the etymological history of this month’s Credidero word. In 13th/14th Century Old French, between the Latin auctor and the English authority, we find autorite, which was an “authoritative passage or statement, book or quotation that settles an argument, passage from Scripture; authoritative book; authoritative doctrine.” In this usage, authority wasn’t a particular person, nor a power held by said person, but rather an inhuman physical artifact that was deemed to embody decisive decision-making power. This reminds me of the most beautiful of the Gospels, which John the Evangelist opened by simply explaining that “In the beginning was the Word, and the Word was with God, and the Word was God.” While we read this metaphorically, obviously, the idea that written and spoken words may carry the purest essence of the divine within them has always been highly appealing to me.

A self-professed and self-proclaimed right of authority has more heft if the very words that anchor it are right, and true, and inspired as outward manifestations of inner truths, or local observations of universal realities. In this sense, standing as a personal authority, even without positional authority, may be a path along which or a vehicle through which legitimate and pure societal good may be promulgated and promoted. Words have immense power to foster change, if you use them wisely. I like to think this is what I’ve done in my work over the past three-plus decades, and I am hopeful that I will be able to continue to do so in the years that remain ahead of me.

But the dark flip side of this paradigm is embodied by another modern English word that derives from the Latin auctor: Authoritarian. It’s tragic and troubling to consider how relevant this word has become again in modern political practice and parlance, as weak and insecure national leaders at home and abroad expect unquestioned obedience, and act tyrannically when they do not receive it. I read an interesting interpretation of the etymology of this word, which likened it less to “authority” and more to “author,” as authoritarian leaders seek to be the masters of the fictional worlds that they create. Unfortunately, almost all of them also have positional authority, which allows them to leverage vast monetary, legislative and military machines toward their own nefarious ends. That way evil lies. And madness.

This tendency toward authoritarianism becomes all the more dismaying and tragic when leaders are propped up by corporate propaganda machines and other weak and insecure legislators who use their own positional authority to propagate their leaders’ hateful messages and paper over their childish and/or criminal behaviors, lest they rock the status quo that’s elevated them, Peter Principle style, to positions well above their apparent capabilities and capacities. I think most folks my age in the United States grew up perceiving authoritarianism as a dead or dying political system. I doubt that many of us would have imagined that we’d be close to living in it as we eyeballed our retirement years, and that the centuries-old system of checks and balances designed to protect us from it would fail for nakedly partisan political reasons. Here’s hoping that enough of us wake up and exercise the authority constitutionally bestowed upon us as voters in 2020 to turn this tide, before it sweeps us away into the type of future that dystopian science fiction writers favor.

While there’s no question that authoritarianism is a bad thing, and must be resisted by sane citizens of any state, I find it interesting how often people look through that same lens when considering any form of authority. If you go search Bartlett’s Familiar Quotations or any other similar online quote banks for the word “authority,” the vast majority of the quotes that search returns will be focused on questioning, disobeying, challenging, or dismantling authority. Now, this may be a function of the fact that the types of writers and thinkers whose quotes end up in Bartlett’s are more apt to be anti-establishment types than the average citizen, or it may just be that these sorts of “Fight the Power” epigrams are more memorable and inspirational than the “He loved Big Brother” ones are, hence their appearances in such anthologies and encyclopedias.

But I have mixed feelings about blindly conflating authoritarianism with authority, as I loathe the former, but am more than willing to accept the latter, if it’s properly earned or bestowed. To some extent, that may be a function of the fact that I’ve counted on my own authority time and time again in my professional life as a key tool to achieve the things I want to achieve, and I don’t feel that every act and every decision I’ve taken with the authority vested in, or claimed by, me should be subject to scrutiny, question or rebuttal. I give other authorities the same benefit of the doubt that I expect from other people in considering my own actions and activities. I hope that as I move into a phase of my life where my authority stems from who I am and what I do, rather than from what position I hold, that I’ll be able to still leverage such authority to achieve my desired ends. Which, hopefully, will not be authoritarian in tone or tactics.

As I read back over what I’ve written this month, I note that there are more subtle semantic dances than usual, as I seek to shoehorn “authority” into the “what I will have believed” rubric behind this Credidero series of articles. But I think that was a necessary approach to wrestling with a concept that has so many significant variables operating within closely-aligned, but not exact, definitional distinctions. When I look at the authorities around me, I value those who bring earned or acquired expertise more than I value those who are granted authority by their positions, but I still value those positional authorities, so long as they don’t become authoritarian. There’s a need for vigilance, surely, as we evaluate the various authorities that govern and shape our lives, but when all is said and done, there’s also a need for those authorities themselves, and I hope that I am able to continue authoring my own life story in a fashion that encourages others to look my way and say “Now there’s an expert. Let’s see where he’s going to take us . . . ”

When an eagle explains stuff to you, you listen . . .

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this tenth article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Three: “Mortality.” Since there’s only one topic left after that, I also know that December will be dedicated to Topic Number Two: “Possibility.” I guess those are two heady concepts with which to wrap the project! 

All Articles In This Series:

Credidero: A Writing Project

Credidero #1: Hostility

Credidero #2: Curiosity

Credidero #3: Security

Credidero #4: Absurdity

Credidero #5: Inhumanity

Credidero #6: Creativity

Credidero #7: Community

Credidero #8: Complexity

Credidero #9: Eternity

Credidero #10: Authority

Credidero #11: Mortality

 

Credidero #9: Eternity

As I pondered this month’s Credidero topic over the past thirty days, it occurred to me fairly early on that there’s a “one of these things is not like the other” facet to this particular concept, in that “Eternity” is the only one of the twelve topics that cannot be tangibly experienced by human beings in any way, because it does not actually exist in the natural world.

I could go take a walk right now and experience complexity, or hostility, or curiosity, or any of the other eight topics I’ve considered and written about before this one, but there’s no way for me to experience an infinite span of time — unless I put my absolute faith in the premise of eternal life after death, snuff myself, and evaluate never-ending time as a tree in Dante’s Forest of Suicides. Or, conversely, if I were unexpectedly squished by a bus, and all was well with my relationship with my personal Lord and Savior, Jesus Christ, at that moment, in which case I could be granted eternal bliss in the presence of The LORD and all of His angels, world without end, amen, amen.

I certainly don’t intend to do self-harm in the name of research, and I hope that there’s not a bus grill in my immediate future, so those avenues for exploring the concept of endless time are not on the table at this point. And even if they were, do I believe that my incorporeal soul would tread one of those paths when my incredibly fleeting time as a sentient seawater sack plays out? No, not really. I’ve formally directed that my bodily remains be cremated when that time comes, and they’ll presumably be scattered somewhere (informally, I’ve suggested that they should be put in a fire ant nest at Stoney Creek Cemetery), so the closest thing to eternity that the constituent bits which once were me will likely experience is a slow dispersal of elements which will be reintegrated into other living things (most likely plants, or fungi), which will feed other living things, until such time as life is exterminated from our planet’s face, or the planet itself ceases to be. And even then, some of those bits may travel through interstellar space, landing who knows where, who knows when, until the universe itself collapses, leaving behind . . . something? Maybe?

That will take a long, long time, for sure, but not an eternity, in the normal use of that word. While the earliest moments of the universe are mind-bogglingly complex and confusing, and its final moments will likely mirror that incomprehensible chaos, time as human beings understand it will have started at one point, and ended at another, a finite (though immense) period, short of the infinity required to accurately capture the core concept of eternity. Scientifically and objectively speaking, the story arc of every other human being, and every other living thing, will be exactly the same on a macro basis, and even if we aggregate all of the life spans and all of the experiences of all of things that have ever creeped, crawled and croaked across our planet’s surface, we’d still come up with a time span that approached infinity, but never actually reached it.

Eternity is, therefore, a non-existent physical state in a non-metaphysical universe. And yet, it’s a cornerstone concept of most global faith traditions, where gods always have been and always will be, and human souls are presumed to endure over never-ending time spans, once they are sparked into being. (One of the quirky things about infinity is that a thing that has no beginning and no end exists for the same amount of time as a thing that has a beginning, but no end). A logical corollary of such belief systems is that the periods of time when our souls are resident in their physical forms are essentially non-existent in the grand scheme of things, as ~80 years of corporeal life divided by an infinite number of life-after-death years equals zero, mathematically speaking. If we go to hell after death, then eternity is suffering, always. If there’s a paradise, then eternity is bliss, always. Everything that we are, and everything that we do, in our physical lives, condenses down to a single, timeless point, a toggle-switch in which the indeterminacy of forever is resolved into one of only two possible eternal states.

While I wouldn’t have understood or stated it quite that way, I can tell you that few concepts were more terrifying to me as a young person than this one, having been raised in an evangelical Christian household. The concept of The Rapture — when all believers, alive and dead, would rise to meet The LORD in glory — made eternity even more terrifying, as it could happen any time, and if it occurred during that one little moment of doubt, or that one little second after temptation had become sin, then I would be left behind to bear the tribulation, the Second Coming and the Last Judgment, after which eternal damnation or eternal salvation awaited. All I knew as a young person was that if I had been bad, I could wake up one morning to find that my parents and all of the “good” people in my life were gone. In theory, that should have helped me to behave. In practice, I sinned with great aplomb, and was just scared all of the time that I wouldn’t be quick or thorough enough in my prayers for forgiveness to dodge that incoming Rapture bullet.

This was real enough in my world that I can remember having deadly earnest conversations with friends in middle school church youth groups about what we would do if we didn’t make the cut when the Rapture came: where we would meet, how we would hide, what we would do, when finally faced with the undeniable reality of eternity, to ensure that we made the next cut together, and weren’t cast into eternal darkness and suffering. We saw it as some sort of post-apocalyptic action movie scenario, where we’d live on the run, protecting our little community at all costs from the Beast, and the Whore, and the Antichrist and their minions, faithful in our hidden catacomb headquarters, desperately repentant that we didn’t get it right the first time, determined to make amends if only given one more chance. And we had those conversations, more than once, because we all knew that we were woefully inadequate in our abilities to maintain sin-free, fully faithful lives, 24/7/365, so that the odds were stacked against us that we might all be right, true, and squared up in our faith at the precise moment when the virtuous souls began ascending. None of us pondered eternity with any expectation that it would be a positive experience, at bottom line. At least not without a whole lot of suffering before we got there, anyway.

So that’s what “eternity” meant to me through a good chunk of my formative years, a fraught concept fully anchored in an arcane belief system, and not in any observable reality — but terrifying nonetheless. That fear has abated over the ensuing decades, thankfully, and when I ponder the definition of eternity as “infinite time” now as an adult, I find that I can only perceive it at arm’s length, far more so than I can with any of the other Credidero concepts, as it has no meaningful impact or import in how I live my daily life and interact with other human beings. If I have any adult fears related to the concept, they spring from the knowledge that there are a shockingly large number of death cult zealots in positions of national leadership who are actively fomenting unrest in the Middle East in a misguided effort to hasten Armageddon and bring on the end times described by John the Revelator. I suppose eternity isn’t as frightening to them as it was to my young self, so secure are they in their faithful infallibility in the face of some final judgment. Must be nice.

Interestingly enough, the generally accepted definition of eternity as “infinite time” is (in relative terms) somewhat recent, having emerged only in the late Sixteenth Century. The ancient roots of the word are (possibly) found in the reconstructed Proto-Indo-European language’s aiw, meaning “vital [life] force.” From there we pass through the Latin aevum (age), aeviternus (great age), and aeternus (enduring). That latter form morphed into eternité in Old French, and thence into eternity in Late Middle English. The concept certainly captured long time spans over the aeons, if not infinite ones. There is also a specific philosophical usage where the word “eternity” means “outside of time,” as opposed to “sempiternity,” which is used to describe objects or concepts that exist now, and will continue to do so forever.

The crux of any discussion of eternity’s nuances, therefore, really hinges on whether the word is being used to describe very, very long time spans (which exist in our material world), or infinite ones (which do not). Which raises a second-level question: does anything infinite really exist in the observable world? If there is no infinite time, is there an infinite distance, or an infinite mass, or an infinite number of some particular object(s), or anything else that has no beginning and no end when we attempt to count or measure it? Or even anything else that has no beginning and no end and exists somewhere else in the material world beyond our view or understanding?

I’m probably going to create a vision of myself as a most terribly neurotic child by sharing this, but I have to admit that “infinity” was another concept that kept me up at night as a young person, some years before fear of eternal damnation moved to the forefront of my existential anxieties. As a child of the ’60s, I was deeply fascinated by space exploration, and read voraciously about the topic. Our understanding of the solar system was a bit simpler then, with nine planets, and a readily countable and nameable number of natural satellites, plus some junk in the asteroid belt between Mars and Jupiter. Beyond Pluto, there was Deep Space, which went on (we presumed) forever. I have specific memories of lying in bed thinking about that: I’d fly my mental space ship to Pluto, and then go further. And then further. And then further. And there would still be further to go. I could make myself woozy if I kept at it long enough, trying to comprehend space with no edge and no end. (Honestly, I could probably make myself woozy today if I thought too long about what’s out there 13.7 billion light years away from the center of the universe, at the very leading edge of the Big Bang’s reach; it’s just as mind-numbing to ponder now as it was then, if less scary).

Despite its questionable existence in the real world of tangible human experience (or our questionable ability to perceive it), infinity is a readily accessible, and useful, concept in higher mathematics, which fascinated me to no end when I was studying advanced calculus and differential equations in college. The key kluge to tangibly dealing with infinity is captured in the concept of mathematical limits, where the value of a function (or sequence) approaches some limit as the input (or index) approaches some other value. So we can say that the limit is zero as an input approaches infinity, or we can say that the limit is infinity as we approach zero, or any number of other possible permutations that can be framed by various formulae and equations. We can’t actually get to infinity, but we can understand what happens as we approach it, in perhaps simpler terms. We can also accept that anything divided by infinity is zero — but not that anything divided by zero is infinity. (I’ve seen various explanations and proofs of that concept over the years, and I accept them, though there’s still some sense of logical incongruity there for the casual mathematician).
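The limit concept described above is easy to see numerically. Here is a minimal Python sketch (the function 1/x is my own illustrative example, not something taken from the coursework mentioned above):

```python
def reciprocal(x):
    """Return 1/x, the classic function for illustrating limits."""
    return 1.0 / x

# As the input grows without bound, 1/x approaches (but never reaches) zero:
for x in [10, 10**3, 10**6, 10**9]:
    print(x, reciprocal(x))

# Running the same machinery toward zero from above, 1/x grows without
# bound: we can approach infinity, but never actually arrive there.
for x in [0.1, 0.001, 0.000001]:
    print(x, reciprocal(x))
```

Each printed pair shows the output shrinking toward zero, or exploding toward infinity, without ever landing on either, which is exactly what the formal limit notation captures.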

My math studies in college were one place where contemplating the infinite, the imaginary, and the irrational — and the ways in which they can be modeled — was actually a positive, pleasurable experience. One of the most sublime intellectual moments of my life was seeing the derivation and proof of Euler’s identity:

e^(iπ) + 1 = 0

“π,” as most know, is the ratio of a circle’s circumference to its diameter. It is an irrational number (i.e. it cannot be written as a fraction), and to the best of our knowledge, it continues irrationally infinitely; it has currently been calculated out to 31.4 trillion digits, and it never repeats in any predictable or discernible fashion. “e” is Euler’s Number, the base of natural logarithms. It has been calculated out to about 8 trillion digits, as best I can ascertain, also continuing irrationally in perpetuity. “i” is the imaginary number unit, which is the square root of -1. It cannot be calculated as it does not exist in the set of real numbers, but it’s a cornerstone concept in complex number theory. “0” is, of course, zero, the opposite of infinity, and 1 is the first non-zero natural number, and the first in the infinite sequence of natural numbers. The fact that these five numbers — discovered and/or calculated and/or understood in different times, different ways, and different places throughout history — are provably related in such an ultimately simple and elegant way still utterly blows my mind with wonder and awe, both at the natural order that produces such relationships, and at the human powers of observation that divined and codified it.
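The identity can even be checked numerically with Python’s standard cmath module (not a proof, just a quick sanity check; the tiny leftover imaginary part is floating-point rounding error, not a flaw in the identity):

```python
import cmath
import math

# e^(i*pi) + 1 should be exactly zero; floating-point arithmetic leaves
# only a negligible imaginary residue on the order of 1e-16.
result = cmath.exp(1j * math.pi) + 1
print(result)
print(abs(result) < 1e-12)
```

Three of the five famous constants appear directly in those two lines of arithmetic, and the computed result lands within a hair’s breadth of the zero that the proof promises.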

Those mathematical studies also inspired and spilled over into my creative life at the time. Around 1983, I wrote a song called “Anathematics” (there’s a demo version of it here), which included these lyrics, among others:

There’s a school of thought that is so large, it can’t be learned by one.
Six hundred monks are studying it now, but they have just begun.
The more they think, the less they know. The less they know, they’re not.
The more they’re not, the less I am. There’s more to me, I thought.
The limit is zero as we approach infinity.
The future’s uncertain, as only the past can’t be.
Anathematics explains what cannot be . . .

It’s less elegant than Euler’s Identity, certainly, but it was an attempt to try to capture the awesome confusion of the infinitely big and the infinitely small and the ways in which they overlap, taken from the viewpoint of modeling that which cannot be, rather than that which can. So essentially a poetic (and much shorter) version of what I’m doing here in this article, with a stiff beat that you most certainly cannot dance to.

There’s another way, in my life right here and right now, that I find myself reflecting on the limits of eternal time and eternal distance. My wife, daughter, and I all have the Drake Equation tattooed on our right forearms. Here it is, if you’re unfamiliar with it, along with an explanation of the terms embedded within it:

N = R* × fp × ne × fl × fi × fc × L

where N is the number of civilizations in our galaxy with which communication might be possible; R* is the average rate of star formation in our galaxy; fp is the fraction of those stars that have planets; ne is the average number of planets that can potentially support life, per star that has planets; fl is the fraction of those planets that actually develop life; fi is the fraction of life-bearing planets that develop intelligent life; fc is the fraction of civilizations that release detectable signals into space; and L is the length of time over which such civilizations broadcast.

The Drake Equation was written in 1961 by Dr Frank Drake as a probabilistic argument to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way. We know a lot more about some of the variables today than we did when Drake postulated this argument (e.g. rate of star formation, fraction of stars with planets, etc.), but for most of the variables related to life, we’re obviously still operating with an observable set of one species on one planet with the ability to cast electromagnetic signals outward to the stars, and we haven’t been doing it for very long, at all.

“L” in some ways is the most interesting variable to me, since we have no idea how long we’re going to be able to keep broadcasting before we destroy ourselves, or something else destroys us. I suspect in the grand scheme of things, it’s likely going to end up being a relatively small number. Imagine, though, if L for human and other civilizations was vastly large, approaching eternal, meaning that once a planet began broadcasting, it would broadcast forever, or at least until the collapse of the universe. I believe that were that the case, we’d be picking up myriad signals from across the galaxy, since I also believe that we are not the first planetary civilization to develop broadcast capabilities since the Milky Way emerged some 13.5 billion years ago. (Compare that to the current estimated age of the universe at 13.7 billion years . . . our galaxy was born about as early as it was physically possible for it to, if our understanding of those ancient events is accurate. Wow!)
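That linear dependence on “L” is easy to demonstrate in code. This is a minimal sketch: every parameter value below is a hypothetical placeholder of my own choosing, purely for illustration, and only the multiplicative structure of the calculation comes from Drake.

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Drake Equation: returns N, the estimated number of detectable
    civilizations in the galaxy, as a simple product of factors."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Identical hypothetical inputs except for L, the broadcasting lifetime.
base = dict(R_star=1.5, f_p=1.0, n_e=0.2, f_l=0.5, f_i=0.1, f_c=0.5)
short_lived = drake(**base, L=500)        # civilizations broadcast ~500 years
long_lived = drake(**base, L=5_000_000)   # civilizations broadcast ~5M years

# N scales linearly with L: a 10,000x longer lifetime means 10,000x
# more civilizations detectable at any given moment.
print(short_lived, long_lived)
```

Under these made-up numbers the short-lived case yields an N of only a few, which matches the intuition above: a small “L” could leave a galaxy that has hosted many civilizations looking empty at any given moment.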

Given the immense distances at play, I’m not sure that we’d ever actually meet any of the other civilizations, but it would be transformative for humans on a planetary basis to know that we’re not alone, rather than simply believing it. It would also be truly revelatory to know that our sentient non-human colleagues in our universe are not metaphysical in nature (e.g. angels, demons, gods and goddesses), but exist instead in the knowable, experiential world of real things. I’m not a dewy-eyed optimist about how that knowledge would instantly make everything better on earth (we’d likely still be prone to inhumanity in our dealings with others of our species), but it would certainly answer a lot of big questions, and it would certainly present some big opportunities.

After we got the Drake Equation tattoos, my wife summarized what she thinks when she looks at hers thusly: “It reminds me that we are small, but special.” True that, for sure, for now. Given the fact that a longer “L” for humanity means we would have a higher probability of eventually demonstrating that “N” is greater than 1, I’d be most inclined to adopt and hew to a belief structure and practice that’s anchored in managing our lives, our cultures, our civilizations and our planet in ways that increase the likelihood of extending “L” for as long as humanly possible. It seems to me that a belief in and commitment to the tangible (though as yet indeterminate) time span “L” is of greater utility than being afraid of and/or longing for a metaphysical eternity and what it might (though probably doesn’t) represent and contain.

So is anybody up for starting The Church of Maximum “L,” with a defining core belief that “N” is greater than one, if we can only stick around long enough to establish contact and connect? I’d be a darned good early apostle if you need one.

Two-thirds of the family’s Drake Equation tattoos, freshly inked . . .

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this ninth article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Five: “Authority.”


On Community

Note: Here is my “Leading Thoughts” column from the September 2019 edition of TREE Press, the monthly gazette of TREE Fund. You can read the latest and back editions, and subscribe to future installments, by clicking here. This article was adapted from a much longer piece written earlier this year, and available on my website here, for those who are interested in reading more about my views on “community.”

If you were to create a word cloud of every document, article, letter, and email I’ve written during my four-plus years as President and CEO of TREE Fund, I suspect that after the obvious mission-related words — tree, forest, research, endowment, education, arborist, etc. —  the word that would show up most frequently would be “community.” I use it all the time, referring to the Tour des Trees as our primary community engagement event, discussing how our work helps the global tree care community, noting that our work focuses on the importance of urban and community forests by promoting research designed to benefit whole communities of trees and related organisms (including humans), rather than individual specimens or species.

If you ran that same word cloud for the four years before I arrived at TREE Fund, I suspect you would not see “community” ranked so highly in our communications. We used to refer to the Tour des Trees as our primary fundraising event, and we discussed how our work benefited the tree care industry, and how our efforts advanced arboriculture, with much of our research focused on individual plant response, rather than forests as a whole. This change in language was not necessarily an organizational shift driven by some strategic planning decision, nor was it a modification to what we do and how we do it directed by our Board or emergent outside forces. It was frankly just me shaping the narrative about the organization I lead, and how I want it to be perceived.

Calling the Tour des Trees just a “fundraising event,” for example, misses the critical component of how we interact with people as we roll on our way throughout the week, providing education and outreach to help people understand our work and how it benefits them. Saying that we work only for the “tree care industry” seems somehow antiseptic to me, implying that the businesses are more important than the community of people they employ, who collectively engage in the hands-on work of caring for trees. “Urban and community forests” is a helpful rubric in expressing the full scope of our focus, evoking and including big city park spaces, street trees, yard trees and trees along utility rights of way in suburbs, exurbs, and rural spaces. And thinking more about communities of trees, rather than individual plants, helps us better understand and communicate the exciting, emergent science exploring the ways that trees have evolved as communal organisms, and not just as disconnected individuals.

I think my focus on the word “community” is indicative of its deep importance to me, personally and professionally. My desire over the past four years, and hopefully into the future, is that TREE Fund acts and is perceived as part of something bigger and more connected than our relatively small physical, financial and personnel structure might otherwise dictate. I have been awed, truly, by the immense generosity, enthusiasm, wisdom and diligence of the global tree care community, and it has been an honor for me to be a small member of that great collective body, which works wonders, and makes a difference.

Getting ready to rejoin this great community of tree-loving cyclists again this weekend. You can click the photo if you want to make a last minute Tour des Trees gift to support the cause!

Credidero #8: Complexity

The concepts of “complexity” and “divinity” seem to be inextricably interwoven in much of Western religious and cultural thought. One of the most famous renderings of this philosophical and teleological duality is “The Watchmaker Analogy,” which was explored at length in English clergyman William Paley’s 1802 treatise Natural Theology or Evidences of the Existence and Attributes of the Deity. Paley opened his tome thusly:

In crossing a heath, suppose I pitched my foot against a stone, and were asked how the stone came to be there; I might possibly answer, that, for anything I knew to the contrary, it had lain there forever: nor would it perhaps be very easy to show the absurdity of this answer. But suppose I had found a watch upon the ground, and it should be inquired how the watch happened to be in that place; I should hardly think of the answer I had before given, that for anything I knew, the watch might have always been there . . . Every indication of contrivance, every manifestation of design, which existed in the watch, exists in the works of nature; with the difference, on the side of nature, of being greater or more, and that in a degree which exceeds all computation.

The gist of Paley’s argument boils down to the presumption that when you find a watch, there must be a watchmaker.  So, therefore, when you find a stone, there must also be a stonemaker. And then when you find a perfectly articulated shoulder joint, there must be a perfectly articulated shoulder joint-maker. And then when you find a flaming bag of poop, there must be a flaming bag of poop-maker. Well, okay, actually Reverend Paley didn’t mention that last one. It was just the anchor concept from a humorous collaborative piece I wrote many years ago, in which some colleagues and I envisioned a dialog between Charles Darwin (in Hell) and The LORD about Paley’s Watchmaker Analogy. It piqued my curiosity enough to explore it all those years ago, even if in a satirical form, and it was the very first thing that popped to my mind when I rolled the 12-sided die last month and had “Complexity” selected as the topic of this month’s Credidero article.

Charles Darwin himself also spent a fair amount of time thinking about The Watchmaker Analogy, well before he went to Hell, even. Darwin was aware of and fond of Paley’s work, and scholars have theorized, with clear reason and reasoning, that Darwin’s explanations of natural selection in On The Origin of Species are actually framed and intended as respectful scientific counter-arguments to those made in Natural Theology.  Even Richard Dawkins, the high priest of neo-atheists and father of all memes, evokes Paley in the title of his influential 1986 tome The Blind Watchmaker. The good Reverend’s final book remains in print, and is a cornerstone text in modern “intelligent design” circles. Those are sure some long and limber legs for such a nominally arcane older text.

Given his longstanding popularity and cultural resonance, if you want to frame arguments for or against complexity as a function of a divine creator, Paley’s as good of a starting point as you’re likely to find. Unless, of course, you’re too much of a fundamentalist to see his work as anything more than a derivative text, and you just want to jump straight to the opening lines of the primary text upon which all of Western (and by Western, I mean American) religious culture has been erected:  “And the Earth was without form, and void; and darkness was upon the face of the deep. And the spirit of The LORD moved upon the face of the waters. And The LORD said, let there be light: and there was light,” sayeth the Book of Genesis, which many Evangelical types interpreteth as the literal Word of The Lord, their God and Savior. A few verses later, as we mostly all know, The LORD went on to make day and night, and stars and sky, and land and seas, and the sun and the moon, and plants and animals, and mankind and naps, with each day’s creations more complex than the ones that came before.

As the very first appearance of The LORD in The Bible highlights His ability to create complexity where none existed before, that seems to be the professional trait of which He (or his public relations team) is most proud, and He continues to conjure up something from nothing (stepping up complexity every time) throughout the Old Testament, in between all the smiting and the flooding and the worrying about what women are up to with their bodies that He so seems to enjoy in His spare time. Then later, when The LORD’s only begotten Son decides to unveil his own formidable chops as proof of his divinity in the New Testament, He does it by creating wine from water at a wedding party, simply by adding that magically divine special ingredient: complexity. Bro-heem Christ could have just ended his career right there and still been a legend.

The underlying viewpoint that increasing complexity requires some force greater than that which mere humans can muster isn’t restricted to matters of natural science. Case in point: Erich Von Däniken’s 1968 book Chariots of the Gods? Unsolved Mysteries of the Past, and the many sequels and imitators in print and on screen that followed it. The central argument of these tomes is that the design and construction skills behind ancient objects like the Great Pyramids or Nazca Lines or Stonehenge or the Easter Island statues were complex beyond human capabilities at the time, and therefore must have required inhuman assistance, only in this case not from The LORD, but rather from super-intelligent extraterrestrial beings.

I must admit that I ate those books up as a kid, their logic seeming so very obvious and profound to my 10- to 12-year-old mind. But I’d certainly raise my eyebrows and smirk these days at anybody over the age of 16 or so who cited them as part of their mature understanding of the world in which we live, just as I do with people who consider the works of Ayn Rand to be rational adult fare. If we can’t figure out how something complex was built or got done, it seems like intellectual defeatism to simply attribute it to super-powerful unseen entities — either divine or extraterrestrial or John Galt — rather than just working harder to figure it out, and then recognizing that humanity’s ability to create complex objects and artifacts does not necessarily proceed in a linear fashion.

We cannot build a Saturn V rocket today, to cite but one of many examples. That doesn’t mean that those epic machines were built by Jesus, or dropped on the launch pad at Cape Canaveral by Bug-Eyed Monsters. It just means that the industrial base required to build them no longer exists, because our Nation’s economic, political and military needs evolved. Which I strongly suspect is also the case with pretty much every one of Erich Von Däniken’s examples of the allegedly extra-human skills required to build all of those ancient wonders. Humans, then, now, and in the future, are capable of great complexity in our creations, and if we care enough about something — putting a man on the moon and bringing him home safely before the end of the decade, building a tomb that will last forever, or impressing the ladies on the other side of the island with the chunky heft and knee-melting girth of our massive stone heads — then we’ll work marvels large and small to get ‘er done.

So why do so many people default to the notion that immense complexity requires some form of divinity as its motive force? I suspect it is because the natural order of things is to reduce complexity — bodies to ashes to dirt to dust, cities to ruins to iron to rust — so when we see something, anything, pushing valiantly against the never-ending corruption of eternal entropy, we are temperamentally inclined to judge it special, and meaningful, and not just an anomalous series of natural processes organized in a particular way that slows or reduces or offsets entropic forces for some (likely brief) period of time. In the observable universe, entropy always wins in the end, so if we want to believe that acts and examples of increasing complexity are permanent, then a belief in something outside of or beyond that which can be known and understood with our senses and intellects is a darn good way to avoid dwelling on the fact that we and our creative works are going to die and become dirt at some point in the very, very near future, speaking in cosmic time scales.

In looking at casual Christian theology and practice, I sort of like the next order of logical thought that tumbles from this one: if complexity is the realm of God the Watchmaker, then is its opposite, entropy, the particular expertise of Lucifer, His Enemy? The written record on Beelzebub is certainly rife with destruction, and fall, and spoilage, and violation, and war, and madness, and off the cuff, I’d be hard-pressed to think of examples where the Crimson King showed complex, creative chops like those the Father, Son and Holy Spirit trot out at the slightest provocation to bedazzle their admirers. God makes, the Devil breaks, and as long as The LORD stays one step ahead of his nemesis, order prevails.

But if The LORD spent too much time watching over one particularly needy tiny sparrow, would Old Scratch turn the tables on Him (and us) and pull apart the fabric of the known and knowable? I think that when the Beast and the False Prophet and the Dragon are finally cast into the Lake of Fire in the Apostle John’s Book of Revelation, what we’re seeing is actually a metaphorical depiction of the final removal of entropy from the world. I’m guessing that the New Heaven and New Earth and New Jerusalem were seen by Saint John on Patmos as the most fabulously complex constructs imaginable in his time, and that most readers of the Apocalypse since then also envision them in such terms, defined by the norms of whatever time and place that they are pondered. The LORD’s not gonna come live with His people in a humble casita or pre-fab double wide now, is He? Nuh uh. The buildings in that glorious end-of-times city are going to have flying buttresses upon their flying buttresses, and there might even be a Saturn V pad in every yard, plus unlimited pocket watches available upon demand.

I recognize, of course, that I’m being a bit silly here in my analysis of complexity as it’s defined by The Watchmaker Analogy, just as I was being a bit silly when I first wrote The Flaming Bag of Poop-Maker circa 2003-2004. And I guess that’s because whenever I think about that particular argument for the existence of a Supreme Being, it just seems so very obviously and inherently ridiculous to me that responding in kind is the only logical approach to tackling it. There are so many arguments for the existence of God, and so many of them seem more sound and embraceable to me than Reverend Paley’s. I suppose my opinion might be different if I actually thought that The LORD created the world over seven days, some 6,000 years ago (thank you very much, Archbishop Ussher), but given 4.5 billion years for our planet’s natural forces to do their things, with the universe as a whole having a roughly 9.3 billion year head start on our stellar system, I’m not in the least bit surprised by magnitudes of complexity far beyond all human understanding, since we’ve only been collectively pondering such matters for (at most) about 0.2% of Earth’s natural history.

To be clear, though, this does not in any way mean that I do not marvel regularly at the complexity of creation, even if creation created itself. I’m truly and deeply awed by so many complex natural things, from the little creepy-crawly ones that I rescue when I see them on sidewalks to the immense ones light years and light years away from us that I gaze at in stupefaction during the (increasingly rare) times when I have an unobstructed view of a night sky free from light pollution. I’m amazed by the complexity of my own body (creaky as it is), and by the complexity of my own brain’s machinations (awake and asleep), and by the complexity of the sea of emotions in which I swim, loving this, ignoring that, hating the other. I’m well read, reasonably smart, and actively interested in understanding how things work, and I still can barely perceive the tiniest bits of what natural selection has wrought upon the living things around me, as we all hop atop a ridiculously complex ball of elements and minerals and fluids, all governed by forces strong and weak, gravitational and electromagnetic.

Really and truly, I don’t perceive natural complexity as proof of divinity, I perceive natural complexity as divinity in its own right. The complex and ever-evolving canons of chemistry, physics and biology are the closest things I’d admit to admiring as sacred primary texts. I could spend a lifetime studying them, and understand as much as my brain could possibly absorb, and still I would be awed beyond comprehension by the complexity of the natural order in which I function, for the very short, sweet, warm time that I’m blessed to be a self-regulating blob of motile biochemical materials, animated by a denser blob inside my beautifully complex upper bony structure, within which everything that is really, truly me resides, amazingly and incredibly distinct from all of the universe’s possible not-me’s.

Bottom line: I don’t need to worship a fanciful Watchmaker, because I am perfectly content to worship the Watch itself. And the stone next to the Watch on Reverend Paley’s heath. And the tiny dinosaurs that hop around the stone, cheep cheep cheep! And the moo-cow that might pass us all by, chewing a cud rich with uncountable organic oozes, as I talk to the Cheep Cheeps, wishing I had some sunflower seeds in my pocket for them. And the 4,000 miles of metal and stone between me and the Earth’s center as I look down, and the uncountably, immeasurably vast distances above me as I look up, gazing billions of years into the past, perceiving light from way back then as it arrives in the right now on its way to the yet to come. There’s nothing in the Book of Genesis that can rival that, if we’re going to fairly assess things. Nor in Atlas Shrugged.

And now I’ve swung from a most silly approach to assessing complexity to a most abstractly profound one, likely more than two standard deviations away at both ends of the spectrum from how normal people in normal times in normal places would perceive normal complexity. Whatever “normal complexity” might be, anyway. Perhaps that’s an oxymoron? Perhaps it’s a phrase that doesn’t normally exist because it doesn’t need to? Or perhaps it’s just a simple way to describe the chaotic world in which we live and work, driven by complex forces that we often do not see, recognize or appreciate?

I’m inclined to grab that third explanation/definition when thinking about human complexity in human-driven spaces. There’s lots of stuff that we collectively create swirling around us, and when I ponder that, I’m still most drawn to the most complex examples of it, most of the time. I like the Ramones well enough, to cite a musical example, but I adore the far-more-complex King Crimson. Likewise in my taste for visual arts, where extreme abstraction and deeply technical compositions move me far more than literal still lifes and figure studies. My list of top movies is also rife with multi-layered surrealist complexities, while I tend to forget simpler character-based rom coms hours after I watch them. Books? I’ll take the complex Gormenghast Trilogy and The Islanders and The Flounder over the simpler The Old Man and the Sea and Of Mice and Men and The Call of the Wild any day. Man-made creative complexity is good in my eyes. It resonates with me. It moves me. It inspires wonder in me. It represents the spaces where we become most Watchmakery, to return to Reverend Paley’s paradigm.

There’s one weird exception when it comes to my love for complexity, and that would be work. I’m a deep devotee of the “keep it simple, stupid” paradigm in the office, and if you interviewed anybody who’s ever worked for me over the past 30+ years, they’d likely cite my penchant for process streamlining and organizational simplification, and my loathing for clerical redundancy and structural inefficiency. When it comes to my professional work, less complex is more desirable, almost to a point of fetishism. I suppose this could be explained altruistically, with me taking the position that my time equates to my organizations’ money, so that deploying my own time and the other human resources around me most efficiently represents a truly ethical approach to stretching our resources as far as they may be stretched. But I think the honest reality is that I view work as a thing that has to be completed before I can play with the complex things that move me more deeply, so by taking the fewest moves possible to achieve desired professional outcomes, I can preserve the energy I need to take the most moves possible toward the complexities that most amuse and entertain and inspire me. “Wasting time on the man’s dime, yo!” There’s a professional creed to motivate the masses, for sure.

If simple work is the opposite of complex fun, just as entropy is the opposite of creation, just as the Devil is the opposite of the Watchmaker, then we’ve got to wrap back around to opening arguments and conclude by accepting that work must be the purview of Satan, and play must be the purview of God, and that we model ourselves most clearly in His image when we frolic in fields of phlox and fescue and philosophy and felicity and feeling and friends and family and festivity and fun.

I’m ultimately happy to embrace such a simple truth when staring into the awesome face of such a stupidly, gloriously complex universe as ours!

Step aside, simple ones! Complex Nazca Line Building Alien coming through!

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this eighth article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Eleven: “Eternity”

All Articles In This Series:

Credidero: A Writing Project

Credidero #1: Hostility

Credidero #2: Curiosity

Credidero #3: Security

Credidero #4: Absurdity

Credidero #5: Inhumanity

Credidero #6: Creativity

Credidero #7: Community

Credidero #8: Complexity

Credidero #9: Eternity

Credidero #10: Authority

Credidero #11: Mortality


Credidero #7: Community

If you were to create a word cloud of every document, article, letter, and email I’ve written during my four years as President and CEO of TREE Fund, I suspect that after the obvious mission-related words — tree, forest, research, endowment, education, arborist, etc. —  the word that would show up most frequently would be “community.” I use it all the time, referring to the Tour des Trees as our primary community engagement event, discussing how our work helps the global tree care community, noting that our work focuses on the importance of urban and community forests, by promoting research designed to benefit whole communities of trees and related organisms (including humans), rather than individual specimens or species.

If you ran that same word cloud for the four years before I arrived at TREE Fund, you most likely would not see “community” ranked so highly in our communications. We used to refer to the Tour des Trees as our primary fundraising event, and we discussed how our work benefited the tree care industry, and how our efforts advanced arboriculture, with much of our research focused on individual plants, rather than their collectives. This change in language was not an organizational shift driven by some strategic planning decision, nor was it a modification to what we do and how we do it directed by our Board or emergent outside forces. It was frankly just me shaping the narrative about the organization I lead, and how I want it to be perceived.

Calling the Tour des Trees a “fundraising event,” for example, misses the critical component of how we interact with people as we roll on our way throughout the week, providing education and outreach to help people understand our work and how it benefits them. Saying that we work for the “tree care industry” seems somehow crass and antiseptic to me, implying that the businesses are more important than the people who collectively engage in the hands-on work of caring for trees. “Urban forests” can be confusing to folks in its evocation of big city park spaces, even though street trees, yard trees and trees along utility rights of way in suburbs, exurbs, and rural spaces are also part of our mission’s purview. And thinking first of communities of trees, rather than individual plants, helps us better understand and communicate the exciting, emergent science exploring the ways that trees have evolved as communal organisms, sharing information and nutrients through root-based symbiotic networks.

I’d be fibbing if I said that I had purposefully made these and other related linguistic changes as part of an intentional, organized shift in tone. It just happened as I went along, and it honestly didn’t actively occur to me that I had done it in so many ways and places until I started thinking about this month’s Credidero article. But the changes are clearly there, evidence of the fact that it’s somehow deeply important to me, personally and professionally, that TREE Fund acts and is perceived as part of something bigger and more connected than our relatively small physical, financial and personnel structure might otherwise dictate. I do believe that words have power, and if you say something often enough, and loudly enough, people begin to perceive it as true, and then it actually becomes true, even if nothing has really changed except the word(s) we use to describe ourselves and our activities.

So why is “community” such an important and transformative word in my personal worldview? As I normally do in these articles when thinking about questions like that one, I looked into the word’s etymology: it comes to us English-speakers via the Old French comuneté, which in turn came from the Latin communitas, which ultimately boils down to something “shared in common.” But there’s a deeper layer in the Latin root that’s preserved to this day in cultural anthropology, where communitas refers to (per Wiki) “an unstructured state in which all members of a community are equal allowing them to share a common experience, usually through a rite of passage.”

The interesting corollary here, of course, is that those who do not or cannot participate in that rite of passage may neither partake of nor receive the benefits of communitas. Peter Gabriel’s “Not One Of Us” has long been one of my favorite songs, both musically (Gabriel, Robert Fripp, John Giblin and Jerry Marotta put in some sublime performances here) and lyrically, with one line standing out to me as a key bit of deep wisdom, writ large in its simplicity: “How can we be in, if there is no outside?” That deepest root of the word “community” captures that sense of exclusion: there’s a positive sense of belonging for those who have crossed the threshold for inclusion, while those who haven’t done so are (to again quote Mister Gabriel) “not one of us.”

So are many (most?) communities perhaps defined not so much by who they include, but rather by who they exclude? I suspect that may be the case. When I first arrived at TREE Fund, for example, I had a couple of early encounters and experiences where folks communicated to me, explicitly and implicitly, that they saw TREE Fund not as a cooperative symbiote, but rather as a predatory parasite, on the collective body of tree care professionals and their employers. I was also made to feel uncomfortable in a few situations by my lack of hands-on experience in professional tree care, including the fact that I had no certification, training, or credentialing as an arborist or an urban forester. I had not passed through the “rite of passage” that would have allowed me to partake of the tree peoples’ communitas, and so in the eyes of some members of that community I was (and probably still remain) on the outside, not the inside. So my push over the past four years for TREE Fund to be an integral part of a larger professional community may be, if I’m honest and self-reflective, as much about making me feel included as it is about advancing the organization.

When I look bigger and broader beyond TREE Fund, I certainly still see a lot of that “inside/outside” paradigm when it comes to the ways in which we collectively organize ourselves into communities, locally, regionally, nationally, and globally, oftentimes along increasingly “tribal” political lines, e.g. Blue States vs Red States, Republicans vs Democrats, Wealthy vs Poor, Christian vs Muslim vs Jew, Liberal vs Conservative, Citizen vs Immigrant, Brexit vs Remain, etc. Not only do we self-sort online and in our reading and viewing habits, but increasingly more and more people are choosing to live, work, date, marry, and socialize only within circles of self-mirroring “insiders,” ever more deeply affirming our sense that the “others” are not like us, are not part of our communities, and may in some ways be less important, less interesting, less deserving, or even less human than we are.

That’s certainly the narrative being spun by our President right now through social media, spoken statements, and policy initiatives, as he seems adamantly opposed to “an unstructured state in which all members of a community are equal.” Which is dismaying, given the allegedly self-evident truths we define and hold in our Nation’s organizational documents, ostensibly designed to bind us as a community under the leadership of a duly-elected Executive, who is supposed to represent us all. That said, of course, we know that the infrastructure of our great national experiment was flawed from its inception in the ways that it branded some people as less than fully human, and some people as not qualified to participate in the democratic process, due to their skin color or their gender. I’d obviously like to think that we’re past those problems, some 250 years on, but the daily headlines we’re bombarded with indicate otherwise. Insiders will always need outsiders . . . and communities may often only feel good about themselves by feeling bad toward those they exclude. I suppose several thousand years of history show that this may well be a core part of what we are as human beings (I explored that theme more in the Inhumanity Credidero article), and that aspiring to create positive communities of inclusion may be one of the nobler acts that we can pursue.

I’m stating the obvious in noting that the ways we can and do build community, for better or for worse, have radically changed over the past 25 years or so with the emergence of the world wide web and the transformations made possible by it. If you’d asked me to describe what “community” meant to me before 1993, when I first got online, I’d likely have focused on neighborhoods, or churches, or fraternal organizations or such like. I’d say that within less than a year of my first forays into the internet’s kooky series of tubes, though, I was already thinking of and using the word “community” to refer to folks I romped and stomped with online, most of whom I’d never met, nor ever would meet, “in real life.”

I wasn’t alone, as the word “community” has become ever more widely and casually used over the years to describe clusters of physically remote individuals interacting collectively online, via an ever-evolving spectrum of technological applications, from ARPANET to the World Wide Web, from bulletin boards to LISTSERVs, from mailing lists to MMORPGs, from blogs to tweets, and from Cyber-Yugoslavia to Six Degrees to Friendster to Orkut to Xanga to Myspace to LinkedIn to Facebook to Twitter to Instagram to whatever the next killer community-building app might be. I actually wrote a piece about this topic ten years or so ago for the Chapel + Cultural Center‘s newsletter, and at the time I used the following themes and rubrics to frame what community meant to me:

  • An organized group of individuals;
  • Resident in a specific locality;
  • Interdependent and interacting within a particular environment;
  • Defined by social, religious, occupational, ethnic or other discrete considerations;
  • Sharing common interests;
  • Of common cultural or historical heritage;
  • Sharing governance, laws and values;
  • Perceived or perceiving itself as distinct in some way from the larger society in which it exists.

And I think I stand by that today, noting that a “specific locality” or “a particular environment” may be defined by virtual boundaries, rather than physical or geographical ones. But then other elements embedded within those defining traits raise more difficult questions and considerations, including (but not limited to):

  • What, exactly, is an individual in a world where identity is mutable? Is a lurker who never comments a member of a community? Is a sockpuppet a member of a community? Are anonymous posters members of a community? If a person plays in an online role-playing game as three different characters, is he one or three members of the community?
  • How are culture and historical heritage defined in a world where a six-month old post or product is considered ancient? Do technical platforms (e.g. WordPress vs. Twitter vs. Instagram, etc.) define culture? Does history outside of the online community count toward defining said community?
  • What constitutes shared governance online? Who elects or appoints those who govern, however loosely, and does it matter whether they are paid or not for their service to the group? What are their powers? Are those powers fairly and equitably enforced, and what are the ramifications and consequences when they are not? Is a virtual dictatorship a community?

I opined then, and I still believe, that there is a fundamental flaw with online communities in that virtual gatherings cannot fully replicate physical gatherings, as their impacts are limited to but two senses: sight and sound. While these two senses are clearly those most closely associated with “higher” intellectual function, learning and spirituality, the physical act of gathering or meeting in the flesh is much richer, as it combines those cerebral perceptive elements with the deeper, more primal, brain stem responses that we have to taste, touch and smell stimuli. While I’m sure that developers and designers and scientists are working to figure out ways to bring the other three senses into meaningful play in the digital world, a quarter century after I first got online, there’s been no meaningful public progress on that front, and I am not sure that I expect it in the next quarter century either.

Image resolution and visual interactivity get better and better (especially on the virtual reality front), while sound quality actually seems to get worse and worse over time, when we compare ear buds and “loudness war” mixes to the warm analog glory days of tube amps and box speakers — but that’s it, really. And as long as we are existing digitally in only two senses, exchanging messages online removes any ability to experience the physical reality of actually touching another person, be it through a handshake, a kiss, a squeeze of the arm or a pat on the back. The nuances of facial expression and inflection are lost in e-mails and texts, often leading to confusion or alarm where none was required or intended. There is no ability to taste and feel the texture of the food we discuss in a chat room. It still seems to me that the physical act of community building is a visceral one that appeals to, and perhaps requires, all of our senses, not just those that can be compressed into two dimensions on our computer screens.

I still believe that two-dimensional communities are, ultimately, destined to disappoint us sooner or later for precisely that reason. I have certainly spent countless interesting hours within them — but if you plotted a curve over time, my engagement grows smaller by the year. While people often compare the dawn of the Internet era to the dawn of the printing press era, it’s important to note that the earlier cataclysmic shift in the way that information was preserved and presented (from spoken word to widely-available printed material) did not result in the elimination of the physical gatherings upon which all of our innate senses of community have been defined and built for centuries, in the way that the Internet era has. Communication happens more readily now, for sure, and communities may be built almost instantaneously, but they’re not likely to have all of the lasting resonances that their traditional in-person counterparts might offer.

I note, of course, that my feelings on this topic are no doubt influenced by the fact that my adulthood straddles the pre-Internet and post-Internet divide. I was raised without computers and cell phones and instantaneous access to pretty much anybody I wanted to connect with, anywhere in the world, so my communities couldn’t be accessed or engaged while sitting alone in my bedroom. I don’t know how many people have been born since 1993, but many (most?) of them, having been fully raised in the digital world, may not be wired (no pun intended) to feel that distinction. And when I continue to make that distinction, they likely see me in the ways that I once would have perceived a grouchy old man shaking his fist and shouting “Get off my lawn, you kids!”

Generational issues aside, I do think that some of the uglier aspects of online communities — bullying, hateful echo chambers, exploitation of weaker members, cruelty hidden behind anonymity — are blights on our culture and our souls, and are having direct cause-effect impacts on the nastiness of our modern social and political discourse. If Twitter and Facebook and other social media sites were shut down tomorrow, a lot of online communities would cease to exist, sure, but the impact of that global loss of connection would not necessarily be a net negative one. But the genie’s out of the bottle on that front, I suppose, as barring a full-scale catastrophic failure of the global communication network, communities (ugly and beautiful alike) will just emerge in new virtual spaces, rather than those billions of people returning en masse to traditional, in-person community building.

But some of them might. And I might be one of them. I’ve written here before about being “longtime online” and often a very early adopter of new platforms and technologies as they’ve emerged. But somewhere in the past decade or so, I stopped making leaps forward in the ways that I communicate with people and engage with communities online. The next thrilling app just sort of stopped being any more thrilling than the one I was already using, so inertia won out. I bailed on Facebook around 2012, and have used Twitter almost exclusively to communicate online (outside of this blog) between then and last month, when I decided to let that go too.

Beyond social media, I have had several online forum-based communities in which I was very active over the years (Xnet2, Upstate Wasted/Ether [defunct], The Collider Board [defunct], The Fall Online Forum, etc.), and those have mostly fallen by the wayside as well. I’ve retained some very meaningful communications with some good friends from those spaces via email and occasional in-person meetings, but it’s one-on-one connection between us for the most part, and not dialog and group play unfolding in public before the rest of the community. And, again, I think I’m finding it easy to walk away from those public communities, for the most part, because the personal depth of the connections I’ve made gets shallower as the years go on, and even some of the long-term connections just sort of run their courses and stagnate, because there’s really no organic way for the relationships to grow or advance in any meaningful way.

Maybe again this is just a me-getting-older issue, but I get more richness of experience within my communities that exist in real space, and real time, than I used to, and I get less from my online connections. A desire to move more toward that probably played some psychological part in how hard I pushed the word “community” in my professional life, trying to build one there, not only through online means, but also through the scores of conferences that I’ve attended over the years, with tree care scientists and practitioners from around the world. That is a good community. I believe that I have improved TREE Fund’s standing within it. And that feels good.

Part of the cost of doing that, though, was really failing to become part of any meaningful real-world community where I actually lived in Chicago, and also being separated from the little community that means the most to me: my family. A big part of my decision to retire this year was the desire to get those misaligned priorities back in order, and I think that as we look forward to our next move as a family, whenever and wherever it is, I’ll be more inclined to put the effort in to make new community connections there, rather than just hanging out on the computer chatting about arcane subjects with what Marcia fondly refers to as my “imaginary friends.”

One of my personal goals for the Credidero (reminder: it means “I will have believed”) project was to spend a month or so considering and researching a given topic, and then exploring how I felt about it, not just what I thought about it, to see if there were some new insights or perspectives for me, perhaps as articles of faith, or different lenses through which to view my world going forward. Somewhat ironically, this month’s “community” topic has been the hardest for me to consider and write about, almost entirely because I’ve already spent so much time thinking about it and writing about it over the years that I already have a stronger set of well-formed beliefs on the topic than I’ve had on any of the others thus far.

How I act on those beliefs, though, I think is evolving, hopefully in ways that connect me more meaningfully with more local and in-person communities, rather than spending so much time alone (in real life) while sort of together (in virtual space). I imagine that retirement, with all the newly available time it entails, will be a much richer experience that way. Less thinking and writing about community all by myself, and more experiencing community with others.

And on that note, I think I’m going to go sit out by the pool and see if there’s anybody to talk to . . .

A community of tree people and cyclists. More fun in person than online!

Note: This article is part of an ongoing twelve-part writing project. I’m using a random online dice roller to select a monthly topic from a series of twelve pre-selected themes. With this seventh article complete, I roll the die again . . .

. . . and next month I will consider Topic Number Four: “Complexity”

All Articles In This Series:

Credidero: A Writing Project

Credidero #1: Hostility

Credidero #2: Curiosity

Credidero #3: Security

Credidero #4: Absurdity

Credidero #5: Inhumanity

Credidero #6: Creativity

Credidero #7: Community

Credidero #8: Complexity

Credidero #9: Eternity

Credidero #10: Authority

Credidero #11: Mortality