Author Archives: johnfjeansonne@gmail.com

About johnfjeansonne@gmail.com

Newsday sportswriter emeritus and adjunct professor in the Hofstra University school of communications.

Not exactly déjà vu

Might the Strauss-Howe Generational Theory—a psychohistorical supposition that describes recurring cycles—apply here? The return of the University of Missouri football team to the Gator Bowl during the recent holidays, though obviously a minor occurrence among life’s circumstances, nevertheless was a vivid ghost-of-Christmastime-past appearance for this old Mizzou alum.

In December of 1968, my senior year, I was one of two football beat reporters for our Journalism School’s Columbia Missourian daily newspaper. (Remember newspapers?) That fall, classmate Joe Rhein and I covered the footballers’ exploits in what—believe it or not—was our scholarly duty, our semester’s assignment for a reporting class. Which made the so-called “Missouri Method” of J-School—learning by doing—about as much fun as one can have without laughing. (Though we certainly had some yuks along the way.)

Rhein and I alternated the driving chores to away games in Kentucky, Illinois, Nebraska, Kansas and Oklahoma, mixed in with witnessing five home games. And when the gridders had the good fortune of being invited to a high-visibility post-season event in Jacksonville, Fla., Rhein and I were afforded special scholarships (valued at a whopping $100 apiece) to fly to the Sunshine State and chronicle the team’s preparation and bowl participation.

Over the years—many, many years—I continued attempting to commit sports journalism for Long Island’s Newsday, assigned to Super Bowls, soccer World Cups, Olympic Games, Triple Crown races, tennis Grand Slams, NBA playoffs, March Madness, World Series, Stanley Cup playoffs, on and on—during which I was introduced to various cultures, fascinating people and exotic locales. Still, the ’68 Gator Bowl was an early step into the Big Time.

So to notice, more than a generation later, that the Mizzou lads were returning to that scene from 57 years ago conjured the old Mark Twain quote: “History doesn’t repeat itself, but it often rhymes.”

Almost everything about this sort of do-over was different. In 1968, the Gator was a big deal, one of only 10 major college bowl games. This winter, there were 41. That ’68 game was played in the original Gator Bowl stadium, demolished in 1994 after 48 years. But the Gator Bowl game, now burdened with one of those sponsored titles so common in sport’s relentless money grab, still is played on the same site, in what is now the NFL Jacksonville Jaguars’ palace, and has retained the west upper deck and ramping system from the original infrastructure.

The old joint, the one I attended in ’68 and again on assignment for Newsday to cover the 1980 Gator Bowl—because it featured that season’s Heisman Trophy winner George Rogers of South Carolina—had been home to the eponymous bowl game since 1946. That included the one in 1955 that was the first nationally televised bowl game.

The Beatles, during their first American tour in 1964, had played the Gator Bowl, though only after the Fab Four demanded that concert organizers nix plans for a segregated audience. The Gator Bowl stadium also was host to the annual Florida-Georgia game, in which a football rivalry broke out amid the repeatedly raucous tailgating that caused the event to be known as the “World’s Largest Outdoor Cocktail Party.”

All of this history is just a reminder of how old I am. Yet that theory devised by William Strauss—author, playwright, theater director, lecturer—and Neil Howe—author, consultant, senior associate at the Center for Strategic and International Studies’ Global Aging Initiative—asserts that stuff that happened long ago comes back. Major crises and societal reconstruction, things like revolution and wars.

Sure enough, 1968 was a year of global upheaval, the Vietnam War’s Tet Offensive, the assassinations of civil rights leader Martin Luther King Jr. and presidential candidate Robert F. Kennedy, widespread anti-war and civil rights protests, violent clashes at the Chicago Democratic Convention.

And here, as 2025 slipped into 2026, we have conflicts in Gaza and Ukraine, political unrest regarding migration and the Trump administration’s autocratic, bullying leanings, what feels like unremitting gun violence, uncertainty regarding AI and robotics, struggles with climate disasters.

So Missouri was back in the Gator Bowl on Dec. 27, playing (and losing to) the University of Virginia, this time while I was thousands of miles away at my daughter’s place in London, and the 1968 date with Alabama came to mind:

First day in Florida, there in late December, we saw Santas roaming around in shorts, not what blow-ins from the Midwest expected. It was 70 degrees. Rhein and I waded in the ocean and did our duty reporting on all happenings related to the team’s game preparation, including the fact that, during the team’s leisure time at the beach, assistant manager Stan Biggs had his right eye blackened by a stray surfboard.

There was a pre-game banquet at which Missouri coach Dan Devine told his kidding-on-the-square joke on Alabama’s enormously successful and widely venerated coach, Paul Bryant, whom everyone knew as Bear.

“One night in the winter,” Devine said, “Bear had just gotten into bed and Mary Harmon”—Bryant always called his wife by her full maiden name—“said to him, ‘God, your feet are cold.’ And Bear said to her, ‘You can call me Paul.’”

Days before the game, Devine dismissed a key offensive lineman from the team—a player he called the best blocking center he had had—for what he called “unwillingness to abide by team rules” and never further clarified the offense. (Would that happen now?)

In the game, Mizzou rolled Alabama like dice in a surprising 35-10 victory that didn’t include a single completed pass by the winning side. The academic research of Strauss and Howe aside, that statistic may never be repeated.

Those were the days….

 

Here’s the chorus:

I loved life as we knew it/I still can’t believe we threw it away

Goodbye, that’s all there is to it/Life as we knew it ended today.

Sound like just another musical reference to a romantic relationship?

Or a lament of the moral, legal, judicial and physical destruction of American life triggered by the results of the 2024 presidential election? Perhaps a regret, put to music, for having installed in the Oval Office an authoritarian bent on cruel treatment of the disadvantaged, on stifling dissent and speech, targeting political opponents, pardoning criminally inclined allies, bypassing the legislature, using the military for domestic control, defying the courts, controlling the news media, intimidating universities, using his power for personal profit?

I loved life as we knew it/I still can’t believe we threw it away

Goodbye, that’s all there is to it/Life as we knew it ended today.

Here’s another ditty that seems to apply to the present:

Yeah, let’s impeach the president for hijacking
Our religion and using it to get elected
Dividing our country into color

And still leaving Black people neglected.

Fact check: The first tune indeed is about a love match gone sideways, written by Walter Carter and Fred Koller and recorded by Kathy Mattea in 1988. But the echo in there, loud and clear now?…Life as we knew it thrown away?

The other example, authored by Neil Young—Canadian-born naturalized Yank—is from 1973, the year that a blowhard real estate tycoon named Donald Trump, working for his father’s New York operation, counter-sued the U.S. government for $100 million (equivalent to more than $700 million now) over charges that Trump’s properties had discriminated against Black applicants and tenants.

If this should arise on a test, the answer is pretty clear that Young was protesting Tricky Dick Nixon’s misdeeds in the White House rather than demonstrating some sort of clairvoyance 50 years into the future. But the current, overwhelming march away from life as we knew it manifests itself as what some medical experts describe as an earworm: a song that can’t be dislodged and keeps repeating itself in our heads.

I loved life as we knew it….

A John Prine lyric from a few years ago could also fit about now:

Some humans ain’t human
Some people ain’t kind
They lie through their teeth
With their head up their behind

And I’m also hearing in my head a catchy number recorded by Willie Nelson (and his friend Merle Haggard):

Now it’s all going to pot
Whether we like it or not
The best I can tell
The world’s gone to hell
And we’re sure gonna miss it a lot

Given Willie’s personal reputation for long endorsing the use of weed, there certainly is a whiff of marijuana there. But think bigger picture. The world’s gone to hell….

Music is a great thing, a soundtrack of our lives, our emotions and experiences. And not always uplifting. It can make you think.

Okay. Bottom line: I can’t sing. I pretended to play the guitar years ago; got a Beatles songbook with all the chords and so on. Like so many Boomers, I witnessed some terrific concerts, mostly enjoying the gigs by the likes of Pete Seeger and the sly Arlo Guthrie, Joan Baez. Dylan. Protest anthems. There were lots of anti-war songs around my college days, Kris Kristofferson’s “Good Christian Soldier” among the best.

‘Cause it’s hard to be a Christian soldier, when you tote a gun
And it hurts to have to watch a grown man cry
But we’re playin’ cards, writin’ home, an’ ain’t we havin’ fun
Turnin’ on and learnin’ how to die

I just read the obituary about a man named John Cleary, who had survived being shot in the chest by Ohio National Guard troops during an antiwar protest at Kent State University in 1970, a chilling moment in American history that suddenly doesn’t seem so abnormal, with ICE agents and the National Guard terrorizing citizens in Chicago and elsewhere. Back then, Neil Young weighed in…

Tin soldiers and Nixon coming/We’re finally on our own

This summer I hear the drumming/Four dead in Ohio.

It turns out that Neil Young is still holding powerful people’s feet to the fire, musically, with his rocking “Big Crime”:

Don’t need no fascist rules, don’t want no fascist schools

Don’t want soldiers walking on our streets

Got big crime in DC at the White House

There’s big crime in DC at the White House

Chorus:

No more great again, no more great again/Got big crime in DC at the White House.

Not life as we knew it.

Preface

Someday I really do intend to write a book. Seems that ought to be a requirement for a person who has made a living as something of a wordsmith—44 years as a sports journalist for Long Island’s Newsday after four years of preparatory scribbling through college, plus another decade or so of freelance newspaper and magazine work. And now with my own Substack newsletter, (Mostly) Hot Topics—musings on journalism, geezerhood and current events (beware the occasional curmudgeonly inclination).

Lots of former colleagues—most of them, it seems—meanwhile have gotten around to writing at least one book.

I haven’t checked that box, and it feels like a failure of sorts. I think of my freshman-year college roommate, Skip. He had taped a large sign above our shared dorm room desk that lectured, “Procrastination is not an art.” Which didn’t prevent him from routinely putting off studies while he played music or cards or vacated the premises altogether in pursuit of a little relaxation.

But here I linger. It’s not as if I have writer’s block. I calculate that, over the years of newspapering, in attempting to produce information and possibly profundity on deadline, I cranked out as many as 200 articles a year, at an average length of 700 words—and not the very same words, either. Total, I’ve published something like 7 million such units of language. When one considers that the typical book runs from 70,000 to 120,000 words, that theoretically translates to 65 or 70 books.

Non-fiction. But, alas, disconnected and unbound. Not a single real book. I almost could write a book about never writing a book. First sentence: Call me indolent.

Part of the problem is settling on a topic. Anything biographical, along the lines of personal war stories—I use “war” only as a metaphor for personal frontline involvement—isn’t likely to get much traction. I have not chased any white whales; set out for California after being driven out of Oklahoma by drought, economic hardship and bank foreclosures; worked as a steamboat pilot on the Mississippi. A straight memoir is out of the question.

I’ve certainly come across some fascinating characters in my decades as a journalist, people who could do the work for me of storytelling, contributing humor and insight, relating rare experiences. But my brief encounters with book agents have left me with the impression that a subject is not especially marketable unless he or she already is a ragingly successful celebrity.

Except, when I proposed a book some 45 years ago on Al Oerter, who already had won four Olympic gold medals throwing the discus but, at the time, was 43—11 years past athletic retirement and attempting to resurrect some Olympic greatness at the 1980 Games—I was told that a publisher would only be interested if I could guarantee that Oerter would pull off a fifth Olympic victory. Entirely unlikely at his age.

Oerter was a piece of work—sportsman, philosopher, regular human being. An enlightening interview subject, entertaining and thoughtful. But only a happy ending would do?

There was an old football veteran named Ray Mansfield who, days before he and his Pittsburgh Steelers mates won the 1976 Super Bowl, dismissed the idea that only heroes should be fodder for a good book. “Winning is too serious, a serious business,” he said. A better volume, he said, “would be about the old, inept Steelers [from earlier in his career]….who were so much fun to be around.”

Forget prose based on facts, real events, and real people. Perhaps, instead, the working hypothesis might be to present something unusual, quirky, amazing, shocking. Emotionally gripping. A tome based on an adventure, a dilemma, establishing a mystery the reader would want to solve. Possibly shaped into an historical novel.

Where to start, though? I have read War and Peace—587,287 words and, to my mind, in need of a good editor to trim out about half of that verbiage. The first sentence, translated from the French, is a quote, “Well, my Prince, Genoa and Lucca are now no more than possessions, estates, of the Buonaparte family”—said to set the stage for the novel’s political and social themes at the start of the Napoleonic Wars.

More of a grabber, to me, is Kurt Vonnegut’s first line of Slaughterhouse-Five: “All this happened, more or less.” Or the “Notice” preceding the Introduction to Mark Twain’s Adventures of Huckleberry Finn: “Persons attempting to find a motive in this narrative will be prosecuted; persons attempting to find a moral in it will be banished; persons attempting to find a plot in it will be shot.”

That’s the ticket! So I’m jealous. Of Tolstoy, Vonnegut, Twain or anyone else with the discipline to plan and craft any form of literary work.

There is no white smoke here. Persons waiting to come across such an output by this would-be author most likely will die of old age.

No kidding

Oh, the violence! The bloodshed!

House Speaker Mike Johnson had characterized Saturday’s No Kings protests—there were more than 2,700 nationwide, with roughly seven million participants—as “Hate America” rallies that he said would unleash “Marxists, socialists, Antifa advocates, anarchists and the pro-Hamas wing of the far-left Democrat party.” Michigan Republican Lisa McClain portrayed the brewing events as guaranteed to feature a “mob of radicals.” Other Trump GOP toadies and kowtowing lickspittles predicted terrorist activity.

So there I was at the local No Kings gathering near my Long Island home, recalling Edward R. Murrow’s dire greeting to American radio listeners in 1940, when Nazi bombs fell on the British capital for 57 consecutive nights: “This is London.” Would we all live through it?

In reality, the two-hour gathering of some 3,000 people at Long Island’s Babylon Town Hall was about as destructive—as hostile, as murderous—as a bake sale. You could encounter far more danger trying to cross any local street overrun by speeding, lane-changing knuckleheads. What the Republican leaders envisioned—what they tried to sell—was their alternate reality of invading undesirables torching public property and inflicting injury. Something akin to—ahem—a January 6 storming of the U.S. Capitol by Trump supporters.

Talk about fake news. Here’s some of the stuff I witnessed in what was nothing more than citizens’ peaceful resistance to a decidedly unpopular President:

Most of the crowd toted signs, many creatively expressing an opposition to Donald Trump’s authoritarian policies and the corruption in his administration. “Deport Dictators.” “No Kings Since 1776.” “You’re Fired!” “Never Again Is Now.” “Orange Lies Matter.” “MAGA: Morons Are Governing America.”

Plenty of American flags were waved, with chants of “U.S.A! U.S.A!” along with other rejections of Trump’s depiction of their involvement, such as “I’m Not Being Paid to Protest.”

The folks lined the side of a busy highway, generating cacophonous cheering and horn-honking from the endless stream of passing motorists, who regularly lent raised fists, thumbs-up and applause in solidarity with the demonstrators. Many on-the-move observers raised cell phones at their car windows to record the thoroughly non-threatening action.

It was festive. Gregarious. The only bit of nastiness came from two—maybe there were three—fellows among the thousands driving by who brandished a middle finger to the rallying crowd. Those gestures were answered cheerfully with peace signs and drowned out by occasional chanting.

“This is what democracy looks like!” “No Kings! No Kings!” “Hey, hey, ho, ho; Donald Trump has got to go!”

A few posters paired “86,” slang for “to cancel” or “get rid of,” with Trump’s place in the order of U.S. presidents: “86 47.” The day’s tone was nothing like “hate”; rather, it was concern. There was no haranguing or provocation. Just we vassals reminding everyone of the country’s real social order. “These,” one sign announced, “are what patriots look like.”

Some protesters wore goofy inflatable costumes. Some sported “No Kings” sweatshirts. Most, like me, had gray hair. A 26-year-old guessed that more people from his age group “don’t care as much” about current realities, though he was quick to add that he was there because “I care.” An older fellow guessed at the thinking of those absent youngsters: “What’s democracy?”

The satirist Andy Borowitz “reported” that Speaker Johnson accused participants in Saturday’s No Kings protests of “blatantly exercising their First Amendment rights,” and that “when the framers of the Constitution wrote the First Amendment, they did not intend people to take it literally.”

Worse, what Johnson actually said at the conclusion of the No Kings rallies was that “there were a lot of hateful messages,” and that “we have video and photos of pretty violent rhetoric….saying fascists must die and all the rest.”

Somehow I missed all that. There were seven policemen on hand for the assumed apocalypse I attended. The most pressing duty required of any of them, in assuring that no lives were lost, was when one of the cops offered to press the traffic signal button to allow a couple of folks, unsteady with walking canes, to navigate a crosswalk.

Really.

They are Us

Here’s a fellow who can cut through the divisive noise drowning out discourse on virtually every level these days. He comes from the ultimate us-versus-them, zero-sum world—sports—a former all-America college football player, yet he could teach us—and the leader of our country—a thing or two about identifying with the other side; about negotiation, deliberation, compromise. In a word, empathy.

At 47, Dr. Cornell Craig’s job as Vice President for Equity and Inclusion at Long Island’s Hofstra University aligns with his belief that “they are us. What we do to others we do to ourselves.” His charge is to ensure equal opportunity for all Hofstra’s students and staff at a time when the Federal Administration is targeting DEI (Diversity, Equity and Inclusion) efforts. If Craig were to speak to Donald Trump, he said, he would make the point that “in any area where I’m not benefiting from privilege, I’m connected to everyone else.

“Not that we’re the same,” he said, “but where I’m challenged, where I’m struggling, where I’m marginalized, that connects with where you’re marginalized, where you’re struggling. Instead of isolating us, that really should be a bridge to other people: You’re struggling; I’m struggling. Let’s connect on that level and help each other out. Too often, the place where we struggle builds walls: That you’re not struggling like me, you’re not feeling what I’m feeling, you’re not experiencing what I’m experiencing….

“So I’d want to reinforce that to our president. Everyone’s experience is not your experience, but those other experiences are valid. All the experiences are real. If we can just appreciate that as people.”

Entirely too reasonable, no? In a “press conference” experience for the students, Craig recently spoke to my Hofstra sports journalism class. I had sought him out as a hot topic in both the public’s long-standing interest in athletic success and the current front-page Trump campaign against widespread opportunity.

A former star wide receiver for Southern Illinois University; a learned man with three degrees from three colleges; a part-time poet and philosopher, Craig offered insights about athletes’ adjustment to retirement, about personal experiences as a Black man in a mostly white world, and especially about fair treatment to all.

Listen: “My experience in athletics really helped me in understanding dedication, commitment, putting in time, knowing you don’t start at the top but you can work your way to the top,” Craig said. But also, “As far as what influenced my non-athletic professional career, it was engaging my own experience as a Black male in the U.S., being at a predominantly white university and, while I got a lot of privilege as a student athlete, there still were other parts of my experience that I could relate to being marginalized and relate to the experience of others.”

He called the “history of Hofstra as the first fully accessible campus for people in wheelchairs very important” to his situation—the school’s realization in 1981, nine years before the Americans with Disabilities Act was passed, that simply getting around campus is something too often taken for granted. That particular awareness of “marginalization, of student disabilities and their safety, opened my eyes to others’ experience.”

Maybe, he acknowledged, the fact that the U.S. government has ordered probes of organizations practicing DEI and has pressured foreign companies with U.S. government contracts to comply, means Craig’s job is on shaky footing. Hofstra’s president, Susan Poser, has been vocal in supporting DEI, as has New York governor Kathy Hochul. “But you never know,” Craig said. “There are so many things you would have assumed a year, or two years ago, that never would happen are happening.

“If the government says, ‘You’ve got a DEI office so you’re not going to get federal money’….we would close this office.”

But his philosophy—his advice to students—is “doubt less. A guest speaker at one university where I worked said, ‘For great harm to be done, there needs to be great distance.’ Emotional distance, psychological distance, spiritual distance. That person is way over there, so that doesn’t impact me. Or this person doesn’t believe what I think, so it doesn’t matter.

“If we can close that distance…Those people are still human, so we can reduce the harm.”

He is the son of an NFL defensive back—Neal Craig played for three teams in the 1970s—and once thought he also would be part of the same world. That it didn’t come to pass also factored into his understanding of those who were Left Out. “That transition,” he said, of “searching for an identity outside athletics….from not having to introduce myself, from not going into a room and having people know who I was, took some time, some introspection.”

Along the way, with an undergraduate degree in communication, a keen interest in philosophy and the dissertation he wrote on the landmark 1896 Plessy v. Ferguson Supreme Court decision that racial segregation did not violate the U.S. Constitution, Craig settled on his belief that “it is about giving people a chance, about getting things out of the way from people having access.

“Jackie Robinson is presented as breaking the color barrier. What really happened is that Jackie Robinson was allowed to play. It was like ‘no one was good enough to that point,’ but, really, there had been a gentleman’s agreement” among baseball’s white ruling class “to keep some people from having access.

“Putting my philosopher’s hat on, the thing that connects humanity is the human experience, the frailties of the human condition—those that separate us, that one group’s better than another, that you’re less than I am. No. Those things that connect us lift us up.”

Believing is seeing

This is a full-throated tribute to eyeglasses. My first pair, when I was a freshman in high school, did wonders for my jump shot. Now, just a few years on, I offer a big shout out to Benjamin Franklin, the visionary whose keen insight led to the invention of—among other things—bifocal spectacles.

The familiar story is that Ol’ Ben, as he aged and experienced deteriorating eyesight, found he couldn’t focus on objects right in front of his face without constantly having to alternate spectacles—one pair for distance, the other for reading. Same thing happened to me just recently, after undergoing cataract surgery in each eye.

The surgeries were a rollicking success. Colors are dramatically more vibrant, the surrounding world somehow more alive. No complaints here whatsoever. Except there was a period of several weeks after those procedures—until I could get a new bifocal prescription to offset the altered visual acuity brought on by the operations—that I was back in Franklin’s early days of the 1780s, able to see fairly well at a distance but in dire need of a reading lens.

More to the point, spoiled by years of having taken bifocals for granted, I felt foiled again and again by the one-pair-on, one-pair-off shuttle of glasses. Franklin—this was a guy who created the lightning rod, the Franklin Stove, the odometer, swim fins (swim fins!), and who, as postmaster, developed efficient mail delivery routes in his city of Philadelphia—solved the issue with his “double spectacles” innovation: Cut in half the lenses from two different pairs of glasses, combine them in a single frame—top half to see far-away objects, bottom half for up-close viewing. Voila!

I have read that, around the turn of the 20th Century, the monocle—a single-lens eyeglass which required the wearer’s eyebrow and cheek muscles to hold it in place—had become not only a significant aid for reading but also, somehow, was widely considered a decorative fashion accessory. But the monocle soon got the side eye from enforcers of popular trends in personal adornment. Or maybe folks’ eyebrow and cheek muscles needed a rest.

So. Herein a new appreciation for Franklin—that grand American statesman, author—and for one of the electrifying discoveries attributed to him.

It should be noted that I never was put off by the long-ago youngsters’ schoolyard insult of “four eyes,” a form of bullying that branded all glasses-wearers as “outsiders.” The sticks-and-stones-will-break-my-bones retort was pretty effective. And, hey: Clark Kent wore glasses. As a “disguise,” yes, and one intended to render him a bit of a meek nerd, but we all knew he was Superman. (There is a website for Banton Frameworks, a United Kingdom-based designer of eyeglass frames, that chronicles the various spectacle styles of all the actors who have played Clark Kent/Superman in the movies and on television. My frames are probably closest to what the actor George Reeves wore in the old 1950s Superman TV series. Maybe a bit out of vogue….)

Listen: Lots of famous people are distinguished by their choice of eyewear: John Lennon, Harry Potter, Elton John, Gandhi, Buddy Holly. Not quite as many women come to mind, which conjures the long-out-of-date line from Dorothy Parker’s 1926 poem “News Item”: “Men seldom make passes/At girls who wear glasses.” (The 1970s song “Bette Davis Eyes” is not about her glasses.)

Anyway. Happy to experience how various eyesight problems can be dealt with, and more than glad to acknowledge Ben Franklin’s contribution. A man who figuratively could read a room, see the forest and the trees.

Over-inflated?

So a Tom Brady statue was unveiled last month outside the New England Patriots football stadium, the old quarterback depicted with a right arm raised in triumph. The thing weighs 12,300 pounds and stands 17 feet tall, but appears a bit out of proportion: the head is too small, floating above all that padding.

The obvious intent was to glorify the seven-time Super Bowl champion, so it might have been an opportunity to embody a hackneyed modern sports cliché by sculpting Brady’s little noggin on the body of a goat. Anyway, the effigy, a clothed, poor man’s version of Michelangelo’s David, feels excessively worshipful of a fellow whose most significant impact on humanity was throwing a football—accurately, yes, but just a football—and therefore maybe a tad over the top.

Not that such a rite is unusual. There are massive bronze renditions of accomplished jocks in abundance—from baseball’s Babe Ruth to golf’s Tiger Woods, boxing’s Oscar de la Hoya to football’s Johnny Unitas, soccer’s Diego Maradona to hockey’s Wayne Gretzky, as well as sculptures of coaches and sports executives—most having been unveiled while the actual human being was still alive.

But the argument here is that such forms of adoration are better reserved for long-dead figures—therefore not feeding the subjects’ self-importance, as if they are being canonized, somehow representing a purity of virtue impossible for any human being to live up to.

Penn State’s Joe Paterno had been feted for his wildly successful football coaching record with a bronze statue on campus in 2001—only to have it ignominiously removed and hidden away 11 years later. It was judged to have become a “source of division and an obstacle to healing” after Paterno was found to have covered up allegations of child sexual abuse by his veteran assistant coach. Possibly the school should have waited until Paterno’s complete history was available, and he was safely in his grave, before considering affording him such an honor.

A recent essay by Sally Jenkins in The Atlantic pondered a better use of sports-related sculptures—as representations of something beyond the individual’s specific accomplishments on the playing field. First of all, she noted, “Of all the public indignities great athletes are subjected to, from the meme to the boo to the hurled bottle, undoubtedly the worst is the bad statue. A bronze figure in a stadium plaza is so much more permanent than an insult, and the irony is that a Dwyane Wade or a Michael Jordan has to accept the thing as a compliment. The statue’s intent is to immortalize. Instead, it kills its subject dead.”

It is a common slur, after all, to describe any athlete’s resemblance to a statue, thereby invoking the image of being frozen-in-place while action swirls around him or her.

Jenkins argues that “only one truly great bronze rendering of a renowned athlete [that was] produced in recent decades is the abstract” of tennis champion Arthur Ashe at New York City’s National Tennis Center—which “surges from the earth like a lightning bolt striking upward instead of down. The sculpture, unveiled by the artist Eric Fischl in 2000 and titled Soul in Flight, is worth pausing to look at, for its instructive power and its indictment of the ponderous slabs of metallurgical debris that litter other stadiums and arenas.”

That statue isn’t really a rendering of Ashe, and is not so much lionizing his reign among jocks as a visual of wider possibilities. He was the rare athletic champion who actually connected with the real world—an activist against South African apartheid, a public face in the fight against HIV (which he had contracted through a blood transfusion after a second heart attack), an advocate for children’s education, a published historian.

There is, by the way, a statue in Richmond, Va., that captures a real-life image of Ashe, holding a tennis racket in one hand—but with a message beyond sports. Ashe is surrounded by children, with a stack of books in his other hand. It’s another tableau of wider possibilities.

Meanwhile, there happens to be a rare memorial willing to immortalize a star athlete’s infamous moment. In 2012, six years after French soccer hero Zinedine Zidane headbutted an Italian opponent in the World Cup final, that confrontation was cast in bronze and placed in Paris. Zidane had been ejected from the game for his misdeed, and France lost the match. The statue was christened “Coup de tête”—“Headbutt.”

Its sculptor, Algerian-born French artist Adel Abdessemed, said the aim of his work was to promote conversations about “stress on athletes…and the importance of dealing with issues of mental health.” Real-life stuff.

What if—in the spirit of sports’ (and human) imperfection, of the undeniable temptations to win-at-all-costs—the new Tom Brady statue had shown, in his upraised hand, an air-deficient football, recalling the January 2015 AFC Championship controversy over allegations that Brady had ordered deliberate removal of air from game footballs to aid his passing in New England’s victory over Indianapolis? Brady wound up being suspended for the first four games of the following season, and his team was fined $1 million and forced to forfeit two draft picks. That’s part of his record, too.

They could call the piece “Uninflated.”

A different sort of tennis backhand

The recent U.S. Open tennis dust-up between American Taylor Townsend and Latvia’s Jelena Ostapenko—kicked up by Ostapenko over a virtually meaningless gesture that somehow has become fairly common in the sport—begs for the insight of the late Bud Collins.

For more than a half-century before his death at 86 nine years ago, Collins was tennis’ premier historian and conscience, his informative writing and commentary brightening newspaper, magazine and television accounts. He employed humor based on his feeling that “sports wasn’t the end of the world.”

He conjured wonderful nicknames for players, mixed his even-handed and sometimes critical—but never mean—reporting with lighthearted irreverence and sly puns. On one occasion, after an Israeli tennis pro named Shlomo Glickstein executed a winning shot and NBC-TV immediately followed it with a replay, Collins alerted viewers: “Here’s Shlomo in Slow-Mo.”

Collins’ powers of observation included citing the otherwise unnoticed bottle of liquid tucked in the umpire’s chair at Wimbledon, ostensibly to refresh any player in need. In all his years at the tournament, Bud confided, no player had ever touched the stuff.

Bud was known by one and all, thoroughly recognizable with his bald head, big smile, sweater thrown jauntily over his shoulders and pants with patterns so loud they could speak for themselves. A colleague among us sports journalists, Bill Glauber of the Baltimore Sun, once referred to him as “the human press credential,” able to move as freely around those crowded, chaotic competition venues as a Rod Laver or Billie Jean King.

So: What might Bud have said about the Townsend-Ostapenko fracas, in which Ostapenko accused Townsend of failing to acknowledge a lucky bounce during a straight-sets second-round victory? The transgression in question was the routine by a player, benefitting from a so-called net-cord winner—when his or her ball ticks the top of the net and falls, unplayable, to the opponent’s side of the court—offering a (hardly sincere) apologetic gesture of a raised hand or raised racket to the point’s victim.

It was Townsend skipping that little nicety that apparently set off Ostapenko—that and what Ostapenko characterized as Townsend rudely starting their pre-match warmup at the net rather than the baseline. After the match, Ostapenko shook a finger at Townsend and accused her of having “no class” and “no education.”

(Several of her fellow pros noted Ostapenko’s documented history of questionable gamesmanship. In the Townsend match, she appeared to be messing with Townsend’s concentration by taking a lengthy bathroom break, begging a timeout after a lost game and challenging an electronic line call. Plus, there were the condescending optics of Ostapenko, a white Latvian, publicly scolding Townsend, who is Black, about “class” and “education.”)

All this as if Townsend, or anyone else, could be capable of engineering a shot that would catch just enough of the net cord to stop, think, then drop barely into Ostapenko’s side of the court.

That’s the sort of thing Bud Collins surely would have used to put the situation in perspective, as well as adding some real education about the sport and its history. Bud had published, in 1980, a 665-page “Modern Encyclopedia of Tennis,” and in that tome—along with exhaustive accounts of the sport’s notable players and matches—is a detailed glossary of the sport’s rules and all manner of “tennis lingo.” There are basic definitions, everything from “ace” to “Wimbledon,” including what a “net cord” is.

Better, and more up Bud’s alley, is the description of the no-longer extant “net judge”—“An official,” the encyclopedia clarified, “seated at one end of the net, usually below the umpire’s chair, to detect ‘lets’ on serve. During the serve, he rests one hand on the net cord to feel whether the ball hits the top of the net. If it does, he calls ‘Net!’ and the serve is replayed as a ‘let’ if the ball lands in the proper court.”

Bud always called that official “Fingers Fortescue,” though there clearly were different people, men and women, on duty at different events. Alas, technology since has eliminated the job and, anyway, the Ostapenko complaint wouldn’t involve the devoted work of “Fingers” because a ball catching the net already in play—as opposed to one on a serve—remains in play.

Furthermore, Ostapenko’s claim that “there are some rules in tennis which most of the players follow and it was [the] first time ever that this happened to me on tour” clearly mischaracterized previous acts of what is, at most, merely a courtesy. There certainly is no rule requiring the bit of politeness, which hardly is a real apology for winning a point—what former major tournament champion Svetlana Kuznetsova has called a “typical ‘Sorry; not sorry.’”

“Here’s the thing about tennis etiquette,” Aron Solomon posted on the Tennisuptodate website. “It’s not the Magna Carta. There’s no line in the rulebook requiring you to murmur ‘sorry’ after a net-cord dribbler….What is in the rulebook—and what the sport actually has to enforce—is the simple idea that you can’t verbally abuse your opponent.”

In the end, Ostapenko issued what she described as an apology for her outburst, pleading that “English is not my native language, so when I said ‘education’ I was speaking only about what I believe as tennis etiquette….”

Now that is a “Sorry, not sorry.” And Bud Collins would have added a pertinent, probably humorous anecdote. And possibly a quote from Fingers Fortescue.

Streetwise

One way to navigate London roads—those zigzagging, mostly narrow paths often without their names posted at intersections; the one-ways, the dead ends, the traffic circles—is to follow a 5-year-old grandson on his bike from his school to the local playground and then to his home. That works quite well. But that covers only about 6 miles, and London has 70,000 streets—just a few of them straight—that traverse roughly 9,200 miles.

There is a website, nextvacay.com, that judges London’s roadways the second most difficult in the world to navigate—only No. 2? With Toronto first?—and there is no argument here.

This is not to disparage the United Kingdom’s capital. London is a swell place, diverse and alive. Aside from the abundance of must-see attractions, there are parks and playgrounds in abundance; everywhere are runners, bicyclists, children, dogs. Outdoor marketplaces and pubs bustle.

There is the possibility of a night at the theatre—equal to New York City’s Broadway fare—a day at one of many museums or an uplifting classical music concert at St. Martin-in-the-Fields. A lunch of fish-and-chips. Such touristy activities as walking the zebra crossing on Abbey Road. Going to the Tower of London. Checking out Big Ben. In early summer, there are afternoons at the pub watching Wimbledon tennis on outdoor big-screen televisions in a garden setting.

But, the truth is, you wouldn’t want to drive in London. (I did—many, many years ago—but I learned my lesson.) To get around, those familiar red double-decker buses are handy and efficient—if you know the number of the bus to board for your destination. The Tube or Underground—in existence since 1863, with 11 lines that cover 250 miles in 32 boroughs—is really nifty, though sometimes one of the 272 stations is a healthy walk away.

Walking, in fact, is a great option. But, even on foot, it’s easy to lose one’s sense of direction. Unlike, say, New York’s central borough of Manhattan, London is not laid out on an easily navigable grid. London’s streets are wormy; they don’t go North/South and East/West. And they’re quite narrow because they existed long before automobiles did. A bus ride in London—especially viewed from the upper deck—is an eye-opening adventure, providing an impressive, and sometimes unnerving, look at the precise driving skills required to maneuver motorized traffic in confined spaces.

I have now been to London 27 times. On vacations, on assignments as a Newsday reporter, back when our daughter chose London for her NYU semester abroad, and mostly in recent years since she settled there with her Scottish-born husband and their young son. And I’m still not especially confident that, on my own, I could find my way from Paddington Station to Regent’s Park. Or between any other pair of sites.

I certainly don’t have, and only recently read about, The Knowledge, earned only through a severely demanding process, roughly three years long, in which prospective London taxi drivers commit to memory every street, address and landmark in the city. Test-takers have been asked to name the whereabouts of flower stands, laundromats, commemorative plaques.

The London street map has been described as “a mess….a preposterously complex tangle of veins and capillaries, the cardiovascular system of a monster….” Peter Ackroyd, author of the 800-page “London: The Biography,” has written that Londoners themselves are “a population lost in [their] own city.”

According to a New York Times report, the trial a London cabbie endures to gain The Knowledge “has been called the hardest test, of any kind, in the world. Its rigors have been likened to those required to earn a degree in law or medicine.” One fellow claimed to have logged more than 50,000 miles on a motorbike and on foot to win his cabbie’s badge.

There was a cognitive neuroscientist who studied the human hippocampus, an engine of memory deep in the brain, and found that the hippocampus in London cabbies would grow and be strengthened like a muscle. After that study was publicized, a London cabbie named David Cohen told the BBC, “I never noticed part of my brain growing. It makes you wonder what happened to the rest of it.”

This may be small-minded: I’ll just follow that young lad on his bike and enjoy the show.

When is running no longer age-appropriate?

 

I know a 78-year-old man whose thoughts during his morning runs sometimes include wondering how much longer he’ll be capable of sustaining that exercise routine. Not surprisingly, a form of leg oldsheimers is setting in as 80 approaches. Power walkers often pass him by. Some things just don’t get better with experience.

But a habit is a habit. And what doesn’t kill a person is meant to make him stronger, no?

OK, about running and death: In the early days of the running boom, when it became clear that ordinary people could safely attempt long distances on foot, a fellow named Jim Fixx died of a heart attack in the midst of his daily jog.

The irony was that Fixx had been something of a jogging/running drum major. In 1977, he had published a best-selling book, The Complete Book of Running, and thereby was a key missionary in the American fitness revolution. His own running regimen had transformed him from an overweight 214-pound, two-pack-a-day smoker to a healthy, happy dude.

But it turned out that his earlier habits and genetic predisposition did him in. At just 52.

In 2007, during the U.S. Olympic Marathon Trials, 28-year-old Ryan Shay—among the nation’s elite runners with a handful of national titles to his name—collapsed and died just 5 ½ miles into that race. Autopsy results were inconclusive, but Shay reportedly had a lurking pre-existing condition, an enlarged heart.

Amby Burfoot, a former Boston Marathon champion, argued at the time that “marathoning is remarkably not dangerous. I’m biased but, to me, obesity is a much greater health crisis than marathoning.” Burfoot still runs daily. He’s 78, too.

That other 78-year-old, the one cited at the top of this discourse—he once ran a couple of marathons but never was close to championship material—had been informed two decades earlier by a just-widowed neighbor that his wife used to worry that the guy passing on his daily jaunt was going to kill himself. A twist of fate, that.

And, for him, there have been interrupting, albeit unrelated, health issues along the way. Thyroid surgery. Brain surgery. Valve-replacement surgery. Some potential skin-cancer issues. Still, the daily perambulations continue to appeal. Hard to say exactly why. And people who do this sort of thing typically are not of an evangelical bent.

The uninitiated tend to see runners as either inspirational or, more likely, crazy, and it’s best to leave it at that. But the number of Americans who regularly participate is reportedly around 50 million—whether casually or competitively, solo or in a group, and in any of those forms the practice is widely considered an ideal means of releasing stress and maintaining fitness.

Beginning in the 1970s, running began spreading like a communicable disease; the bug was caught by hundreds, then thousands, of ordinary folks—including the 78-year-old geezer referenced above. Citizen road races and marathons sprang up, drawing increasing crowds, giving the lie to the expression associated with a 1959 short story, The Loneliness of the Long-Distance Runner.

A primary influencer of the movement—a good word for it—was Frank Shorter, whose televised 1972 Olympic marathon victory began to spread what Shorter has benignly called a “disease.” (At 77, Shorter still has the virus.)

Now, consider Fauja Singh of India. At 89 years old, with a scraggy beard that reached his chest and attired in a yellow turban, Singh returned to his youthful passion of running and, over the following eight years, completed nine full marathons—26 miles, 385 yards—and, living in London, also bettered UK age-group records at 200, 400, 800 and 3,000 meters as well as the mile. Leading up to the 2012 London Olympics, he was among the torch-bearers of the Olympic flame. His last marathon came when the Turbaned Tornado was 100, making him the oldest known finisher at that distance in history.

Singh died earlier this month, at 114. But running didn’t kill him. He was on a walk in his native village in India’s Punjab state when he was struck by a car in a hit-and-run incident.

So tenure guarantees nothing. But that man I know intends to persist with his daily excursions on foot.