Saturday, February 14, 2026

The Surrender of Fort Fillmore

Perhaps the first thing you should know about this little-known Civil War episode is that the geography is the villain of the story—or, at least, a co-conspirator.  People love to imagine Civil War battles as neat little arrows on a tidy map.  Down here in the Mesilla Valley, the map itself has been wandering around like a drunken steer.  The mountains behave; the river does not.

And then there’s the border.  When Fort Fillmore’s drama unfolds in July 1861, the “southern New Mexico” you know today is still the new wing of the house—added via the Gadsden Purchase in 1853–1854, which means the international boundary everyone takes for granted had been settled, locally, for only about seven years.  That short timeline helps explain why some old maps look a bit… aspirational.

In 1848, the little community that became Mesilla was established west of the Rio Grande (and along El Camino Real).  But the Rio Grande is famous for behaving like a living thing—shifting, braiding, cutting new channels during big floods, and (at times) flipping what people think of as the “east bank” and “west bank.”  One technical summary of the 1860s flooding notes the river cut a new course that left Mesilla on the river’s east bank, and other local histories point out yet another course change later that helped produce the river’s present-day position.

That’s why old descriptions can sound like they’re contradicting one another when they’re actually describing different decades of a restless river.

Now, about Fort Fillmore itself: if you’re picturing towering stockade walls and a gate you could drive a stagecoach through—nope.  Fort Fillmore was a typical southern New Mexico “fort”: a cluster of adobe buildings, arranged around a central space, with one side open toward the Rio Grande.  A visitor in the 1850s even described it as “large and pleasant,” with comfortable adobe quarters.

Fort Fillmore was established in 1851 across the Rio Grande from La Mesilla, to protect travel and traffic through a corridor that connected settlements, trails, and commerce, and that also drew Apache raids and other violence common to the era.

So, keep that mental image in mind: Fort Fillmore was not a compact stone castle, but a cluster of crude adobe structures in open desert country, surrounded by nothing but tumbleweeds—hardly an ideal place to absorb a determined attack, especially from mounted men who could choose their angles of attack.  This was a fort you could demolish with a good garden hose, much less a howitzer.

This may sound like the punchline to a bad joke, but it’s mostly a story about risk management.

Fort Fillmore was not built near the river, but on sand hills above it—a choice that made sense if you feared floods and wanted slightly higher, drier ground.  The problem is that a river that shifts can turn “near” into “not near” with alarming speed.  One widely repeated summary notes that after the Rio Grande changed course, the fort ended up being about a mile from the river and had to be supplied by water wagons, which, in turn, made it harder to defend in a crisis.

In other words: it wasn’t built where there was no water so much as it was built where water was close enough—until it wasn’t.

When the Civil War begins, the U.S. Army in the far Southwest is thinly spread, and everything is held together with small detachments, long supply lines, and optimism.  In July 1861, Confederate forces from Texas under Lt. Col. John R. Baylor move into the Mesilla area.  Baylor’s men are mounted, aggressive, and comfortable in desert campaigning.

At Fort Fillmore, the Union commander is Major Isaac Lynde, with several companies of infantry regulars and attached elements—enough to look respectable on paper, but not enough to feel secure when you’re staring at mounted opponents on a jittery frontier, painfully aware that your supply lines have been cut and that the rest of the nation’s attention is a thousand miles away.

Lynde marches out toward Mesilla, where Baylor’s men are positioned.  The confrontation becomes what you might call a “confidence test,” and Lynde does not pass it, despite having more soldiers than the Confederate force.  After a short fight kills three Union soldiers, Lynde falls back to Fort Fillmore.  This small battle is known as the First Battle of Mesilla.  (There was a second battle about a year later, but it was so small that no one is sure exactly when it happened.)

This is the hinge point: once he returns to the post, Lynde has a decision.  He can try to defend his mud fort or abandon it and try to save his command by moving north towards another Union fort.  It is not much of a choice, so Lynde orders the soldiers to prepare to abandon the fort.

Now here’s where the story gets a little hazy.  As part of the preparation to leave, Lynde orders that all of the fort’s stores that cannot be evacuated are to be destroyed.  Whether the fort had a large stock of medicinal brandy or the sutler’s store was oversupplied with whiskey is a mystery.  What is known is that many of the soldiers decide the best way to destroy the liquor is to run it through their systems, and many more fill their canteens with whiskey for the march.

Perhaps they were worried about snake bite?  As W. C. Fields said, “Always carry a flagon of whiskey in case of snake bite.  Always carry a snake in case of thirst.”

Lynde wants an orderly retreat, leaving the Mesilla area and heading east into the Organ Mountains and the only source of reliable water nearby—the springs in the San Augustin Pass, about 20 miles distant.  From there, they could move north towards Fort Stanton.

That was the plan.  In practice, it turns into a slow-motion collapse.  Men fall out, heat punishes them, the column straggles, and the mounted Confederates enjoy the luxury of mounted pursuit while the infantry fights the desert as much as any enemy.  Southern New Mexico in July is as hot as a pawn shop pistol.  The heat is stifling even in the shade and there ain’t no shade.  In the middle of a New Mexico summer, I’ve seen trees chase dogs in hope of relief.

Baylor splits his forces, sending half through a narrow mountain pass that now bears his name.  While his men are mounted, Lynde’s troops are on foot, struggling in the heat and beginning to suffer from the effects of their canteens.

By the time Baylor closes in near the San Augustin Pass/San Augustin Springs area, Lynde’s command is demoralized and scattered enough that the surrender becomes, in Lynde’s mind, the least-bad way to avoid slaughter.  He surrenders without a climactic last stand. 

Baylor plays this well.  He pressures, pursues, and presents Lynde with the sense that resistance will only mean pointless casualties.  Lynde yields.  Baylor has captured a Union force in spectacular fashion, and the Confederacy suddenly has a foothold in the region strong enough for Baylor to proclaim a Confederate “Arizona” government soon after, with Mesilla as its capital. 

Lynde’s surrender detonates his career.  He is disgraced, and the Army moves harshly against him.  A War Department order drops him from the rolls “for cowardice,” effective the date of the surrender. 

Baylor rides his victory into power.  His proclamation and early Confederate control in the region make him briefly prominent.  But Baylor’s story also curdles.  He is removed from authority later after issuing an infamous order calling for the extermination of the Apache people—an act so extreme that even Confederate leadership moved against him. 

Today, Fort Fillmore is not only forgotten but has almost completely vanished.  Where the fort once stood is a large pecan orchard, its grounds expertly leveled to conserve precious irrigation water.  All that is left is the fort cemetery, located about half a mile southeast.

We need some way to commemorate this battle.  Since commemorative runs and walks are the national hobby now, let’s do what any responsible civilization would do: every July, we should stage an annual Fort Fillmore Whiskey Run.

Participants will begin with the traditional gesture—all water bottles will be confiscated and replaced with a pint of whiskey—then set off to recreate Lynde’s finest hour: twenty miles of ambitious decision-making across the desert and up into the Organ Mountains.  Finishers will be rewarded with access to the springs, which is a lovely touch of historical authenticity, except for the small complication that the springs inconveniently dried up around 1950.  Still, details, details.  History is built on them and is then immediately trampled by them.

Three Centuries of Royal Scandals

Andrew, the royal reprobate formerly known as Prince, is the first senior member of the royal family to be arrested since Oliver Cromwell caused Charles I to get an extremely low haircut.  After the all-but-deadly-dull reigns, at least morally, of the last two English monarchs, it is easy to forget that sexual scandals and assorted peccadillos are associated with almost every branch of the noble family tree.

Let’s review:  The current royal family started about 300 years ago when Parliament ignored 50-odd closer (although Catholic) relatives of Queen Anne and imported a distant (but Protestant) German-speaking George I.  (Technically, it was 312 years ago, but 300 is close enough for conversational warfare.)

George I (r.  1714–1727): “I came for the crown; I stayed for the mistress”

George I arrived from Hanover with two main hobbies: being king and not being married in any meaningful emotional sense.  His wife, Sophia Dorothea, became the star of one of the era’s grimmest “relationship outcomes”: separation, scandal, and long confinement.  

George’s marriage to Sophia Dorothea of Celle was a dynastic arrangement that curdled into open hostility.  By the early 1690s, the story goes, she’d fallen into a dangerous romance with Philip Christoph von Königsmarck, and the pair began plotting the one thing a court hates more than infidelity: escape.  Then, in early July 1694—after a late meeting in Hanover—Königsmarck vanished as neatly as a secret dropped into a river.  (Sources bicker over the exact date, but they agree on the result: Königsmarck was professionally disappeared.)

What followed was less romance novel and more administrative cruelty.  George pushed through a divorce that assigned Sophia Dorothea all the blame, stripped her of status, barred her from remarrying, and—most viciously—cut her off from her children.  She was sent into lifelong confinement at Ahlden House, effectively a “respectable” prison, where she remained until her death decades later.

If you’re keeping score, this reign sets the tone: the monarchy is now British, but the marital peace is… multinational.

George II (r.  1727–1760): “The mistress is a job, and it comes with a pension”

George II and Queen Caroline were, by royal standards, a functional partnership: she supplied the brains, the charm, and (when he wandered off to Hanover) the competent adult supervision as regent, while he supplied the temper, the uniforms, and the firm conviction that fidelity was a charming folk custom practiced by lesser people.

And yes, he kept mistresses—because in that court, a mistress wasn’t always “a scandal” so much as a semi-official office, complete with access, allies, and enemies. One of his earlier favorites, Henrietta Howard, even served in Caroline’s household, which is the sort of arrangement that makes you suspect the Georgian court ran on powdered wigs, port, and spite.

His most famous late-career “department head” was Amalie von Wallmoden, Countess of Yarmouth—a Hanoverian import who didn’t just get the king, but got a life peerage in 1740, neatly converting adultery into a title you could print on calling cards.  In a world where access was currency, that made her a gravitational body: politicians orbited, rivals hissed, and pamphleteers sharpened their quills with the usual insinuation that patronage, policy, and pillow talk all lived in the same suite of rooms.  Rumor even assigned her an illegitimate son by the king—exactly the kind of story that doesn’t need to be proven to be useful, profitable, and repeatable.

Think of it as an early form of government: the Crown, the Cabinet, and the Side Piece.

George III (r.  1760–1820): “A domestic man trapped in a family business”

George III is the palate cleanser in this menu.  He was known for being comparatively devoted to Queen Charlotte, producing an impressive number of legitimate children, and generally giving the nation fewer bedroom bulletins than it had come to expect.

His greatest “scandal,” if you must call it that, was the painful fact that the King frequently talked to trees and was barking mad.  At one point, he believed that he was George Washington leading an army against himself.  In short: less randy, more tragic, and arguably the last time Britain said, “Ah, finally, a normal one.”

George IV (r.  1820–1830): “The Regency: now with extra Regency”

If George III was the calm, George IV was the compensatory storm.  As Prince Regent, he specialized in overpriced luxury, drama, and romantic chaos.

George IV (a.k.a. “Prinny” when the knives were out) managed to turn the monarchy into a traveling show of appetite, debt, and romantic arson.  Before he was even king, he secretly married Maria Fitzherbert—an officially unacceptable match—then watched his allies publicly swat down the rumor when it became inconvenient, all while begging Parliament to cover his exorbitant debts.  From there he lurched into the spectacularly unhappy marriage to Caroline of Brunswick, and when he wanted out, he effectively tried to weaponize Parliament as a divorce court, sparking a public circus of accusation and counteraccusation so lurid it came with paperwork.

Meanwhile, the popular press and caricaturists treated him like a walking moral lesson.  Cartoonists didn’t just draw him as bloated—they helpfully surrounded him with the sort of “medical” clutter that screamed venereal panic (the Georgian-era visual equivalent of yelling “pox!” in a crowded theatre).  And the “madness” angle wasn’t just a cheap jibe: he became Prince Regent because George III was incapacitated by severe mental illness.  By the end, the punchline turned grimly physical—corsets, dropsy, gout, and enormous doses of laudanum and opium to blunt the pain—less “divine right” than “medicated decline.”  This is the era that convinces people the monarchy is a soap opera with better furniture.

William IV (r.  1830–1837): “Ten kids, one actress, and then—surprise—respectability”

Before he became king, William IV lived for years with the actor Dorothea Jordan and had ten children with her.  Ten!  If you’ve ever wondered how royals manage “spares,” William took a… generous interpretation of the concept.

Then he became king and, like a man who suddenly realized the portrait painter had arrived, he pivoted into legitimacy and public duty.  Not exactly a scandal machine during his short reign—but the prequel season was a doozy.

Victoria (r.  1837–1901): “Make it moral, make it domestic, make it an empire”

Victoria is the monarch most associated with respectability—partly because she and Albert made a persuasive brand out of family life.  If the Georgians felt like a champagne spill, Victoria felt like a starched tablecloth.

That said, the Victorian era did have its murmurs: intense grief, intense attachments (Hello, John Brown), and a public image so carefully stage-managed that it practically invented modern monarchy PR.  If Victoria had a scandal, it was the quiet kind: feelings that were not filed in triplicate.

Edward VII (r.  1901–1910): “When your coronation follows your social calendar”

Edward VII spent most of his long apprenticeship as Prince of Wales treating the throne like a distant inheritance and London society like an all-you-can-eat buffet with a dress code. His “Marlborough House set” ran on racing, cards, weekend house parties, and adultery so routine a schedule might as well have been printed on the invitations.

In September 1890, Bertie turned up at a country-house party at Tranby Croft and did what he loved best: played baccarat, a game that was, inconveniently, technically illegal, especially when played for stakes by the glitterati.  When a guest, Sir William Gordon-Cumming, was accused of cheating, the solution was pure high-society logic: don’t investigate too hard—stage a hush deal.  Gordon-Cumming was pressured into signing a written pledge that he’d never play cards again, and the Prince of Wales obligingly signed, too, as if the heir to the throne were endorsing a royal non-disclosure agreement on a tapestry-covered card table.

Naturally, this secrecy popped like a champagne cork.  Gordon-Cumming sued, and in 1891, the heir to the throne was hauled into court as a witness—an event that generated exactly the kind of “fashionable matinée” atmosphere that screams useless monarchy.  Gordon-Cumming lost, his life was effectively socially and professionally detonated, and Bertie walked away with a fresh layer of public disgust, because nothing says “future national figurehead” like getting caught in the blast radius of a rigged secrecy pact.

This was not a man who dabbled. He acquired official mistresses with the kind of regularity with which other people acquired umbrellas. Lillie Langtry became his first publicly acknowledged mistress in the late 1870s, and society treated this as news, not a shock. Daisy Greville, Countess of Warwick, later became his “official” mistress (and she was eventually replaced by Alice Keppel), as though the position came with a job description, and a handover memo.

And when the gossip columns needed a courtroom sequel, they got one: in 1870, the Prince of Wales was dragged into the Mordaunt divorce scandal, subpoenaed to testify, and forced to deny—on the record—that anything “improper” had happened. The court applauded, which is a very Victorian way of saying, “We absolutely came for the mess.”

Queen Victoria, meanwhile, regarded Bertie’s appetites as a personal affront to both morality and monarchy.  The distress caused by his affairs hit the family hard, and Victoria’s grief after Albert’s death curdled into lasting bitterness toward her heir.  She wrote, memorably, that much as she pitied him, she could not look at him “without a shudder”—which is about as close as you get to a royal parenting review in one line.

George V (r.  1910–1936): “The serious one, starring in a family of chaos”

George V is often remembered as dutiful, conventional, and sturdily monogamous—the monarchy’s answer to, “Can we please just run the country without a subplot?”

Unfortunately, the universe heard this and responded by giving him children and relatives with… plot.  Which leads us to—

Edward VIII (r.  1936): “Speedrun monarchy”

Edward VIII reigned less than a year but managed to deliver one of the biggest royal crises of the modern era: abdication to marry Wallis Simpson, a twice-divorced American, in a time when that collided spectacularly with monarchy, church, and politics.  Even after his abdication, Edward managed to create new scandals, at one time plotting with Hitler to serve as puppet king in a postwar England.

This wasn’t “randy” so much as “romantic defiance with constitutional consequences.” Still, if you’re grading royal scandals on impact, this one is a platinum medal.

George VI (r.  1936–1952): “Stability, courage, and no time for nonsense”

George VI is the emergency replacement monarch who turned out to be exactly what Britain needed during WWII: steady, hardworking, and personally respectable, with a marriage that projected partnership rather than chaos.

If you’re hunting for scandal, this reign will disappoint you.  Its drama was national, not tabloid: war, duty, health, and the weight of a job he never wanted.  The juiciest thing about George VI is that he makes people feel bad for ever enjoying the gossip in the first place.  (Well, there is that story about smoking three packs of cigarettes a day, which ended with a lung being removed in an operation at home…)

Elizabeth II (r.  1952–2022): “The longest reign, the largest scrapbook”

Elizabeth II’s personal life, by royal standards, was famously restrained—yet her reign became a museum of modern scandal simply because it lasted so long, and because the press got louder, faster, and more hungry.

Her “royal scandal” chapter is not so much “the Queen did what?!” as a running series of her family’s private lives showing up on the evening news:

  • family marriages cracking under public pressure,
  • the media turning private misery into public sport,
  • and the monarchy learning that cameras don’t blink, and tabloids don’t forgive.

If earlier monarchs had scandals as events, Elizabeth II had scandal as weather—rolling in, blowing through, and occasionally taking the roof off the gazebo.

Charles III (r.  2022– ): “The sequel nobody expected to be this complicated”

Charles III arrived on the throne with a backstory already widely known: a long, messy, very public romantic history that played out over decades, and a modern monarchy trying to look timeless while living in real time.

If you want “randy,” this reign’s reputation is mostly inherited from Charles’s years as Prince of Wales—proof that in royal life, your prequel can dominate your present.  As king, the challenge is less romance than management: family narratives, rebuilding public trust, and the small task of being a symbol in an age that distrusts symbols.

Which brings us back to Prince Andrew, who proves that selecting leaders by birth order has never once produced a slow-motion fiasco.  If your family business is literally hereditary symbolism, it’s only a matter of time before one member treats the whole operation like a private club with unlimited guest passes and no bouncer.   Andrew’s modern masterpiece was the attempt to talk his way out of trouble on television—an interview that didn’t so much “clear the air” as replace it with a thick, lingering fog of disbelief—followed by the palace doing what it does best: removing uniforms, patronages, and public duties in the calm, administrative tone usually reserved for reassigning a problematic office printer.

And then the plot did what royal plots always do: it escalated.  Andrew Mountbatten-Windsor was arrested.  If the Georgians gave us mistresses with titles and the Edwardians gave us baccarat in the drawing room, the 21st century gives us the inevitable endpoint of a system that foolishly breeds for primogeniture instead of judgment: not scandal as naughty gossip, but scandal as paperwork, police statements, and the monarchy discovering—again—that “born to it” is not the same thing as “good at it.”

Saturday, February 7, 2026

A Look Back at the Bush Plan

Every few years, someone drags Social Security out onto the national stage, shines a harsh spotlight on it, and announces—usually with the calm confidence of a man explaining compound interest to a golden retriever—that what the program really needs is a makeover involving Wall Street.

In the mid-2000s, President Bush floated the idea of “personal accounts,” and the discussion quickly collapsed into a familiar shouting match:

  • Republicans heard, “You can own your retirement!”
  • Democrats heard, “They’re going to turn Grandma’s check into a day-trading app!”

The partisan rhetoric doomed the proposed plan before it had any chance to be seriously studied or modeled.  In the last two decades there has been no serious discussion about changing the Social Security system even though we all know it has serious problems.

It has been roughly 17 years since the Bush Plan would have taken effect, so let’s do something unfashionable: let’s assume everyone remains calm, lower the volume, and walk through what the proposal actually meant, what it would have meant financially, and what it might mean for the average retiree today if the diverted money had been invested in a plain-vanilla S&P 500 index fund.

This is a blog, not a dissertation, so I’m going to keep the math honest, but not joyless.  So, what was the plan?

The popular memory is that Bush wanted to “privatize half” of Social Security.  The most concrete version of the plan that got widely modeled wasn’t “half.”  It was more like: “You may divert a small slice.”

In the most widely analyzed design, the personal account would be funded by diverting up to 4 percentage points of the 12.4% payroll tax, subject to a dollar cap that started at around $1,000 a year and rose over time.  In other words, it was a limited diversion and not a full-blown transfer of the whole program into your Fidelity login.

That detail matters, because it means the personal account was never going to grow into a yacht for the “average” worker… More like a modest financial dinghy—possibly a very nice dinghy—depending on the ocean.

This is an important detail: Your Social Security check goes down if you opt in.  That’s the part that gets lost in the political bumper-sticker version.  If you divert payroll taxes into a personal account, your traditional Social Security benefit gets reduced.  Not because the government is being mean, but because you didn’t pay those taxes into the system, so you don’t get paid as if you did.

In our model, all of the diverted funds go into an S&P 500 index fund that grows or declines with the stock market.  At your retirement, your personal account pays you something too, and your total retirement income becomes:

    (Smaller Social Security check) + (Personal account payment) = Total

So, yes, your Social Security check would be smaller.  The question is whether the personal account would make up the difference, and then some.

Now, the big hypothetical: What if the personal account was invested in an S&P 500 index fund?

The scenario we’re using

  • We’re looking at a new retiree in 2026 (turning 65 and retiring around now) who earned an average income each year.
  • They opted into the personal account in 2009 and contributed the maximum allowed each year under the capped design.
  • The money was invested in a low-cost S&P 500 index fund, with a small annual fee assumption (think “boring and responsible,” not “crypto enthusiast at 2:00 a.m.”).

What happens to the monthly Social Security check?

Under this scenario, the offset would reduce the retiree’s monthly Social Security check by about:

  • $276 to $284 per month (in today’s dollars)

So, if someone says, “Privatization would raise your Social Security check,” the polite reply is: No.  It lowers the check, and then adds a second check.

What does the personal account pay per month?

Under the same scenario, the personal account would generate about:

  • $530 to $593 per month (in today’s dollars)

This is where the stock market does its dramatic entrance, wearing sequins.

Net result: total monthly retirement income.  Put the two together:

  • Social Security check goes down: –$276 to –$284
  •  Personal account adds income: +$530 to +$593
  •  Net change: +$255 to +$309 per month

So, in this specific “average new retiree in 2026” scenario, the retiree’s total monthly income would likely be higher than under current law—by a few hundred dollars a month.  And remember, this is what would happen if we diverted only a small portion of the funds into the private sector.

That’s real money.  It’s not a second home in Aspen, but it’s also not “nothing.”
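
If you want to poke at the arithmetic yourself, here is a minimal back-of-the-envelope sketch in Python.  It is not the model behind the numbers above: the rising cap path, the flat average return, the payout rate, and the $280 offset are all assumptions plugged in for illustration (chosen only to land in the same general ballpark as the post’s figures), and the sketch ignores fees, taxes, and the inflation adjustment needed for “today’s dollars.”

    # A rough sketch of "smaller check + personal account," not the actual
    # model behind the post's figures.  Every input is an assumption made for
    # illustration: the rising cap path, the flat 13% average return (roughly
    # the neighborhood of the S&P 500's 2009-2025 run), the 7% payout rate,
    # and the $280 offset (the midpoint of the range cited above).

    def account_at_retirement(years=17, first_cap=1000.0, cap_step=150.0,
                              avg_return=0.13):
        """Accumulate capped annual contributions at an assumed average return."""
        balance = 0.0
        for year in range(years):
            contribution = first_cap + cap_step * year   # assumed cap schedule
            balance = balance * (1.0 + avg_return) + contribution
        return balance

    def monthly_income(balance, payout_rate=0.07):
        """Convert the balance to monthly income using an assumed payout rate."""
        return balance * payout_rate / 12.0

    balance = account_at_retirement()
    account_check = monthly_income(balance)
    offset = 280.0                      # the smaller Social Security check
    net_change = account_check - offset

    print(f"Account balance at retirement: ${balance:,.0f}")
    print(f"Personal account pays:         ${account_check:,.0f}/month")
    print(f"Social Security offset:       -${offset:,.0f}/month")
    print(f"Net change in monthly income:  ${net_change:+,.0f}/month")

Changing avg_return is the quickest way to see how much of the happy arithmetic depends on the market’s 2009–2025 good mood; dial it down a few points and the net gain shrinks toward nothing.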

But wait: does that mean the plan “solves” Social Security?  No, and this is where the policy conversation gets slippery.  The trust fund problem doesn’t vanish; it shape-shifts.

Social Security’s financing challenge is largely a question of cash flow: payroll taxes come in, benefits go out, and demographics are doing what demographics always do—namely, refusing to ask permission as they run roughshod over your plans.

If you divert payroll taxes into personal accounts, the trust funds receive less money up front.  But retirees still need to be paid their benefits during the transition.  That creates transition costs.  In normal-person terms, it means:

The government either:

  • borrows,
  •  raises other taxes,
  •  cuts benefits,
  •  or does some mixture of all three,

to keep sending checks while part of the payroll tax stream is being rerouted.  The original Bush plan called for transferring money from the general fund into Social Security to match what was being diverted.  If Congress failed to make those transfers, the trust fund, already in terrible financial shape, would only get worse.

Could the long-term picture improve if the offset is structured a certain way, participation is limited, benefit growth is changed, or additional financing is added?  Yes.  But personal accounts by themselves are not a magic wand that makes arithmetic stop being arithmetic.

The inheritance question: What could the average retiree leave to heirs?

Here’s where personal accounts do something traditional Social Security generally does not: they can create a pile of money with your name on it.  Under the present system, if a retiree dies after receiving benefits for only one month, his family receives a one-time death benefit of $255.  The rest of the money the retiree paid into the system vanishes.

Under the same scenario, the personal account at retirement would be about:

  • $95,000 to $106,000 (in today’s dollars)

Since this money is in a private account, the full balance would be available to the retiree’s family at death, provided the retiree chose to live off the account’s earnings and not spend the principal.

In other words, personal accounts introduce a new freedom: you can choose a higher monthly income now, or a larger bequest later.  Social Security, as designed, is much more “lifetime insurance” than “inheritable asset.”

So, was the Bush plan a good idea?

The honest blog answer is: it depends on what you’re optimizing for, and how lucky you get.

Potential upside:

  • A strong market period could boost total retirement income for average retirees.
  •  Personal accounts can create inheritable wealth, especially for people who die earlier, or who don’t spend down the account.
  •  People like owning things with their name on them.  This is not a trivial political fact.

Potential downside:

  • Market risk becomes retirement risk.  If the market does badly during your contribution years, or right before retirement, your “private” portion shrinks, but the offset doesn’t sympathetically shrink at the same pace.
  •  The transition financing is real, and it can increase federal borrowing pressures in the years it matters most politically (which is to say, all years ending in a number).

The takeaway, in plain English.  If an average new retiree in 2026 had been allowed to divert the maximum under the capped design, and that money had tracked an S&P 500 index fund through the 2009–2025 market run, then:

  • Their Social Security check would likely be about $280/month smaller,
  •  Their personal account would likely add about $560/month,
  •  Their total monthly income would likely be about $255–$309/month higher, and
  •  They might have something like $95,000 to $106,000 in “surplus” account value that could be preserved for heirs, if they didn’t spend it.

That’s the sunny version, because the stock market in that period was, frankly, in a good mood.  The darker version is the one nobody can calculate cleanly ahead of time: what happens when the market is not in a good mood, but you are still trying to pay rent.

And that, right there, is why this debate never dies: it’s a tug-of-war between the appeal of ownership, the comfort of insurance, and the unavoidable fact that the future is going to do whatever it wants, regardless of our spreadsheets.

Saturday, January 31, 2026

With Foot Firmly in Mouth

There is more than one kind of history. 

One is the kind where armies march, treaties are signed, and professors write books with subtitles like A Reconsideration of the Strategic Context.  Another kind is where someone opens their mouth, a word wobbles slightly to the left, and the whole world decides that a head of state has just confessed to being a pastry.

Let’s begin with the most famous baked good in the history of diplomacy, a story every reader has heard whether they wanted to or not.

In 1963, President John F. Kennedy stood in West Berlin and delivered one of the great Cold War soundbites: Ich bin ein Berliner. The line was meant as solidarity: I am a Berliner, i.e., I am one of you, i.e., your cause is my cause.

Then, somewhere along the way, English speakers turned it into: “I am a jelly donut.”

This is one of those stories that refuses to die because it is perfectly shaped to satisfy a certain human need: the need to see the mighty humbled by a small linguistic banana peel.  Presidents, after all, should not be permitted to stride through history like marble statues.  We prefer them with a bit of powdered sugar on the lapel and their foot, if not in their mouth, at least in a bucket.

The problem is that this joke is largely an urban legend.  In Berlin, the pastry you and I might call a “Berliner” is commonly called something else, and, more importantly, Kennedy’s phrasing is defensible in context.  In other words, it worked as intended for the people listening, which is the whole point of communication, and also the whole reason this story is annoying.

But the legend persists because it highlights something true: language is treacherous, even when it’s not technically wrong.  The ear wants what the ear wants, and the public loves a translation that produces an accidental confession of being a donut.

And once you realize that, you start noticing how often world events hang on the fragile thread of words.

Now and then, the translation isn’t a charming myth.  Sometimes it is real, and sometimes it is magnificent in its wrongness.

In 1977, President Jimmy Carter visited Poland, and his remarks were translated into Polish in a way that turned perfectly normal diplomatic sentiments into something… spicier.  Accounts vary in the exact phrasing, but the gist is that the President meant to say that he liked the Poles, while the translation suggested he had a more intimate yearning for the Polish people that no president should ever express in public, at least not without a slow jazz soundtrack and a licensing agreement.  Perhaps the verbal slip would not have been so funny if Carter, just the year before in an interview with Playboy Magazine, hadn’t admitted that he had “lust in his heart.”

This is not merely funny: it is instructive.  Translation is not a word-for-word substitution game.  It is real-time cognitive gymnastics performed in front of an audience, with the added delight that the audience will only notice you exist when you fail.

Interpreters are like football referees: if you’re talking about them, something has gone wrong.  And this brings us to the first great law of international communication: The more important the moment, the more it depends on the least glamorous person in the room.  Which is a comforting thought—unless, of course, you are the least glamorous person in the room.

If Carter’s Polish mishap was the diplomatic equivalent of slipping on a banana peel, Nikita Khrushchev’s most famous line was more like slipping on a banana peel while enthusiastically waving a lit road flare.

In late 1956, only weeks after the Soviet Union had crushed the Hungarian Revolution—at a moment when the Cold War was running particularly hot—Khrushchev managed to assure the West that history itself would be doing the burying.

“We will bury you” landed in Western ears like a threat engraved on a missile.  It sounded very much as if the Soviets were advancing with a gun in one hand and a shovel in the other and saw no reason to be subtle about either.

But Russian idiom does not always map neatly onto English panic.  Many have argued that a more accurate sense was something closer to: we will outlast you, or we will live to see you buried, or history will bury your system.  It’s still not exactly a Hallmark card, but it’s a different species of menace—more ideological boasting than literal burial arrangements.

Here’s the point: idioms are loaded weapons.  In your own language, they’re harmless because everyone knows the safety is on.  In another language, they can go off in the translator’s hands and put a hole through the wall.

So, if you’re ever tempted to spice up a diplomatic message with a colorful figure of speech, remember: what’s clever in one tongue can become nuclear in another.

Some translation stories are funny, some are scary, and some are both, depending on how much you enjoy contemplating the fragility of civilization.

In late July 1945, the Allied Powers issued the Potsdam Declaration, setting out what they considered generous terms for Japan to surrender and end World War II.  In Japan, many of the top leaders, including Emperor Hirohito, were inclined to accept the terms subject to a few clarifications, but the response included the word mokusatsu (黙殺, literally “killing with silence”).

While exactly what the Japanese meant will be argued forever, it is possible that Japan meant to imply “acceptance without comment.”  There is no doubt, however, that the United States interpreted it as “rejection by ignoring.”  The mokusatsu episode is often incorrectly retold as though Japan insulted the United States and the United States responded with atomic weapons.

The United States did not decide to use the atomic bomb because of the Japanese response; rather, it saw no reason to stop a plan that was already in motion.  By late July 1945, the machinery of war was no longer waiting to be offended—it was waiting for a surrender.

Still, it is a sobering reminder that ambiguity is not neutral.  When you are speaking to someone armed, nervous, and already halfway convinced you mean the worst, an ambiguous word is a match tossed near gasoline.

In everyday life, ambiguity is charming.  It makes poetry possible.  In geopolitics, ambiguity can become a Rorschach test that the other side fills in with their nightmares.

At this point you may be thinking: why does the jelly donut story outlive the truth? Why does “we will bury you” echo louder than the nuance? Why do we cherish linguistic bloopers?

Because these stories serve three human cravings:

1. They make the powerful relatable.  A president who can accidentally call himself a pastry becomes, briefly, the kind of person who also once walked into the wrong restroom.

2. They offer the comfort of comedy.  History is terrifying.  We like our terror with a punchline, preferably one involving baked goods.

3. They warn us without preaching.  “Be careful with language” is boring advice.  “A mistranslation can make you a donut in front of the world” is advice you’ll remember.

And there’s another reason, too: these stories highlight the old truth that language is not a transparent window.  It’s a stained-glass mosaic of culture, habit, and assumption.  You can see through it, but the colors distort everything.

If you take anything from these tales, let it be gratitude for the people who stand between leaders and international chaos.

A skilled interpreter is not merely “fluent.” They are a professional mind-reader who processes meaning, tone, and intent, while also anticipating how a phrase will land in the other culture.  They are constantly choosing between “literal” and “faithful,” knowing that those are often enemies.

Sometimes the faithful translation sounds less dramatic than the literal one, and the press will punish you for it.  Sometimes the literal translation is accurate in words but disastrous in meaning, and history will punish you for that, too.

In other words, it’s a job where the only way to win is to disappear.

Which brings us, at last, to the king who could not disappear if he tried, because he was, allegedly, a rabbit.

In 1806, Napoleon installed his brother Louis as King of Holland.  Louis, apparently attempting to be charming, tried to address the Dutch in their own language.  The Dutch word for king is koning.  The word for rabbit is konijn.  Those words are close enough that, in the mouth of a nervous foreign monarch, one can hop into the place of the other.

Thus the famous anecdote: instead of saying something like “I am your king,” Louis effectively announced: “I am your rabbit.”

Whether he said it exactly that way, or whether the story grew in the retelling, is part of its charm.  These lines often do grow, because people love them and repeat them, and repetition turns a wink into a brass plaque.

But the underlying truth is timeless: learning a language is an act of humility, and humility is not a natural posture for emperors, kings, and presidents.  When they attempt it anyway, the universe occasionally rewards the effort with a joke that lasts two centuries.

And honestly? Of all the things a ruler might accidentally confess to being, a rabbit is not the worst. 

Saturday, January 24, 2026

Free Trade

Trade is in the news again: just today, several politicians—including our president—said that the US needs tariffs to restrict trade.  It is relatively easy to conclude that trade is evil, but nothing could be further from the truth.

In a perfect world, every country would embrace free trade, resulting in a happier and more prosperous world.  And in a perfect world, I would be out putting on my private 36-hole golf course—not writing this blog.  Since the world isn’t perfect and several nations are imposing strict tariffs on imported American goods, our president may be correct: perhaps we do need to adjust our tariffs, if only to encourage other countries to return to a policy of free trade.

We need to teach college students the principles of free trade for the same reason universities make students take freshman orientation: because a shocking number of intelligent people can still be trusted to do something spectacularly counterproductive if no one explains the basics to them.  It is only through ignorance that someone would voluntarily accept the collective warmth of huddled masses.  

Free trade is simply the scandalous idea that consenting adults should be allowed to swap what they have for what they want, without a parade of gatekeepers, forms, committees, and a policy memo citing “stakeholders.”  It turns “this is useless to me” into “this might be perfect for you,” and it turns “you have what I want” into “let’s negotiate,” instead of “let’s regulate each other into mutual disappointment.”  

Voluntary exchange scales beautifully, bureaucratic micromanagement does not, and the fastest way to make everyone worse off is to appoint the most self-righteous person in the seminar as the Director of Fairness and let them decide who deserves the snacks.

If you teach school, there’s a simple, low-effort way to show students why free trade keeps getting invited back into the conversation, even after everyone swears they’re done with it.  It takes about ten minutes, requires no permission slips, and will cost you roughly thirty dollars, which is also known as “two-thirds of a teacher’s weekly budget for joy.”

Step one: go to the nearest dollar store and buy thirty random items.  Candy, a puzzle book, a wine glass, a bag of potato chips, and a tiny flashlight that will stop working the moment it feels unappreciated.  The details do not matter.  (In fact, the more chaotic the assortment, the better.)  You want a table that looks like a garage sale held during a mild panic.

Hand out the items so that every student gets one.  Tell them they may inspect their items, but not open them, consume them, or break them.  (In other words, you are running the most realistic economy imaginable.) They may, however, show them off to classmates, which will immediately create envy, disappointment, and the first great moral lesson of the day.

Now have everyone hold up one to five fingers to show how satisfied they are, with one meaning “this is basically trash” and five meaning “this fulfills my wildest dreams and I would like to name it.”  Mentally add up the scores.  In most classrooms, the total will land somewhere around 40 to 50, because fate has distributed the wine glass to the kid who wants gummy worms, and the gummy worms to the kid who wanted literally anything else.

Round two: announce they may trade, but only with the student sitting next to them.  Give it a minute.  Then have them rate their happiness again.  The total will usually climb into the 60–80 range.  It turns out that when you loosen restrictions a little, people can undo a little of the universe’s bad decision-making.

Round three: remove the training wheels.  Tell them they may make their final trade with any willing student in the room.  Two minutes of frantic swapping later, do the finger-rating one more time.  Typically, the total jumps to 100+, and the classroom briefly resembles a commodities exchange, minus the suits and with more arguing over sour candy.

Then you deliver the punchline: no new items entered the room.  Nobody manufactured anything.  Nobody discovered a gold mine behind the whiteboard.  And yet, the class’s total “wealth”—measured as “how much people value what they have”—went up.  Why? Because trade helps stuff move from the people who don’t want it to the people who do.  And the fewer the restrictions, the easier that happens.

In other words: trade creates wealth—not by making more things, but by getting the same things into better hands.
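
If you would rather see the effect in silicon before spending the thirty dollars, here is a small simulation sketch in Python.  It is an illustration, not data from a real classroom: each simulated student assigns a random one-to-five score to every item, and a swap happens only when it makes both traders strictly happier.  The exact totals will differ from the classroom numbers above (those depend on real kids and real gummy worms), but the pattern of totals climbing as the trading rules loosen shows up every time.

    import random

    random.seed(1)                      # reproducible classroom

    N = 30                              # thirty students, thirty items
    # Each student's satisfaction (1-5 fingers) with each item, drawn at random.
    value = [[random.randint(1, 5) for _ in range(N)] for _ in range(N)]
    holding = list(range(N))            # student i starts out holding item i

    def total_satisfaction():
        return sum(value[s][holding[s]] for s in range(N))

    def swap_if_both_gain(a, b):
        """Swap items only when both students end up strictly happier."""
        ia, ib = holding[a], holding[b]
        if value[a][ib] > value[a][ia] and value[b][ia] > value[b][ib]:
            holding[a], holding[b] = ib, ia

    print("Round 1 (no trading):     ", total_satisfaction())

    # Round 2: trade only with the student sitting next to you.
    for s in range(N - 1):
        swap_if_both_gain(s, s + 1)
    print("Round 2 (neighbors only): ", total_satisfaction())

    # Round 3: trade with any willing student, repeated until no mutually
    # beneficial swap is left on the table.
    improved = True
    while improved:
        before = total_satisfaction()
        for a in range(N):
            for b in range(a + 1, N):
                swap_if_both_gain(a, b)
        improved = total_satisfaction() > before
    print("Round 3 (anyone willing): ", total_satisfaction())

No new items enter the simulation, either; the only thing that changes between rounds is who is allowed to trade with whom, and the total climbs anyway.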

Trade does more than shuffle stuff around until everyone is happier with what they’re holding. It also does something quietly civilizing: it teaches respect for private property.

Start with the classroom experiment.  Thirty random dollar-store items get scattered among thirty students, and the universe instantly proves it has a sense of humor.  Somebody gets a puzzle book and wants candy.  Somebody gets candy and wants anything that would not melt in their backpack.  At first, the room is full of low-grade disappointment, plus that one student who is absurdly thrilled to receive a tiny panda plush, because the world is unfair in both directions.

Now, here’s the key point: the moment you announce that trading is allowed, the entire tone changes.  Students stop talking like pirates and start talking like shopkeepers.  They ask, “What do you want for that?” instead of, “Hand it over.”  They begin to persuade.  They bargain.  They look for mutual advantage.  They also discover very quickly that the whole game collapses if people don’t treat possession as legitimate.

Because trade only works if your item is actually yours.

In the first round, students are told they can examine the item, show it off, and complain loudly about it, but they cannot open it, eat it, or break it.  That restriction is not there to ruin anyone’s fun.  It is there because private property is not just the right to hold something; it is the responsibility not to destroy what you might later exchange.  A candy bar is a tradable asset until you bite it… After that, it is just evidence.

Then come the trades.  Watch what students do when they want something.  They don’t snatch it, they negotiate for it.  They offer something in return, and—this part is crucial—they accept “no” as a valid answer.  Not always happily, but they accept it.  They start to understand that consent is not a decorative extra—it is the foundation.

Private property sounds lofty until you realize it’s the only barrier between “exchange” and “chaos.”  If anyone can take what they like, there is no reason to offer a better deal, no reason to keep promises, and no reason to plan.  Everyone’s effort goes into guarding, hiding, and grabbing.  In other words, the classroom turns from an economy into a feeding frenzy.

Trade is a practical lesson in boundaries because it teaches that ownership matters, not because the object is sacred, but because respecting ownership is what makes cooperation possible.  When people can say, “This is mine,” and have it mean something, they can also say, “Let’s make a deal,” and have that mean something, too.

While trade creates wealth, it also creates something harder to measure and—arguably—more important: the habit of respecting other people’s rights, because it turns out that a peaceful “swap” is a lot more profitable than a messy “take.”

Respect for private property is not just a domestic virtue:  scaled up, it is one of the quiet foundations of peace among nations.

When a country treats property rights as legitimate—contracts honored, assets not arbitrarily seized, rules applied predictably—it becomes a safer place for foreigners to buy, sell, invest, and cooperate.  That predictability, in turn, lowers the temptation to use political pressure, covert coercion, or military force to “secure” resources that could be obtained through normal commerce.  In plain terms: if you can reliably purchase what you need, you are less likely to try to take it.

Property rights also strengthen diplomacy because they make agreements credible.  Treaties, trade deals, and cross-border projects are all just contracts with flags on them.  If a government has a reputation for confiscation, default, or expropriation, other states treat promises as temporary and hedge accordingly—often by building exclusionary alliances, by stockpiling weapons, and by preparing for confrontation.

Finally, respect for property supports trade, and trade creates mutual stakes.  When businesses, workers, and consumers in two countries find mutual benefit from ongoing exchange, leaders pay a higher political price for conflict that disrupts it.

Nations that can trust boundaries—territorial and economic—find fewer reasons to test them violently and find a whole lot of good reasons to respect them.