Tuesday, July 11, 2023

The Triumph of a Dream Over Reality


An animation cel from The Congress.

We’ve been reading about film studios scanning movie actors, and using their personas in future movies, even after the actor has died.
While researching the subject, I read about The Congress—a 2013 live-action and animated film about an actor who’s being scanned for future use.
We found the DVD among our unwatched movies collection, and watched it.
(An added incentive was that the film is loosely based on Stanislaw Lem’s Futurological Congress.)
This article is a comparison of the 1971 science fiction book to the 2013 genre film, and to 2022’s Everything, Everywhere, All at Once.*

A 1980's book cover for Stanislaw Lem’s Futurological Congress.

The protagonist of the 1971 novel Futurological Congress is an academic, Ijon Tichy—a delegate to the Eighth World Futurological Congress, which is studying overpopulation.
The site is a 100-story Hilton in Costa Rica, and the story takes place around the year 2000.
While attending seminars, Tichy drinks water from the public supply.
He’s unaware that it has been drugged to control the area in and around the hotel (ahead of a revolution), and he is caught up in the savagery.
After Tichy drinks the hallucinogenic water, neither he (nor the reader) can be sure which sequences are real, and which are not.

The world that Tichy lives in—prior to the revolution, and the hallucinogens—is one of ever-present violence.
A fellow delegate (standing next to Tichy) is shot—for being dark-skinned, and reaching for a handkerchief.
(Tichy washes the blood splatter from his clothes in his hotel room.)
He has a drink at the hotel bar, and learns that a bar companion is a religious sharpshooter who plans on murdering the Pope.
A sign in his room reads: “This Room Guaranteed BOMB-FREE.”
There are a ten-foot crowbar, a khaki camouflage cape, and an Alpine rope available in his room closet—indicating that “disturbances” may be a common occurrence in this Hilton.

Tichy retreats, with a few other guests, to the basement of the hotel.
Later, he’s evacuated (in two different scenarios) by the U.S. military, and wakes up as a stigmatized “defrostee” in the year 2039.
In this “utopian” world, people are addicted to “psyches”—drugs that regulate every aspect of their lives.
A passage reads:

We live in a psychemized society.
From "psycho-chemical."
Words such as "psychic" or "psychological" are no longer in usage. . .
One should always use the drug appropriate to the occasion.
It will assist, sustain, guide, improve, resolve.
Nor is it an it, but rather part of one's own self, much as eyeglasses become in time, which correct defects in vision.

As one character says: “A dream will always triumph over reality, once it is given the chance.”
A woman tries to help Tichy acclimate to her world, but he becomes disillusioned with their relationship.
He discovers that the 2039 world wasn’t real, and that it’s actually 2098. Or is it?

The major roles in The Congress are: Robin Wright (played by Robin Wright); her agent, Al (Harvey Keitel); studio CEO Jeff Green (Danny Huston); her young son, Aaron (Kodi Smit-McPhee); her teenage daughter, Sarah (Sami Gayle); Aaron’s doctor (Paul Giamatti); and Robin’s long-time animator Dylan Truliner (voiced by Jon Hamm).
Even if I weren’t interested in the subject, the performances of the actors—especially the star—made the film well worth watching.
Although the animated half of the film is a bit uneven, most of it is quite beautiful.
(The animation is all hand-drawn.)
The film is directed by Ari Folman, who wrote the screenplay. 

Promo poster for The Congress (with the original title Robin Wright at the Congress) with the emphasis on it as a live-action film.

“Robin Wright” is the protagonist of The Congress film.
She’s playing a variation of herself—much like John Malkovich did in 1999’s Being John Malkovich, and Nicolas Cage in 2022’s The Unbearable Weight of Massive Talent.
She’s an actor—once the star of The Princess Bride, and now a single mother whose son has Usher Syndrome.
Her teenage daughter (Sarah) seems bitter toward her.
Is Sarah so bitter because Robin concentrates so much attention on the son?
Twisted studio head Jeff Green offers to buy her identity for much less money than a male star would receive.
One condition of the sale is that she not perform again, on film or stage.
At first, she refuses, but then—after Aaron’s disease progresses—she agrees to a 20-year contract.

Robin (Robin Wright) in the Digital Domain dome made out of LED lights.

Of the live-action sequences, one of the most psychologically interesting is the scene in which Robin is scanned.
This scene doesn’t take place on a set.
As I learned in the DVD commentary, it was filmed in the actual Digital Domain studios, where actors are scanned for posterity.
She’s first asked to wear a special bodysuit, and then places herself within a giant dome made of LED lights.
Later, Robin is asked to exhibit a series of emotions, but she’s so overwhelmed by feelings of being drained that she freezes.
Her agent Al (Harvey Keitel) draws the emotions out of her during a long monologue.
(Her agent has a long history of manipulating her for his own ends.)

Twenty years later, an older live-action Robin drives through a crossing gate, and enters the “animated zone” to meet with smarmy CEO Jeff Green (in an appropriately cockroach-filled room).
(As Robin enters this zone, the Lem book and the film become more similar, and the film switches from live-action to animation.)
Green’s studio has made a tidy profit from Robin’s persona—in a film franchise called Rebel Robot Robin—but “animated Robin” is an ignored guest at the studio’s “Congress” (set in a 100-story hotel).
She hallucinates, and a revolution explodes (as in the book).
Robin escapes to the flooded basement, where—after various “rescues”—she awakes from being frozen, and explores a future society.
Her initial companion is Dylan Truliner (voiced by Jon Hamm)—an animator who’s obsessed with her.
Unlike Tichy (in the novel), Robin isn’t just an explorer.
She’s searching for her son, Aaron.
At one point (in the hotel basement), Robin thinks she sees daughter Sarah in a corridor, but she doesn’t pursue her.

During the hotel scenes of Futurological Congress, Lem uses a lot of familiar names—UPI, Agence France Presse, Interpol, Berkeley University—to signal that we aren’t too far in the future.
Similarly, in both animated worlds of The Congress (the present and future), entities use familiar cartoon avatars from a few years back.
There are animated caricatures of famous people (among them: Jesus, Frida Kahlo, Cyndi Lauper, Frank Sinatra, Yoko Ono, and Primo Levi); actors in roles, like Elizabeth Taylor (as Cleopatra) and Clint Eastwood (from The Good, the Bad, and the Ugly); as well as non-people like the “Robot” from Metropolis, and a trio of costumed “Robin Wrights” from Rebel Robot Robin.
Hotel staffers are characters from Max Fleischer cartoons (from the 1930’s and 1940’s).
The desk clerk is a sex doll.
When Robin makes her transition to the future, “Grace Jones” (in her A View to a Kill guise), wakes her from slumber.

The future in the 2013 film is very dark, but it’s not as dark as the two futures in Lem’s Futurological Congress.
After Tichy takes a dose of “up'n'at'm”—a powerful anti-psychem—he realizes he’s not living in a utopia after all.
Instead, the “magnificent hall” where he’s dining is a “concrete bunker,” and the pheasant he’s eating is “unappetizing gray-brown gruel.”
He discovers why people are always gasping for breath.
They’re out of breath from running along the streets while merely believing that they are driving elegant cars:

Holding their hands out chest-high and gripping the air like children pretending to be drivers, businessmen were trotting single file down the middle of the street. . .
Then the vapor wore off, the picture gave a shudder, straightened out, and once again I was looking down on a gleaming procession of car tops, white, yellow, emerald, moving majestically across Manhattan.

Fifty years later, the 2098 world in Futurological Congress is even uglier—an overpopulated, glacier-filled, dying nightmare.
People aren’t just ordinary-looking; they are deformed—with tails, bristles on their backs, crude artificial limbs, and skin diseases—all “living” in a grotesque world that they are oblivious to.
(I’m not going to spoil the endings of the novel, or the film, but it’s ironic that Robin won’t sign a new contract allowing buyers to become her, but ultimately chooses to exist as another person.)

A French poster that emphasizes that The Congress is an animated film.

The Congress has several elements in common with 2022’s Everything, Everywhere, All at Once*—most importantly, that the focus of both films is on a mother and her child.
Both are genre films with an art house vibe that transport us in and out of different worlds—one a multiverse, and the other various futures.
Both films deal with metaphysical issues, as well as motherhood, feminism, and age.
A central difference is that Everything has a lot more humor than The Congress, and its humor has an open-hearted quality.
Most of the humor in The Congress is in the performances of Danny Huston (as CEO Jeff Green) and Harvey Keitel (as Al), but it’s a bitter sort of humor (as we watch both men manipulate Robin).

I suppose I understand why The Congress wasn’t more of a commercial success.
The title is very dull and not evocative.
The film itself is half live-action and half animation (yet clearly for adults).
The “futures” portrayed in The Congress are terrifying, and its portrayal of sexism is clear-eyed.
I know that we (an older couple who love genre films) would have gone to see it on a big screen, if we’d heard more about it.
As it was, we picked up the DVD because the cast was so interesting.
The Congress is currently available on DVD and streaming.

* Both Michelle Yeoh (the star of Everything, Everywhere, All at Once), and the directors (Daniel Kwan and Daniel Scheinert) have mentioned in interviews that the directors originally wanted the protagonist “Evelyn Wang” to be named “Michelle.” However, they gave up the idea after Michelle Yeoh didn’t agree.

Sunday, July 2, 2023

Trying to Make a Dream “Work”

July 4th (this country’s “birthday”) falls on this coming Tuesday, and the divisions in this country are a main topic of discussion.
We’re careful to not discuss politics at work, or when getting a haircut, because we don’t want conflict.
Sometimes, we don’t look forward to big parties where we might meet “unfamiliar” people who we don’t wish to offend.
Most of us get all our news from specific websites and TV channels, and seldom use others.
A few of us long for the days when it seemed that everyone was on the same page, but that time has never existed.
Leading the U.S. has often felt like herding cats.
It certainly felt that way to George Washington.

When George Washington (father of our country) became the first president in 1789, there were only 11 states.
Two of the original thirteen colonies (North Carolina and Rhode Island) hadn’t officially joined the union.
Vermont was toying with joining Canada.
Tennessee was a territory of North Carolina, and Kentucky was a county of Virginia.
Great Britain was still holding onto most of the fur trading posts around the Great Lakes.
The eleven states were upset about issues in the Constitution, water rights involving Spain, and their borders.
By the end of Washington’s second term, 16 states had accepted the Constitution, and the trading posts were in American hands.

In his Farewell Address (1796), issued* when he retired as the first U.S. President, George Washington said that he hoped our “union and brotherly affection may be perpetual” and “that the free constitution, which is the work of your hands, may be sacredly maintained.”
Essentially, Washington was warning the young nation (white men over 16) that only unity could prevent them from splintering into many parts, and from “cunning, ambitious, and unprincipled men” usurping “the very engines, which have lifted them to unjust dominion.” 

One issue that Washington didn’t mention in his Address was the practice of slavery.
In fact, he never spoke out publicly against slavery during his lifetime.
However, he did leave a will that would emancipate the 123 slaves he could legally free.
(Under the property laws of the time, although Washington was one of the wealthiest men in America, he was only allowed to free 123 of the 317 slaves at the Mount Vernon plantation.
The other 104 slaves were either owned by his wife’s heirs, or leased from neighbors.)

A George Washington commemorative pitcher from the early 1800s.

By the time Washington died in December of 1799, he’d become a cherished symbol of democracy and unity.
Manufacturers around the world (including in France and China) supplied commemorative items—plates, pitchers, clocks, and jewelry.
American women sewed memorial pillows, quilts, and black armbands.
American artists created paintings and lithographs that were displayed in homes and public places.
His Farewell Address was read aloud from books like Caleb Bingham’s The Columbian Orator.
His widow (Martha) was besieged for locks of his hair.
However, there wasn’t much discussion as to why Washington (after years of talking about it to his family) had finally freed 123 slaves. 

A year later, minister and bookseller Mason Locke Weems published The Life and Memorable Actions of George Washington, and this dubious biography further helped bolster his image.
The 5th edition of Parson Weems’s biography added several apocryphal stories about Washington—for example, the infamous “cherry tree” tale (in which virtuous young George cannot tell a lie).
(Weems also wrote pamphlets against gambling, dueling, and drinking.
It was his opinion that over-the-top language, sentimentality, and colorful anecdotes sold books.)
The Weems article in Collier’s Encyclopedia mentions an estimate that his books sold over a million copies.

George Washington’s birthday (February 22) was proclaimed a holiday in 1885.
In 1971, the holiday was moved to the third Monday in February and some states began to call the day “Presidents Day”—a date when two presidents could be celebrated—Washington and Lincoln.
Reverence for Washington, as the symbol of unity, had nearly evaporated by 1971, and he was replaced with the country’s martyr for unity, President Abraham Lincoln.

A banner of Washington displayed by German American Bund at a 1939 rally at Madison Square Garden.

In the late 1930’s, however, George Washington’s “brand” was still strong enough for the German American Bund to use the first president to put a pro-American veneer on Nazism.
The Bund displayed a giant portrait of Washington (alongside Adolf Hitler) at their events—for example, a February 20, 1939 rally in New York’s Madison Square Garden.
(More than 20,000 attended the event, but as Mayor Fiorello LaGuardia predicted, the rally only discredited the group.)

According to Francois Furstenberg—in his 2006 book In the Name of the Father—Parson Weems’s biography of Washington and Bingham’s The Columbian Orator were both strong influences on young Abraham Lincoln.
These were two of the books that the self-educated man read by candlelight in his log cabin, and they helped to shape his deep dedication to the union. 

Scene from The Day the Earth Stood Still, showing Klaatu (Michael Rennie) visiting the Lincoln Memorial with Bobby (Billy Gray).

In the 1951 film The Day the Earth Stood Still, a visitor from a more-advanced planet, Klaatu (Michael Rennie) visits the Lincoln Memorial with earth boy Bobby Benson (Billy Gray).
The alien ambassador Klaatu is so impressed by the ideals in Lincoln’s “Gettysburg Address” that he decides to lecture to Earth representatives, and delay destroying the planet.

When Washington spoke for unity in his Farewell Address, there was no way for him to predict how much the country would change over the next two centuries and more.
The country already looked very divided to him—with the Federalists favoring a strong central government, and Anti-federalists advocating for states’ rights and a Bill of Rights.
The thirteen colonies were very religiously diverse—especially in the middle colonies.
The societies in the south, the New England states, and the middle states were already becoming vastly different.
(In fact, they may be more similar today than they were then.)
However, no one could predict the growth of giant cities, the fact that the U.S. would change from an enclave to a world power, or the internet.
By the time of the first census in 1790, there were nearly 700,000 slaves in the U.S. (17.8 percent of 3.9 million people).
Only around 80,000 people lived in the three biggest cities (New York, Philadelphia, and Boston) and most Americans lived in small towns and on farms. 

There’s no record of Washington ever saying that racial divisions might split the nation in two.
He and the rest of the Founding Fathers didn’t listen to abolitionists—Lafayette, Dr. Filippo (Philip) Mazzei, Tadeusz (Thaddeus) Kosciuszko, and Thomas Paine, to name a few—who advocated against slavery.
Some Founding Fathers, like Washington, said (privately) in their old age, that they were disgusted by the monetizing of human beings.
(Martha Washington burned all her husband’s letters to her.)

However, these great problem-solvers found that issue unsolvable.

More than two centuries later, it’s still a battle between those who want a strong central government, and those who want states’ rights.
It’s still a clash between those who want an aristocracy of the rich, and those who want democracy.
It’s still a struggle between those who want a caste system, and those who want a society where everyone receives equal respect.

In H.G. Wells’s Outline of History (the “New Democratic Republics” chapter), Wells speculates on the statement (in the Declaration of Independence) that “all men are created equal.”
He says: “All men are not born equal, they are born. . .in an ancient and complex social net.”
He goes on to describe the democratic ideals of the Founding Fathers as the human spirit rebelling against the social net, and exhibiting the belief that we may “achieve a new and better sort of civilization that should also be a community of will.” 

In order to achieve democracy, and a “community of will,” the main thing we need is an acknowledged set of ideals.
Washington served for years as an icon to create a “community of will.”
Abraham Lincoln fulfilled that role after George Washington’s influence ended.
A problem now is that we have no role models “to look up to.”
We’ve given up on political ideals.
Nihilism, cynicism and anarchy will get us nowhere.
Can we maintain a democracy any other way?

*Washington’s Farewell Address wasn’t an oration. The 6,085-word statement was given to Claypoole’s American Daily Advertiser (based in Philadelphia, PA) to be published in its Monday edition. The Address was soon circulated in a dozen other papers, 40 pamphlets, posters in public spaces, and books (like Bingham’s The Columbian Orator) all over the nation.


Sunday, June 25, 2023

Replacing the Real World


Photos of Guy Henry being morphed into Peter Cushing (as Grand Moff Tarkin) for Rogue One: A Star Wars Story.

In 2016, I watched a plasticized (CGI) version of Peter Cushing playing Governor Tarkin in Rogue One: A Star Wars Story—twenty-two years after the great Peter Cushing had died.
Actor Guy Henry was used for Cushing’s body and voice; but the CGI turning him into Cushing wasn’t perfect—especially with skin texture.
I’ve long been a fan of Peter Cushing, so seeing a weird mixture of the two actors on the screen—neither Guy Henry nor Peter Cushing—was very sad.
Ever since Rogue One, I’ve been interested in the idea of film studios using an actor’s persona, years after the actor is no longer able to perform.

I’m not sure how many directors and film studios would say (on the record) that they agree with Director Alfred Hitchcock (1899-1980) that actors are “cattle.”
Actually, Hitchcock said that what he really meant was that actors should be willing to be “utilized and wholly integrated” into a director’s vision.
(He was also quoted as saying that “Walt Disney was smart for making his actors out of paper since he had the luxury to tear them up when he didn’t like them.”)
I’m sure that Hitchcock would have loved CGI, and I think the impulse to create performances using special effects, originates in this viewpoint—that actors are simply “tools,” rather than collaborators.

Actors dying during shoots has thrown a wrench into film productions.
In 1937, 26-year-old Jean Harlow was working on two films—Double Wedding and Saratoga—when she died from a gallbladder infection that became septic.
Double Wedding had just begun, so her footage was discarded, and the role was recast.
However, 90% of Saratoga had been filmed, so all Harlow’s remaining scenes were filmed using two actresses—one for her body (shot from the back), and another for her voice.
44-year-old Tyrone Power had the title role in Solomon and Sheba (1959) when he had a heart attack—hours after a strenuous dueling scene (for which he did eight takes).
Yul Brynner was quickly offered the role of “Solomon.”
The Crow (1994) was about two weeks away from being finished, when young star Brandon Lee was accidentally shot on set.*
A body double (with Lee’s face digitally added), was used to complete the film.
Veteran actor Oliver Reed was filming Gladiator (2000) when he died of a heart attack off set.
Director Ridley Scott altered the plot and used a body double plus CGI to complete filming.
According to IMDb, dealing with Reed’s death added about $3 million to the film’s $100 million budget.

Before CGI, genre actors (like Boris Karloff and Bela Lugosi) endured long, painful hours in make-up chairs—endangering their health, and (perhaps) making old age harder to endure.
Today, in the Avatar movies, “performance capture”—green dots on the actors’ facial muscles—is used to turn actors into the Na’vi race.
Buddy Ebsen was set to play the “Tin Man” in 1939’s The Wizard of Oz, but—after aluminum dust in the facial make-up made him seriously ill—he lost the role.

Kim Hunter in the make-up chair for the first Planet of the Apes film (filmed in 1967).

Kim Hunter was so claustrophobic that she found the application of prosthetics intolerable.
She needed a daily Valium to play intelligent ape “Zira” in 1968’s Planet of the Apes, 1970’s Beneath the Planet of the Apes, and 1971’s Escape from the Planet of the Apes.

Alan Cumming, with a make-up artist, during filming of X2: X-Men United. (The character "Nightcrawler" decorated his blue skin with angelic tattoos.)

In the early days of CGI (2003), Alan Cumming was “Nightcrawler” in X2: X-Men United.
Cummings still sat in a make-up chair for five hours per day.
However, removing the caustic blue facial makeup was painful, and injured his skin. 


Christian Bale in 2002 (before losing weight for 2004’s The Machinist), and after losing 63 pounds for the role.

Besides turning actors into fantastic creatures, CGI also changes the appearance of actors in less extreme ways—making them look older, younger, heavier, or thinner.
Film franchises are filmed over long periods of time.
If special effect artists didn’t de-age or age the actors, the directors would need to recast.
Robert DeNiro gained sixty pounds to play an elderly “Jake LaMotta” in 1980’s Raging Bull.
Christian Bale lost sixty-three pounds for 2004’s The Machinist.
(A little CGI would have gone a long way toward sparing these actors from gaining, or losing, so many dangerous pounds.)


Old effects in original Star Trek episode "The Doomsday Machine" on the left, and redone effects on the right.

Another issue about special effects is that they are always improving.
In 2006, Paramount redid all the special effects in the original Star Trek TV series—replacing star ship miniatures and painted backdrops with CGI.
The 79 episodes looked great for a while, but now (in 2023) the revised effects look quaint.
Should artists update the effects in Star Trek every ten years, in order to keep up with the technology?
It’s unlikely that Paramount would make back the money.

Both the real and "reel" Barbara Jean Trenton (Ida Lupino) react to someone entering her private screening room in “The Sixteen-Millimeter Shrine.”

Writers have long been fascinated by the idea of living on in the artificial world of movies (be they celluloid or digital).
In the 1959 Twilight Zone TV episode “The Sixteen-Millimeter Shrine” Ida Lupino played “Barbara Jean Trenton”—a famous actress who chooses existence in her celluloid past.
In the 2013 film The Congress, a film studio buys Robin Wright’s acting persona, and uses the digital “Robin” for a science fiction franchise called Rebel Robot Robin.
(In a scene in The Congress, “Robin,” her lawyer, her agent, and the studio CEO debate whether her contract should allow science-fiction films.)
True to the Isaac Asimov books, the main character of Apple TV+’s Foundation (2021-2023) is a digital recreation—Dr. Hari Seldon (with Jared Harris playing both the real and digital Dr. Seldon).
In this genre TV series, mathematician Dr. Seldon dies in the year 12,069, but 34 years after his death—and for generations after—his digitized recreation interacts with his followers.
(Season Two of Foundation will premiere on July 14th, 2023.)

In summary, I think it’s fine to use technology to modify actors, or use them in roles they’ve chosen...as long as it’s with their consent, and the limitations are spelled out.
Using CGI and performance capture saves performers from spending hours in make-up chairs, and eliminates the health risks of putting on extra pounds, or dieting to starvation.
It’s also OK (with the consent of the heirs), to complete scenes in a movie, in the event that the actor is incapacitated.
(In that situation, I would hope that the heirs had final approval on the new scenes.)
However, I would draw the line with using digital images of an actor, to “create a role,” or to sell a product, after an actor has lost the ability to give consent.
Some actors may choose to sign contracts before they pass on, but I believe the wonderful Robin Williams had the right idea.
Williams made sure legally that his likeness could not be used until 2039—twenty-five years after his death.
100 years would be preferable.

*According to The Hollywood Book of Death, by James Robert Parish, “when the gun was reloaded after the close-up shot [of Brandon Lee], the metal tip had remained behind the gun’s cylinder. When the blank went off, it was speculated, the explosive force propelled the dummy tip through the gun barrel and lodged it in Brandon’s body near his spine.” (Brandon Lee was the only son of martial artist and film actor, Bruce Lee.)


Saturday, June 17, 2023

Censorship—Rearranging the Deck Chairs on the Societal Titanic

I was born in the early 1950’s—known as an era of conformity.
Yet, I was allowed to check out books on the adult floor (with my father’s library card) from middle school onward.
I remember checking out Ralph Ellison’s 1952 book Invisible Man—confusing it with H.G. Wells’s science fiction classic—and being so distressed at one scene that I felt the book was burning my hands.
A few years later, I checked out James Joyce’s Ulysses (an under-the-counter book), and found Joyce as difficult to comprehend as Ellison.
Reading books that are “too sophisticated” doesn’t contaminate children, or make them grow up faster.
When you have no context to understand content, it simply goes over your head.
Your primary beliefs about life are instilled by your parents, and (although neither of my parents graduated from college), my parents always encouraged me to read. 

Another book that confused me was our Family Bible.
What could “her flowers be upon him” (Leviticus 15:24) possibly be describing?
Why did men have multiple wives?
Why was it so wrong for a woman to stop two men from fighting by touching a man’s “secrets” (Deuteronomy 25:11)?
What could Ham seeing Noah’s “nakedness” mean?
Actually, I don’t recall asking my parents for answers to these questions.
I was too embarrassed.

During my high school years, our textbooks didn't contain much material about the Reconstruction era, slavery, or even why the U.S. was fighting the Vietnam War.
My high school history department filled in the gaps.
One of my history teachers told us about Ku Klux Klan activity in northwestern Indiana (when he was a child), and lynching.
Another teacher talked about discrimination, and why using derogatory terms for ethnic groups was wrong.
The head of the department explained that the U.S. was fighting the Vietnam War for economic reasons, not just to prevent the spread of communism.
(Thank you, gentlemen of the history department.
You weren’t afraid of doing your jobs, and preparing your students for life.) 

The Florida state legislature’s “Stop Woke Act” states that media specialists should avoid any material that may provoke feelings of “guilt, anguish, or other forms of psychological distress” in children.
However, there’s no accompanying research proving that children experienced guilt after reading Florida textbooks, or that this “distress” would damage children.
Why would learning about discrimination, or Black history, inspire feelings of distress?
Obviously, the Florida law is just a smoke screen for trying to control societal change.


Cover of Grimm's Fairy Tales, published around 1922.

The original children’s stories—fairy tales—were designed to cause some “psychological distress” in children.
In “Hansel and Gretel,” a witch is baked alive in an oven.
In the original ending of “Little Red Riding Hood,” both the grandmother, and the child, are eaten by the wolf.
Until Disney Studios created their versions of “The Little Mermaid,” the mermaid always died at the end.
There’s a long history of using folk and fairy stories to both entertain children, and teach moral lessons.
However, after Maurice Sendak’s Where the Wild Things Are was published in 1963, it took two years before public libraries would place the picture book on their shelves.
Although the book was about children mastering feelings of jealousy and fear, media specialists considered the book too scary for children.

Until the late 1700s, children—poor children, at least—were just considered inexpensive sources of labor.
Then, around 1790, the Romantic movement began, and the Western world began to view children as “pure and untainted beings”—at least those lucky enough to be born in wealthier families.
It’s in this era that brother and sister, Charles and Mary Lamb,* co-authored Tales from Shakespeare (1807).
This collection of twenty stories—derived from twenty Shakespeare plays—was intended to be “appropriate for young people.”
The first edition sold out, and it’s been in print ever since.
Kathy Watson’s biography of Mary Lamb states that when there were plot issues “that might seem indecent for young people, she [Mary] simply changed them.”
(Charles and Mary Lamb never found life partners, and lived in “double singleness” for most of their lives.)

During the early 1800’s (says Kathy Watson), children’s literature was an “interesting battlefield” for “philosophers, churchmen, teachers, and parents.”
It was a battle between “romanticists” (like the Lambs) and “educationalists,” like Sarah Trimmer (editor of The Guardian of Education, a periodical published from 1802-1806).
Trimmer mistrusted fairytales—because they were frightening and worked “too powerfully upon the feelings of the mind.”
This “media specialist” was especially disdainful of “Cinderella,” because that story “encouraged a disturbing love of finery.”

Cover of Frederic Wertham’s Seduction of the Innocent, a British edition.

Almost 150 years later, psychiatrist Frederic Wertham (1895-1981) wrote Seduction of the Innocent.
His theory was that seeing violence and sexuality in comic books caused delinquency in children.
(His ideas resulted in the Comics Code Authority.)
I wonder what Dr. Wertham would think of the Disney Channel, or of the fact that superhero films are now the top film franchises worldwide.

Cover of a British edition of Fahrenheit 451, by Ray Bradbury.

Any discussion of censorship wouldn’t be complete without mentioning Ray Bradbury’s book Fahrenheit 451—the science fiction classic about people fighting a totalitarian government that sets fire to libraries and suppresses ideas.
(The title refers to the temperature at which book paper catches fire.)
A passage reads: 

So now do you see why books are hated and feared? They show the pores in the face of life. The comfortable people want only wax moon faces, poreless, hairless, expressionless. We are living in a time when flowers are trying to live on flowers, instead of growing on good rain and black loam. Even fireworks, for all their prettiness, come from the chemistry of the earth. Yet somehow we think we can grow, feeding on flowers and fireworks, without completing the cycle back to reality. . .

Those who want to “protect societal values” by not reading books, and those who want to “promote inclusivity” by deleting words from books, are both rearranging deck chairs on the Titanic.
According to the American Library Association, there were 1,269 demands to censor library books in 2022—the highest number in over 20 years, and nearly double the 2021 demands.
Publishers are changing the word “fat” to “enormous” in Charlie and the Chocolate Factory—instead of just adding a good explanation of Roald Dahl’s “world view” in the front matter.
At the same time all this censorship is going on, several states have changed voting laws so fewer people can vote, and a United Nations report (June 12, 2023) stated that there has been “no improvement in biases against women in a decade.”
Changing a few words in classic books, and banning “progressive” books, will not create societal change.
In order to make real change, we need to look elsewhere.

*Mary Lamb (1764-1847) suffered from mental illness. In 1796—ten years before she co-wrote Tales from Shakespeare—Mary stabbed her mother to death in a fit of rage. Her brother Charles became her caretaker, and she was his housekeeper, for most of her life. Two biographies—Kathy Watson’s The Devil Kissed Her, and Susan Tyler Hitchcock’s Mad Mary Lamb—tell her story.

Saturday, June 10, 2023

“Woke” or “Anti-woke”: What Does ChatGPT Say?

I wanted to clarify (in my own mind) what it means to be “woke” or “anti-woke,” and how censorship relates to both concepts.
While it’s hard to agree on what these words describe, many are convinced that they’re either one or the other.
First, I wrote about 500 words.
Then, I downloaded ChatGPT, to see where the app took me.
The ChatGPT text was as dry and colorless as expected.
Moreover, the essays were full of platitudes, and contained falsehoods.
To my surprise, though, ChatGPT was still useful: it offered some ideas that I hadn’t thought of.

When I asked ChatGPT to define “woke,” I received an essay saying (in part) that:

To be woke signifies a deep commitment to social consciousness and active engagement with systemic injustices. . . and a willingness to challenge the status quo. 

The significance of being woke lies in its potential to bring about meaningful change, fostering a more inclusive, equitable, and just society for all.

When I asked ChatGPT for a definition of “anti-woke,” I received an essay saying that:

Being anti-woke represents a rejection of what critics perceive as the dogmatic and stifling aspects of the woke ideology.

It emphasizes individual responsibility, intellectual diversity, and the importance of free thought and expression.

While acknowledging the existence of systemic injustices, the anti-woke movement encourages a more nuanced and balanced approach to social issues . . . and open dialogue.

When I asked whether censorship was an issue on the political right or left, I received an essay saying that:

Censorship is not inherently a political right or left issue, but rather a complex and multifaceted dilemma that intersects with various ideologies.

While the political right may often advocate for censorship to protect societal values, the political left may prioritize censorship as a tool to combat discrimination and promote inclusivity. 

The ChatGPT response to my request for an essay on “culture wars” was mere gobbledygook.
It ended with the sentence: “Only through a commitment to mutual respect and a willingness to confront these issues head-on can the United States move towards healing and reconciliation.”
Well, that’s completely useless!

I would disagree with the ChatGPT definition of “woke” in several ways.
“To be woke” doesn’t always mean “a commitment to changing the status quo.”
Often, it just means being a bit more open to societal change.
“Woke” people are usually more open to erasing words like “master bedroom” from their vocabularies, using personal pronouns in their email signatures, and being more aware of microaggressions.
Often, it only means that the “woke” are more willing to face uncomfortable information, and learn from history.

I would also argue with the ChatGPT definition of “anti-woke.”
While “being woke” is perceived by the anti-woke as dogmatic, it’s difficult to figure out which beliefs are actually in contention.
It’s as if the perceived attitude of self-satisfaction among the woke is more distressing than their actual ideas.
“Collective guilt” and “cancel culture” came up in the ChatGPT essay, but I’m sure that only a small percentage of “the woke” feel guilt.
Further, the woke are more likely than the anti-woke to cancel people on their own side.
(Think of comedian Kathy Griffin and former Senator Al Franken.)
I also wonder what percentage of the anti-woke “acknowledge the existence of systemic injustices,” or desire an “open dialogue” (as suggested by ChatGPT).
Overall, being anti-woke may only mean that you are unhappy with the speed of (or the existence of) societal change, or that you find “woke” people annoyingly self-righteous.

I was very happy with the ChatGPT response on censorship.
Saying that the political right wants to “protect societal values,” while the political left wants to “combat discrimination and promote inclusivity” just about sums it up.
However, everyone has their own thoughts about what our societal values should be, which words are good or bad in promoting inclusivity, and whether “words” are important in this task.

Front cover for the paperback version of Casino Royale by Ian Fleming (published under the name You Asked for It by Popular Library in 1953).

Back cover for You Asked for It.
President John F. Kennedy was a big fan of the James Bond spy-thrillers (oddly called Jimmy Bond on this back cover).
However, JFK likely read the hardcover versions.

In order to “promote inclusivity,” the publisher of the late Roald Dahl recently produced two different versions of James and the Giant Peach—changing “Cloud-men” to “Cloud-people” (among other changes) in their Puffin version—and keeping “Cloud-men” in the classic Penguin version.
The spy-thrillers of Ian Fleming, and the mysteries of Agatha Christie, underwent a similar process.

Lobby card for Gone with the Wind with house servant Mammy (Hattie McDaniel) tying the girdle (or stays) of Scarlett O’Hara (Vivien Leigh). Hattie McDaniel received an Oscar for Best Actress in a Supporting Role, for playing Mammy.

Combating racial, and other types of discrimination, through “sanitizing,” or even cancelling works, isn’t new.
I remember debates in the 1970’s about whether 1939’s Gone with the Wind should be banned.
Disney’s 1946 blend of live-action and animation, Song of the South,* isn’t considered “appropriate in today’s world,” and hasn’t been seen on home video legally since 1986.
Some Warner Brothers cartoons (like “Herr Meets Hare” and “Daffy - The Commando,” produced as propaganda between 1941 and 1945) were restored and rereleased—along with a lengthy disclaimer—in 2008, in Volume 6 of the Looney Tunes Golden Collection.
However, some of the more racially-insensitive 1930’s and World War II cartoons (for example, “Tokio Jokio”) will likely never see the light of day—at least, legally.

Meantime—in order to “protect societal values”—U.S. school boards are removing classic children’s books (like Charlotte’s Web and A Wrinkle in Time) from their school library shelves.
(I mention Charlotte’s Web and A Wrinkle in Time because these were two of my favorites.)
I looked up why one parent group proposed removing 1952’s Charlotte’s Web: the parents objected to characters dying, and thought that “talking animals” were “disrespectful to God.”
A Wrinkle in Time (1962) was criticized for “promoting witchcraft.”
I have fond memories of both books.
I remember my 4th grade school teacher, Mrs. Simmons, reading Charlotte’s Web aloud to us.
(I adored Mrs. Simmons.)
I checked out A Wrinkle in Time from our public library during the 1960’s, and ended up reading every other book I could find by Madeleine L’Engle.

Is it “woke” to buy a children’s book like 2005’s And Tango Makes Three—a story about two male penguins who help raise a chick together—in order to foster a more inclusive society?
Is it “anti-woke” to ask that And Tango Makes Three be removed from your public library, so that children won’t be influenced to accept homosexuality as normal?
In the end, I agree that parents have the right to keep their own children from reading certain books—but not the right to deny librarian-approved books to everyone else.

Uncle Remus and Brer Rabbit cover.
It’s believed that Beatrix Potter based her Peter Rabbit stories on Uncle Remus.

*Song of the South was based on the once well-known Uncle Remus stories. The folklorist/author was Joel Chandler Harris (1848-1908), a white journalist. Harris wrote down the Br’er Rabbit and Br’er Fox tales after listening to African folk tales told by former slaves—primarily, George Terrell. According to the Atlanta Journal Constitution (11/2/2006), Disney Studios purchased the film rights for Song of the South from the Harris family in 1939, for $10,000—the equivalent of about $218,246.76 today.

What You Liked Best