Sunday, July 2, 2023

Trying to Make a Dream “Work”

July 4th (this country’s “birthday”) falls this coming Tuesday, and the divisions in this country are a main topic of discussion. 
We’re careful not to discuss politics at work, or when getting a haircut, because we don’t want conflict.
Sometimes, we don’t look forward to big parties where we might meet “unfamiliar” people who we don’t wish to offend.
Most of us get all our news from specific websites and TV channels, and seldom use others.
A few of us long for the days when it seemed that everyone was on the same page, but that time never existed.
Leading the U.S. has often felt like herding cats.
It certainly felt that way to George Washington.

When George Washington (father of our country) became the first president in 1789, there were only 11 states.
Two of the original thirteen colonies (North Carolina and Rhode Island) hadn’t officially joined the union.
Vermont was toying with joining Canada.
Tennessee was still western territory claimed by North Carolina, and Kentucky was still a district of Virginia.
Great Britain was still holding onto most of the fur trading posts around the Great Lakes.
The eleven states were at odds over issues in the Constitution, navigation rights involving Spain, and their own borders.
By the end of Washington’s second term, 16 states had accepted the Constitution, and the trading posts were in American hands.

In George Washington’s Farewell Address, issued* in 1796 as he prepared to retire as the first U.S. President, he said that he hoped our “union and brotherly affection may be perpetual” and “that the free constitution, which is the work of your hands, may be sacredly maintained.”
Essentially, Washington was warning the young nation (white men over 16) that only unity could prevent them from splintering into many parts, and from “cunning, ambitious, and unprincipled men” usurping “the very engines, which have lifted them to unjust dominion.” 

One issue that Washington didn’t mention in his Address was the practice of slavery.
In fact, he never spoke out publicly against slavery during his lifetime.
However, he did leave a will which would emancipate the 123 slaves that he could legally free.
(Under the property laws of the time, although Washington was one of the wealthiest men in America, he was only allowed to free 123 of the 317 slaves at the Mount Vernon plantation.
The other 194 slaves were either owned by his wife’s heirs or leased from neighbors.)

A George Washington commemorative pitcher from the early 1800s.

By the time Washington died in December of 1799, he’d become a cherished symbol of democracy and unity.
Manufacturers around the world (including in France and China) supplied commemorative items—plates, pitchers, clocks, and jewelry.
American women sewed memorial pillows, quilts, and black armbands.
American artists created paintings and lithographs that were displayed in homes and public places.
His Farewell Address was read aloud from books like Caleb Bingham’s The Columbian Orator.
His widow (Martha) was besieged for locks of his hair.
However, there wasn’t much discussion as to why Washington (after years of talking about it to his family) had finally freed 123 slaves. 

A year later, minister and bookseller Mason Locke Weems published The Life and Memorable Actions of George Washington, and this dubious biography further helped bolster his image.
The fifth edition of Parson Weems’s biography added several apocryphal stories about Washington—for example, the infamous “cherry tree” tale (in which virtuous young George cannot tell a lie).
(Weems also wrote pamphlets against gambling, dueling, and drinking.
It was his opinion that over-the-top language, sentimentality, and colorful anecdotes sold books.)
The article on Weems in Collier’s Encyclopedia mentions an estimate that his books sold over a million copies.

George Washington’s birthday (February 22) was proclaimed a holiday in 1885.
In 1971, the holiday was moved to the third Monday in February and some states began to call the day “Presidents Day”—a date when two presidents could be celebrated—Washington and Lincoln.
Reverence for Washington, as the symbol of unity, had nearly evaporated by 1971, and he was replaced by the country’s martyr for unity, President Abraham Lincoln. 

A banner of Washington displayed by German American Bund at a 1939 rally at Madison Square Garden.

In the late 1930’s, however, George Washington’s “brand” was still strong enough for the German American Bund to use the first president to put a pro-American veneer on Nazism.
The Bund displayed a giant portrait of Washington (alongside Adolf Hitler) at their events—for example, a February 20, 1939 rally in New York’s Madison Square Garden.
(More than 20,000 attended the event, but as Mayor Fiorello LaGuardia predicted, the rally only discredited the group.)

According to François Furstenberg—in his 2006 book In the Name of the Father—Parson Weems’s biography of Washington and Bingham’s The Columbian Orator were both strong influences on young Abraham Lincoln.
These were two of the books that the self-educated man read by candlelight in his log cabin, and they helped to shape his deep dedication to the union. 

Scene from The Day the Earth Stood Still, showing Klaatu (Michael Rennie) visiting the Lincoln Memorial with Bobby (Billy Gray).

In the 1951 film The Day the Earth Stood Still, a visitor from a more advanced planet, Klaatu (Michael Rennie), visits the Lincoln Memorial with earth boy Bobby Benson (Billy Gray).
The alien ambassador Klaatu is so impressed by the ideals in Lincoln’s “Gettysburg Address” that he decides to lecture Earth’s representatives, and delay destroying the planet.

When Washington spoke for unity in his Farewell Address, there was no way for him to predict how much the country would change over the next two centuries and more.
The country already looked very divided to him—with the Federalists favoring a strong central government, and Anti-federalists advocating for states’ rights and a Bill of Rights.
The thirteen colonies were very religiously diverse—especially in the middle colonies.
The societies in the south, the New England states, and the middle states were already becoming vastly different.
(In fact, they may be more similar today than they were then.)
However, no one could predict the growth of giant cities, the fact that the U.S. would change from an enclave to a world power, or the internet.
By the first census in 1790, there were nearly 700,000 slaves in the U.S. (17.8 percent of its 3.9 million people).
Only around 80,000 people lived in the three biggest cities (New York, Philadelphia, and Boston) and most Americans lived in small towns and on farms. 

There’s no record of Washington ever saying that racial divisions might split the nation in two.
He and the rest of the Founding Fathers didn’t listen to abolitionists—Lafayette, Dr. Filippo (Philip) Mazzei, Tadeusz (Thaddeus) Kosciuszko, and Thomas Paine, to name a few—who advocated against slavery.
Some Founding Fathers, like Washington, said (privately) in their old age that they were disgusted by the monetizing of human beings.
(Martha Washington burned all her husband’s letters to her.)

However, these great problem-solvers found that issue unsolvable.

More than two centuries later, it’s still a battle between those who want a strong central government and those who want states’ rights. It’s still a clash between those who want an aristocracy of the rich and those who want democracy. It’s still a struggle between those who want a caste system and those who want a society where everyone receives equal respect.

In H.G. Wells’s Outline of History (the “New Democratic Republics” chapter), Wells speculates on the statement (in the Declaration of Independence) that “all men are created equal.”
He says: “All men are not born equal, they are born. . .in an ancient and complex social net.”
He goes on to describe the democratic ideals of the Founding Fathers as the human spirit rebelling against the social net, and exhibiting the belief that we may “achieve a new and better sort of civilization that should also be a community of will.” 

In order to achieve democracy, and a “community of will,” the main thing we need is an acknowledged set of ideals.
Washington served for years as an icon used to create a “community of will.”
Abraham Lincoln fulfilled that role after George Washington’s influence ended.
A problem now is that we have no role models “to look up to.”
We’ve given up on political ideals.
Nihilism, cynicism and anarchy will get us nowhere.
Can we maintain a democracy any other way?

*Washington’s Farewell Address wasn’t an oration. The 6,085-word statement was given to Claypoole’s American Daily Advertiser (based in Philadelphia, PA) to be published in its Monday edition. The Address was soon circulated in a dozen other papers, 40 pamphlets, posters in public spaces, and books (like Bingham’s The Columbian Orator) all over the nation.


Sunday, June 25, 2023

Replacing the Real World


Photos of Guy Henry being morphed into Peter Cushing (as Grand Moff Tarkin) for Rogue One: A Star Wars Story.

In 2016, I watched a plasticized (CGI) version of Peter Cushing playing Governor Tarkin in Rogue One: A Star Wars Story—twenty-two years after the great Peter Cushing had died.
Actor Guy Henry provided Cushing’s body and voice, but the CGI turning him into Cushing wasn’t perfect—especially the skin texture.
I’ve long been a fan of Peter Cushing, so seeing a weird mixture of the two actors on the screen—neither Guy Henry nor Peter Cushing—was very sad.
Ever since Rogue One, I’ve been interested in the idea of film studios using an actor’s persona years after the actor is no longer able to perform.

I’m not sure how many directors and film studios would say (on the record) that they agree with Director Alfred Hitchcock (1899-1980) that actors are “cattle.”
Actually, Hitchcock said that what he really meant was that actors should be willing to be “utilized and wholly integrated” into a director’s vision.
(He was also quoted as saying that “Walt Disney was smart for making his actors out of paper since he had the luxury to tear them up when he didn’t like them.”)
I’m sure that Hitchcock would have loved CGI, and I think the impulse to create performances using special effects originates in this viewpoint—that actors are simply “tools,” rather than collaborators.

An actor’s death during a shoot throws a wrench into a film production.
In 1937, 26-year-old Jean Harlow was working on two films—Double Wedding and Saratoga—when she died from a gallbladder infection that became septic.
Double Wedding had just begun, so her footage was discarded, and the role was recast.
However, 90% of Saratoga had been filmed, so all Harlow’s remaining scenes were filmed using two actresses—one for her body (shot from the back), and another for her voice.
44-year-old Tyrone Power had the title role in Solomon and Sheba (1959) when he had a heart attack—hours after a strenuous dueling scene (for which he did eight takes).
Yul Brynner was quickly offered the role of “Solomon.”
The Crow (1994) was about two weeks away from being finished when young star Brandon Lee was accidentally shot on set.*
A body double (with Lee’s face digitally added), was used to complete the film.
Veteran actor Oliver Reed was filming Gladiator (2000) when he died of a heart attack off set.
Director Ridley Scott altered the plot, and used a body double plus CGI, to complete filming.
According to IMDb, dealing with Reed’s death added about $3 million to the film’s $100 million budget.

Before CGI, genre actors (like Boris Karloff and Bela Lugosi) endured long, painful hours in make-up chairs—endangering their health, and (perhaps) making old age harder to endure.
Today, in the Avatar movies, “performance capture”—green dots on the actors’ facial muscles—is used to turn actors into the Na’vi race.
Buddy Ebsen was set to play the “Tin Man” in 1939’s The Wizard of Oz, but—after aluminum dust in the facial make-up made him seriously ill—he lost the role.

Kim Hunter in the make-up chair for the first Planet of the Apes film (released in 1968).

Kim Hunter was so claustrophobic that she found the application of prosthetics intolerable. She needed a daily Valium to play intelligent ape “Zira” in 1968’s Planet of the Apes, 1970’s Beneath the Planet of the Apes, and 1971’s Escape from the Planet of the Apes.

Alan Cumming, with a make-up artist, during filming of X2: X-Men United. (The character "Nightcrawler" decorated his blue skin with angelic tattoos.)

In the early days of CGI (2003), Alan Cumming was “Nightcrawler” in X2: X-Men United.
Cumming still sat in a make-up chair for five hours per day.
However, removing the caustic blue facial makeup was painful, and injured his skin. 


Christian Bale in 2002 (before losing weight for 2004’s The Machinist), and after losing 63 pounds for the role.

Besides turning actors into fantastic creatures, CGI also changes the appearance of actors in less extreme ways—making them look older, younger, heavier, or thinner.
Film franchises are filmed over long periods of time.
If special effect artists didn’t de-age or age the actors, the directors would need to recast.
Robert De Niro gained sixty pounds to play an aging “Jake LaMotta” in 1980’s Raging Bull.
Christian Bale lost sixty-three pounds for 2004’s The Machinist.
(A little CGI would have gone a long way toward sparing these actors from gaining or losing so many dangerous pounds.) 


Old effects in original Star Trek episode "The Doomsday Machine" on the left, and redone effects on the right.

Another issue about special effects is that they are always improving.
In 2006, Paramount redid all the special effects in the original Star Trek TV series—replacing starship miniatures and painted backdrops with CGI.
The 79 episodes looked great for a while, but now (in 2023) the revised effects look quaint.
Should artists update the effects in Star Trek every ten years, in order to keep up with the technology?
It’s unlikely that Paramount would make back the money.

Both the real and "reel" Barbara Jean Trenton (Ida Lupino) react to someone entering her private screening room in “The Sixteen-Millimeter Shrine.”

Writers have long been fascinated by the idea of living on in the artificial world of movies (be they celluloid or digital).
In the 1959 Twilight Zone TV episode “The Sixteen-Millimeter Shrine,” Ida Lupino played “Barbara Jean Trenton”—a famous actress who chooses existence in her celluloid past.
In the 2013 film The Congress, a film studio buys Robin Wright’s acting persona, and uses the digital “Robin” for a science fiction franchise called Rebel Robot Robin.
(In a scene in The Congress, “Robin,” her lawyer, her agent, and the studio CEO debate whether her contract should allow science-fiction films.)
True to the Isaac Asimov books, the main character of the Apple TV+ adaptation of Asimov’s Foundation (2021-2023) is a digital recreation—Dr. Hari Seldon (with Jared Harris playing both the real and the digital Dr. Seldon).
In this genre TV series, mathematician Dr. Seldon dies in the year 12,069, but 34 years after his death—and for generations after—his digitized recreation interacts with his followers.
(Season Two of Foundation will premiere on July 14th, 2023.)

In summary, I think it’s fine to use technology to modify actors, or use them in roles they’ve chosen...as long as it’s with their consent, and the limitations are spelled out.
Using CGI and performance capture saves performers from spending hours in make-up chairs, and eliminates the health risks of gaining extra pounds or dieting to the point of starvation.
It’s also OK (with the consent of the heirs), to complete scenes in a movie, in the event that the actor is incapacitated.
(In that situation, I would hope that the heirs had final approval on the new scenes.)
However, I would draw the line at using digital images of an actor to “create a role,” or to sell a product, after the actor has lost the ability to give consent.
Some actors may choose to sign contracts before they pass on, but I believe the wonderful Robin Williams had the right idea.
Williams made sure legally that his likeness could not be used until 2039—twenty-five years after his death.
100 years would be preferable.

*According to The Hollywood Book of Death, by James Robert Parish, “when the gun was reloaded after the close-up shot [of Brandon Lee], the metal tip had remained behind the gun’s cylinder. When the blank went off, it was speculated, the explosive force propelled the dummy tip through the gun barrel and lodged it in Brandon’s body near his spine.” (Brandon Lee was the only son of martial artist and film actor, Bruce Lee.)


Saturday, June 17, 2023

Censorship—Rearranging the Deck Chairs on the Societal Titanic

I was born in the early 1950’s—known as an era of conformity.
Yet, I was allowed to check out books on the adult floor (with my father’s library card) from middle school onward.
I remember checking out Ralph Ellison’s 1952 book Invisible Man—confusing it with H.G. Wells’s science fiction classic—and being so distressed at one scene that I felt the book was burning my hands.
A few years later, I checked out James Joyce’s Ulysses (an under-the-counter book), and found James Joyce as difficult to comprehend as Ralph Ellison.
Reading books that are “too sophisticated” doesn’t contaminate children, or make them grow up faster.
When you have no context to understand content, it simply goes over your head.
Your primary beliefs about life are instilled by your parents, and although neither of mine graduated from college, my parents always encouraged me to read. 

Another book that confused me was our Family Bible.
What could “her flowers be upon him” (Leviticus 15:24) possibly be describing?
Why did men have multiple wives?
Why was it so wrong for a woman to stop two men from fighting by touching a man’s “secrets” (Deuteronomy 25:11)?
What could Ham seeing Noah’s “nakedness” mean?
Actually, I don’t recall asking my parents for answers to these questions.
I was too embarrassed.

During my high school years, our textbooks didn't contain much material about the Reconstruction era, slavery, or even why the U.S. was fighting the Vietnam War.
My high school history department filled in the gaps.
One of my history teachers told us about Ku Klux Klan activity in northwestern Indiana (when he was a child), and lynching.
Another teacher talked about discrimination, and why using derogatory terms for ethnic groups was wrong.
The head of the department explained that the U.S. was fighting the Vietnam War for economic reasons, not just to prevent the spread of communism.
(Thank you, gentlemen of the history department.
You weren’t afraid of doing your jobs, and preparing your students for life.) 

The Florida state legislature’s “Stop Woke Act” states that media specialists should avoid any material that may provoke feelings of “guilt, anguish, or other forms of psychological distress” in children.
However, there’s no accompanying research proving that children experienced guilt after reading Florida textbooks, or that this “distress” would damage children.
Why would learning about discrimination, or Black history, inspire feelings of distress?
Obviously, the Florida law is just a smoke screen for trying to control societal change.


Cover of Grimm's Fairy Tales, published around 1922.

The original children’s stories—fairy tales—were designed to cause some “psychological distress” in children.
In “Hansel and Gretel,” a witch is baked alive in an oven.
In the original ending of “Little Red Riding Hood,” both the grandmother, and the child, are eaten by the wolf.
Until Disney Studios created their versions of “The Little Mermaid,” the mermaid always died at the end.
There’s a long history of using folk and fairy stories to both entertain children, and teach moral lessons.
However, after Maurice Sendak’s Where the Wild Things Are was published in 1963, it took two years before public libraries would place the picture book on their shelves.
Although the book was about children mastering feelings of jealousy and fear, media specialists considered the book too scary for children.

Until the late 1700s, children—poor children, at least—were just considered inexpensive sources of labor.
Then, around 1790, the Romantic movement began, and the Western world began to view children as “pure and untainted beings”—at least those lucky enough to be born in wealthier families.
It’s in this era that brother and sister, Charles and Mary Lamb,* co-authored Tales from Shakespeare (1807). This collection of twenty stories—derived from twenty Shakespeare plays—was intended to be “appropriate for young people.” The first edition sold out, and it’s been in print ever since. Kathy Watson’s biography of Mary Lamb states that when there were plot issues “that might seem indecent for young people, she [Mary] simply changed them.”
(Charles and Mary Lamb never found life partners, and lived in “double singleness” for most of their lives.)

During the early 1800’s (says Kathy Watson), children’s literature was an “interesting battlefield” for “philosophers, churchmen, teachers, and parents.”
It was a battle between “romanticists” (like the Lambs) and “educationalists,” like Sarah Trimmer (editor of The Guardian of Education, a periodical published from 1802-1806).
Trimmer mistrusted fairytales—because they were frightening and worked “too powerfully upon the feelings of the mind.”
This “media specialist” was especially disdainful of “Cinderella,” because that story “encouraged a disturbing love of finery.”

Cover of Frederic Wertham’s Seduction of the Innocent, a British edition.

Almost 150 years later, psychiatrist Frederic Wertham (1895-1981) wrote Seduction of the Innocent.
His theory was that seeing violence and sexuality in comic books caused delinquency in children.
(His ideas resulted in the Comics Code Authority.)
I wonder what Dr. Wertham would think of the Disney Channel, or of the fact that superhero films are the top film franchises worldwide.

Cover of a British edition of Fahrenheit 451, by Ray Bradbury.

Any discussion of censorship wouldn’t be complete without mentioning Ray Bradbury’s book Fahrenheit 451—the science fiction classic about people resisting a totalitarian government that burns libraries and suppresses ideas.
(The title refers to the temperature at which book paper catches fire.)
A passage reads: 

So now do you see why books are hated and feared? They show the pores in the face of life. The comfortable people want only wax moon faces, poreless, hairless, expressionless. We are living in a time when flowers are trying to live on flowers, instead of growing on good rain and black loam. Even fireworks, for all their prettiness, come from the chemistry of the earth. Yet somehow we think we can grow, feeding on flowers and fireworks, without completing the cycle back to reality. . .

Those who want to “protect societal values” by not reading books, and those who want to “promote inclusivity” by deleting words from books, are both rearranging deck chairs on the Titanic.
According to the American Library Association, there were 1,269 demands to censor library books in 2022—the highest number in over 20 years, and nearly double the 2021 demands.
Publishers are changing the word “fat” to “enormous” in Charlie and the Chocolate Factory—instead of just adding a good explanation of Roald Dahl’s “world view” in the front matter.
At the same time all this censorship is going on, several states have changed voting laws so fewer people can vote, and a United Nations report (June 12, 2023) stated that there has been “no improvement in biases against women in a decade.”
Changing a few words in classic books, and banning “progressive” books, will not create societal change.
In order to make real change, we need to look elsewhere.

*Mary Lamb (1764-1847) suffered from mental illness. In 1796—ten years before she co-wrote Tales from Shakespeare—Mary stabbed her mother to death in a fit of rage. Her brother Charles became her caretaker, and she was his housekeeper, for most of her life. Two biographies—Kathy Watson’s The Devil Kissed Her, and Susan Tyler Hitchcock’s Mad Mary Lamb—tell her story.

Saturday, June 10, 2023

“Woke” or “Anti-woke”: What Does ChatGPT Say?

I wanted to clarify (in my own mind) what it means to be “woke” or “anti-woke,” and how censorship relates to both concepts.
While it’s hard to agree on what these words describe, many are convinced that they’re either one or the other.
First, I wrote about 500 words.
Then, I downloaded ChatGPT to see where the app took me.
To my surprise, ChatGPT was useful.
The ChatGPT text was as dry and colorless as expected, and the essays were full of platitudes and contained falsehoods.
However, ChatGPT did offer some ideas that I hadn’t thought of.

When I asked ChatGPT to define “woke,” I received an essay saying (in part) that:

To be woke signifies a deep commitment to social consciousness and active engagement with systemic injustices. . . and a willingness to challenge the status quo. 

The significance of being woke lies in its potential to bring about meaningful change, fostering a more inclusive, equitable, and just society for all.

When I asked ChatGPT for a definition of “anti-woke,” I received an essay saying that:

Being anti-woke represents a rejection of what critics perceive as the dogmatic and stifling aspects of the woke ideology.

It emphasizes individual responsibility, intellectual diversity, and the importance of free thought and expression.

While acknowledging the existence of systemic injustices, the anti-woke movement encourages a more nuanced and balanced approach to social issues . . . and open dialogue.

When I asked whether censorship was an issue on the political right or left, I received an essay saying that:

Censorship is not inherently a political right or left issue, but rather a complex and multifaceted dilemma that intersects with various ideologies.

While the political right may often advocate for censorship to protect societal values, the political left may prioritize censorship as a tool to combat discrimination and promote inclusivity. 

The ChatGPT response to my request for an essay on “culture wars” was mere gobbledygook.
It ended with the sentence: “Only through a commitment to mutual respect and a willingness to confront these issues head-on can the United States move towards healing and reconciliation.”
Well, that’s completely useless!

I would disagree with the ChatGPT definition of “woke” in several ways.
“To be woke” doesn’t always mean “a commitment to changing the status quo.”
Often, it just means being a bit more open to societal change.
“Woke” people are usually more open to erasing words like “master bedroom” from their vocabularies, using personal pronouns in their email signatures, and being more aware of microaggressions.
Often, it only means that the “woke” are more willing to face uncomfortable information, and learn from history.

I would also argue with the ChatGPT definition of “anti-woke.”
While “being woke” is perceived by the anti-woke as dogmatic, it’s difficult to figure out which beliefs are actually in contention.
It’s as if the perceived attitude of self-satisfaction among the woke is more distressing than their actual ideas.
“Collective guilt” and “cancel culture” came up in the ChatGPT essay, but I’m sure that only a small percentage of “the woke” feel guilt.
Further, the woke are more likely to cancel people on their own side than the anti-woke are.
(Think of comedian Kathy Griffin and former Senator Al Franken.)
I also wonder what percentage of the anti-woke “acknowledge the existence of systemic injustices,” or desire an “open dialogue” (as suggested by ChatGPT).
Overall, being anti-woke may only mean that you are unhappy with the speed of, or existence of, societal change, or that you find “woke” people annoyingly self-righteous.

I was very happy with the ChatGPT response on censorship.
Saying that the political right wants to “protect societal values,” while the political left wants to “combat discrimination and promote inclusivity” just about sums it up.
However, everyone has their own thoughts about what our societal values should be, which words are good or bad in promoting inclusivity, and whether “words” are important in this task.

Front cover for the paperback version of Casino Royale by Ian Fleming (published under the name You Asked for It by Popular Library in 1953).

Back cover for You Asked for It.
President John F. Kennedy was a big fan of the James Bond spy-thrillers (oddly called Jimmy Bond on this back cover).
However, JFK likely read the hardcover versions.

In order to “promote inclusivity,” the publisher of the late Roald Dahl recently produced two different versions of James and the Giant Peach—changing “Cloud-men” to “Cloud-people” (among other changes) in their Puffin version—and keeping “Cloud-men” in the classic Penguin version.
The spy-thrillers of Ian Fleming, and the mysteries of Agatha Christie, underwent a similar process.

Lobby card for Gone with the Wind with house servant Mammy (Hattie McDaniel) tying the girdle (or stays) of Scarlett O’Hara (Vivien Leigh). Hattie McDaniel received an Oscar for Best Actress in a Supporting Role, for playing Mammy.

Combating racial, and other types of discrimination, through “sanitizing,” or even cancelling works, isn’t new.
I remember debates in the 1970’s about whether 1939’s Gone with the Wind should be banned.
Disney’s 1946 blend of live-action and animation, Song of the South,* isn’t considered “appropriate in today’s world,” and hasn’t been legally available in the U.S. since its last theatrical re-release in 1986.
Some Warner Brothers cartoons (like “Herr Meets Hare” and “Daffy - The Commando,” produced as propaganda between 1941 and 1945) were restored and rereleased—along with a lengthy disclaimer—in 2008.
(They appear in Volume 6 of the Looney Tunes Golden Collection.)
However, some of the more racially-insensitive 1930’s and World War II cartoons (for example, ”Tokio Jokio”) will likely never see the light of day—at least, legally.

Meantime—in order to ”protect societal values”—U.S. school boards are removing classic children’s books (like Charlotte’s Web and A Wrinkle in Time) from their school library shelves.
(I mention Charlotte’s Web and A Wrinkle in Time because these were two of my favorites.)
I looked up why one parent group proposed removing 1952’s Charlotte’s Web: the parents disliked characters dying, and thought that “talking animals” were “disrespectful to God.”
A Wrinkle in Time (1962) was criticized for “promoting witchcraft.”
I have fond memories of both books.
I remember my 4th grade school teacher, Mrs. Simmons, reading Charlotte’s Web aloud to us.
(I adored Mrs. Simmons.)
I checked out A Wrinkle in Time from our public library during the 1960’s, and ended up reading every other book I could find by Madeleine L’Engle.

Is it “woke” to buy a children’s book like 2005’s And Tango Makes Three—a story about two male penguins who help raise a chick together—in order to foster a more inclusive society?
Is it “anti-woke” to ask that And Tango Makes Three be removed from your public library, so that children won’t be influenced to accept homosexuality as normal?
In the end, I agree that parents have the right to keep certain books from their own children, but not the right to deny librarian-approved books to others. 

Uncle Remus and Brer Rabbit cover.
It’s believed that Beatrix Potter based her Peter Rabbit stories on Uncle Remus.

*Song of the South was based on the once well-known Uncle Remus stories. The folklorist/author was Joel Chandler Harris (1848-1908), a white journalist. Harris wrote down the Br’er Rabbit and Br’er Fox tales after listening to African folk tales told by former slaves—primarily, George Terrell. According to the Atlanta Journal Constitution (11/2/2006), Disney Studios purchased the film rights for Song of the South from the Harris family in 1939, for $10,000—the equivalent of about $218,246.76 today.

Friday, June 2, 2023

Truth Versus Truthiness

Only those born after 1980 are considered digital natives.
I’m classified as a digital immigrant, because I was born around the same year as the Ferranti Mark 1—the world’s first commercially available general-purpose computer.
I grew up in the days of rotary phones and three TV networks, and I didn’t own a personal computer until the early 1990’s, when I purchased my first Macintosh (running System 6). 

I went through a phase when I played games on my Mac, but I only liked clue-finding games.
I never devised an avatar, or played computer games with people around the world.
I met my husband at an office for freelancers—where people could get their resumes typed and use drop off boxes—not via Hinge or Tinder.
My only avatars are the sticker emojis I made on my iPhone, and the self-portrait I cobbled together from ready-made choices for Facebook.
My favorite apps are IMDb, Food Network, YouTube, and Goodreads—sites where I can look up information or be entertained, not communicate with others.
I’m definitely a digital immigrant.

Most of the science-fiction I read is old—very old.
I enjoy rereading authors that I first read in the 1970’s—among them, Isaac Asimov, Primo Levi and Olaf Stapledon.
Authors have been discussing whether artificial beings should be legal “persons,” or whether artificial intelligence will supersede humans, for a very long time.

In order to broaden my horizons, I decided to read more recent science-fiction, and I happened upon The Lifecycle of Software Objects by Ted Chiang.
According to Wikipedia, Chiang isn’t a digital native either.
(He was born in 1967.)
However, he’s won four Nebula Awards and four Hugo Awards; and he writes philosophical science-fiction—my favorite category.

Just as Karel Capek invented the word “robot” for his 1921 play R.U.R. (Rossum’s Universal Robots), Chiang invented the term “digient” for “digital entities.”
In The Lifecycle of Software Objects, digients are virtual pets created as pastimes for wealthy customers, who are then expected to parent them.
The novella is the story of two central characters (Ana and Derek), and their digients (Jax, and siblings Marco and Polo).
At the beginning of the story, Ana and Derek both work for a company that creates and sells digients.
After the software company goes bankrupt, Ana and Derek opt to take over the care of their favorite digital entities, so that their cherished entities may “live.” 

Chiang deals with both psychological and philosophical issues in The Lifecycle of Software Objects.
The principal subject is raising and educating the “infant” digients.
He further mentions that digients are equipped with “pain circuit breakers,” so they’ll be “immune to torture,” and thus “unappealing to sadists,” bringing up the fact that sociopaths will still be a societal problem in the near future.
I especially enjoyed the message board sequences in which obviously “bad” parents grouse about their “bad” digient children.

At one point in the story, the digient siblings, Marco and Polo, ask to be “rolled back” to an earlier point in their “lives,” because they’re unable to resolve an argument.
Is it right for “daddy” Derek to allow this; or should he force his “children” to work out their own disagreements, so that they may grow emotionally?
Later, Marco and Polo ask to become corporations, or legal persons.
Should Derek permit this?
Is it child abuse to separate a digient from its friends and fan clubs, or to alter its programming so that it can become a sex slave?

Scene of Robby the Robot disabling the weapons of “Doc” Ostrow (Warren Stevens) and Commander Adams (Leslie Nielsen) in 1956’s Forbidden Planet.

In Chiang’s novella, Ana disagrees with a company that wants her to help train a digient that “responds like a person, but isn’t owed the same obligations as a person.” [Italics mine.]
This scene reminded me of two TV series in which androids/robots are traumatized—The Orville (2017-?) and Westworld (2016-2022).
In season two of The Orville, we learn about the history of the Kaylons—a society of sentient artificial lifeforms (created as slaves) who exterminated the biologicals who created them.
In Westworld, the first season begins with human-like androids being the prey of depraved humans, but by season four it’s all-out war between androids and humans—that seems to end on earth in the same result as on planet Kaylon.

The central question is whether it’s ethical to enslave a sentient being—be it a virtual entity, robot, android, or human.
Is enslaving non-biologicals just as wrong as enslaving a fellow biological?
In a world in which human life is less important than money, is it senseless to worry about the treatment of virtual or robotic creatures?
After all, while many of us say we believe in fair play, unselfishness, and truthfulness, almost no one thinks we should carry through with these beliefs in our daily lives.

Scene of Charly Burke (Anne Winters) talking to the Kaylon Isaac (Mark Jackson) in The Orville episode “Electric Sheep.”

People can justify any bad action, as long as it makes them feel better.
We can justify not paying back a loan because the lender has more money in the bank than the lendee.
We can justify breaking laws, because other people are more corrupt—the classic pot calling the kettle black.
Few believe that the way to build a life is to be honest and truthful all the time.
Some of my favorite novels on this subject are not science-fiction.
(I recommend two Fyodor Dostoevsky novels—The Idiot, and Demons, also titled The Possessed.)

Because people can justify any bad action, the erosion of generally-believed truths is quite dangerous for society.
A 2016 Stanford study* came to the conclusion that digital natives are unable to judge the credibility of online information, or distinguish between an advertisement and a news story.
The inability to tell truth from truthiness (on the web) is also evident in digital immigrants—perhaps, more so.
In a world where we have no generally believed truths, and we only believe what we want to believe, how is an organized society possible?

We first heard the word “truthiness” on The Colbert Report—Stephen Colbert’s mock news show (which aired from 2005 through 2014), in which he portrayed a far right news personality.
In Colbert’s book America Again (2012), the same character satirically discusses voter fraud (page 165) and goes on to recommend ending voter fraud by ending voter registration (page 166).
Little did anyone think in 2012 that ten years later, in 2022, 40% of us would believe that the 2020 election was illegitimate, or that several states would actually pass laws making it harder to vote.

Ultimately, the most important conflict is not one between digital natives and digital immigrants, right versus left, the intelligentsia versus average people, or even “woke” versus “anti-woke.”
Instead, I think that the most crucial divide is between people who want to seek out truth and reality in this confusing world, and those who prefer living in their cocoons.

*“Evaluating Information: The Cornerstone of Civic Online Reasoning,” by Sam Wineburg, Sarah McGrew, Joel Breakstone, and Teresa Ortega (2016), Stanford History Education Group.
