Monday, December 20, 2021

Christmas Myths

Christians have been celebrating Christmas for almost 1,900 years. Some who know the academic arguments about the origin of Christmas may raise their eyebrows at that claim, since most academics will say that the first celebration of Christmas was in 336 CE. I'm going to address that in this article. There are actually quite a few myths that go around at Christmas time, some of which have persisted for centuries. Some come from the academic world and some from Christianity itself. There are many Christmas myths, but I've narrowed it down to six:

  • “Xmas” does not take Christ out of Christmas
  • December 25th is the first day of Christmas, not the last
  • Christmas and associated celebrations were not adopted from Saturnalia or Sol Invictus
  • Jesus most likely wasn’t born in the winter
  • Jesus' conception was not the Immaculate Conception
  • You’re not a bad Christian if you do or do not celebrate Christmas

·         “Xmas” does not take Christ out of Christmas

X is the first letter of Christ in Greek (Χριστός). Using "X" as an abbreviation for Christ began in the first century, while some of the apostles were still alive. If using the letter X in place of Christ were offensive to God, I'm sure the issue would have appeared in the epistles somewhere. The use of "X" as an abbreviation for Christ continued in the Latin church, even though in Latin "Ch" replaces the Greek Χ. When there are no printing presses, parchment is expensive, and literally everything has to be written out by hand, a lot of things get abbreviated except in the most important documents. The first recorded use of "Xmas" for "Christmas" was in 1021, by an Anglo-Saxon Christian who abbreviated Christmas by replacing the word "Christ" with an abbreviation that had been in accepted use among Christians for more than 900 years.

So, Merry Xmas.

·         December 25th is the first day of Christmas, not the last

Around this time of year, a lot of stores have a "Twelve Days of Christmas" sale that usually starts in the second or third week of December. This has led a lot of people to believe that the 12 days of Christmas begin on December 14th and end on December 25th.

In western Christianity, a church liturgical calendar came into use in the 5th century and has gone through constant evolution since. Several of the holidays that Christians celebrate, including Christmas and Easter, had already been celebrated for a couple hundred years in various regions of western Asia and Europe. As those celebrations came to be commonly accepted holidays within the church, they were added to the liturgical calendar.

The liturgical year begins with the season of Advent, which starts on the fourth Sunday before Christmas. This year, the first Sunday of Advent was November 28th, and yesterday, December 19th, was the fourth and last Sunday of Advent. The last day of Advent is December 24th, Christmas Eve. Then Christmas, the celebration of the birth of Jesus Christ, begins on December 25th and lasts 12 days, ending on January 5th. January 6th is called Epiphany, or Three Kings Day, which commemorates the visit of the magi and officially ends what many call Christmastime (Advent, Christmas, Epiphany).

·         Christmas and associated celebrations were not adopted from Saturnalia or Sol Invictus

In the 18th century, the Enlightenment encouraged a great deal of anti-religious skepticism. In the West, that meant questioning everything about Christianity. I personally have no problem with that, because whatever you believe, you should know what you believe, why you believe it, and how to defend it. The problem was that a lot of Enlightenment academics simply grasped at straws or fabricated evidence against Christianity.

Examples of fabricated evidence include Gerald Massey's Christ-myth theory, in which he suggests not only that Jesus did not exist but that all of the stories concerning him were borrowed from other sources, the virgin birth supposedly coming from the myth of Horus and from Mithraism. This is erroneous because neither has a virgin birth story: Horus was miraculously born, but his mother, Isis, was not a virgin and his conception still involved intercourse, while Mithras was born from a rock.

Skeptics of Christianity could not believe that Christianity had emerged on its own, so they looked at the religions popular in the first century and suggested that Christians borrowed their stories, their beliefs, and finally their holidays from other popular religions of the classical Roman world.

The first holiday skeptics proposed as an origin of Christmas is Saturnalia. The problem with suggesting that Saturnalia influenced Christians is that records show Romans celebrating Saturnalia going back to the second century BCE. Saturnalia began on December 17th and ended on December 23rd, bracketing the Roman observance of the winter solstice by three days on either side. Christmas starts on December 25th, five days after the solstice, and then lasts twelve days.

Early Christians of the first three to four centuries would not have observed a holiday that was associated with paganism in any way. If you read the early church fathers, you find them constantly warning against joining in pagan celebrations or adopting pagan practices for the church. For the first few centuries of Christianity, new Christians were zealously anti-pagan. There is no record of Christians celebrating anything like Saturnalia until the 4th century, and even then, the celebrations were much tamer. Even in the first century there was some controversy over any association with paganism, but more about that in the last Christmas myth. The academics of the 18th and 19th centuries had to ignore this evidence to make their theory work.

The other Roman holiday suggested as the origin of Christmas is the celebration of Sol Invictus. The Roman emperor Antoninus (Elagabalus) tried to introduce Sol Invictus in 218 CE. He failed. Sol Invictus was a celebration of a sun god from the Arabian peninsula named Elagabal, which is where Antoninus got his better-known names: Elagabalus and Heliogabalus. The problem was that until the mid-third century CE, Romans detested eastern and northern religions as too foreign and barbarian. Over the next half-century, several "barbarian" practices became cultural norms, like wearing beards and syncretizing eastern religions into the Roman pantheon. Thus, in 274, Emperor Aurelian finally succeeded in introducing the cult of Elagabal to the Romans, calling him Sol Invictus, and made December 25th the festival of Sol Invictus.

Aha! There it is! That's the day of Christmas. Since the first official church record of December 25th as the day of Christmas comes from 336 CE, it's clear that Christians borrowed this day from Sol Invictus. If 336 CE had been the first time Christmas was celebrated on December 25th, that might be true. The problem is that Christians in many places had unofficially been celebrating Christmas on December 25th going back to at least 150 CE in Antioch (modern-day Antakya, Turkey), 68 years before Elagabalus tried to introduce Sol Invictus and 124 years before Aurelian succeeded in proclaiming it.

Scholars easily associate the birth of Christ with the winter solstice because that's when the sun begins to dominate again, days start getting longer, and life begins returning. The problem is that in Christian theology, life begins not at birth but at conception. This can be clearly seen in the biblical account of Mary's visit to Elizabeth, when John the Baptist leaps in Elizabeth's womb at the presence of the unborn Christ. It is also reflected in early celebrations of the Annunciation. The first official mentions of the Feast of the Annunciation appeared in 656, but again, celebration of the virgin conception of Jesus unofficially began in the second century, at least two decades before the first celebration of Christmas in Antioch. The Annunciation has traditionally been celebrated on March 25th for about 1,900 years. If you count nine months forward from March 25th, you land on...

wait for it...

December 25th.

Of course, 18th and 19th-century skeptics didn't care about evidence that predated official church records, and modern skeptics and academics often still don't when it comes to Christianity. In fact, some of this evidence wasn't discovered until the 20th century. In spite of discoveries that show that Christmas on December 25th predates Sol Invictus by decades and has no similarities to Saturnalia, modern skeptics continue to promote this academic myth.

·         Jesus most likely wasn’t born in the winter

The evidence from the Bible suggests that Jesus was born sometime in the spring or early autumn. The reason for placing the annunciation in March and by extension the birth of Christ in December has to do with Rabbinical traditions regarding the date of creation and the Exodus from Egypt that predate Christianity, but that's a whole other subject.

The reason we know Jesus could not have been born in the winter is that the story of his birth includes "shepherds abiding in the fields" (Luke 2:8). Neither the shepherds nor the sheep would have been in the fields from late fall to late winter. Temperatures in and around Bethlehem at this time of year range from highs in the low 50s to lows in the low 30s Fahrenheit (roughly 0°-10°C). Two thousand years ago, overnight lows would easily have been at or below freezing.

·         Jesus' conception was not the Immaculate Conception

The doctrine of the Immaculate Conception and the Virgin Conception are two separate doctrines, the former being rejected by Protestant Christianity. The Immaculate Conception is a doctrine in the Catholic Church, both East and West, that says that in order for Jesus to be born without sin, He could not have been born of a sinful human.

To solve the problem of Mary passing on original sin through her blood, the Catholic Church introduced the idea of the Immaculate Conception. The idea was so controversial that, even though it had been widely believed for over 1,000 years, it wasn't made an official doctrine of the Catholic Church until 1854 CE. The Immaculate Conception teaches that Mary was preserved by God from the stain of original sin at the moment of her conception. So, according to this doctrine, even though Mary was naturally conceived by Anne and Joachim, at the moment of conception God proclaimed her sinless.

This doctrine developed from a lack of biological knowledge, specifically embryological knowledge, going back to Hippocrates in classical Greece. It was believed for centuries that children received their blood from their mothers. If Jesus had received His blood from Mary, then He could not have been sinless or divine, and thus not God in the flesh. It wasn't until the 18th century that scientists discovered that human embryos develop their own blood, and this was confirmed in 1901 when blood types were discovered. Because of this, even early Protestants like Martin Luther did not fully reject the Immaculate Conception but modified it.

Most Protestant denominations reject the Immaculate Conception of Mary today because the doctrine of the Virgin Birth solves the original sin problem from the biological perspective. While Jesus was born of a virgin, and His genetic information may even have been similar to Mary's, He did not need her genetic information, much less her blood, in order to be conceived and born.

·         You’re not a bad Christian if you do or do not celebrate Christmas

Many Christian cults, such as the Jehovah's Witnesses, and some churches and Christians in fundamentalist branches of orthodox denominations have abandoned Christmas and all of its trappings because they have bought into the skeptics' claims regarding Christmas and many other Christian holidays. It doesn't help that academics, and even Christian apologists, have missed some of the evidence I've presented above. If you use an internet search engine to research the origins of Christmas, you will have to dig deep to find any mention of the celebration in Antioch in the mid-second century or its relation to the dating of the Annunciation. This cements the idea, in the minds of some Christians, that Christmas is pagan and something to be avoided.

There are three passages from the Bible that I would refer to and then a logical argument.

First, the argument over whether Christians should celebrate any holiday or participate in non-Christian holidays is addressed by the issues covered in Romans 14, 1 Corinthians 8, and Colossians 2.

In Romans 14, Paul addresses Christians who were taking issue with each other over "doubtful disputations." Evidently, Christians in Rome were criticizing one another over matters that had nothing to do with doctrine or morals. There were Christians in the early church who chose a vegetarian diet, likely because most meat available in public markets had first been sacrificed to pagan idols, and many Christians thought eating it would associate them with pagan practices. Paul addresses this same issue in more detail in 1 Corinthians 8. In both places, Paul makes it clear that Christians are free to eat whatever they want and are even free to restrict their diet if eating certain things makes them feel guilty because of pagan associations (Rom. 14:2, 1 Cor. 8:7-10). Paul said that is something we shouldn't judge others about (Rom. 14:13), nor should we be concerned about others judging us (Col. 2:16). He gives other examples, too: judging someone's servant (Rom. 14:3), and giving precedence to or celebrating certain days (Rom. 14:5, Col. 2:16).

The issue of whether someone chooses to celebrate Christmas falls under that latter point. Basically, if you choose to celebrate Christmas and you're doing it to glorify God, then it's fine. If you don't celebrate Christmas because your faith suffers due to alleged associations with paganism, that's also fine. If you do celebrate Christmas, you have no right to condemn those who don't. If you don't celebrate Christmas, you have no right to condemn those who do. Period. While it was in vogue to avoid any appearance of association with paganism during the period of many of the early Church Fathers, Paul is pretty clear that buying meat offered to idols isn't a big deal theologically, and other pagan associations, like "esteeming" certain days above others, don't matter either.

Now for the logical argument. Let’s say for argument’s sake that it could be definitively established that all popular Christian holidays have pagan origins. First, I would appeal back to the biblical precedent above. Second, I would ask a few questions.

1.       How many people are even remotely aware of or care about Christmas' alleged pagan origins? Very few. Outside of Christianity, only the most dedicated skeptics and some non-Christians who have stumbled across this information are aware it even exists.

2.       How many Christians, regardless of their knowledge, celebrate Christmas in a pagan way? Almost none, and that includes most progressive Christians.

Most adults don’t tell their kids that all the decorations and symbols associated with Christmas are of pagan origin. By the time people hit adolescence, if they were raised in even a nominally Christian home where Christmas is celebrated, the holiday and all the associated trappings and décor are firmly associated with Christianity. It isn’t until high school, college, or some other area of adult life that people first come across the idea that Christmas and other Christian holidays have pagan roots. Even after they hear of these alleged connections to paganism, most Christians can’t wrap their heads around the idea because the symbols have been so filled with Christian meaning in their minds.

When Christians begin decorating for Christmas, usually around Advent, they're putting up decorations that they associate with Christianity: green is the liturgical color associated with eternal life, red is the liturgical color associated with sacrifice, blue and violet are the liturgical colors associated with divine royalty, white and silver are the colors associated with divine purity, and gold is the symbol of divinity. Circles symbolize eternity, stars symbolize the Star of Bethlehem, candles represent Christ as the light of life and the gospel as the light of truth, gift-giving recalls the gifts of the magi, bells represent the proclamation of Christ's birth, and on and on the list goes.

This is what Christmas means to most Christians. They see celebrating Christ’s birth in the weeks leading up to December 25th as an act of worship. Worshipping God is, biblically, the chief way Christians glorify Him. Thus, I assert that Christians who celebrate Christmas are not engaging in any pagan act.

This is why it doesn’t make sense to most Christians that anyone would criticize celebrating Christmas, and biblically why those who don’t celebrate Christmas should not criticize or look down on those who do. It goes the other way, too. If you celebrate Christmas, you have no right to criticize or look down on Christians who don’t.


Tuesday, December 7, 2021

Never Forget: the historical lessons of the events that led up to the attack on Pearl Harbor.



Eighty years ago, on the morning of December 7th, 1941, war was a far-off concept for most Americans. It had been more than two decades since the United States had been involved in a foreign war.

The Japanese had invaded Manchuria ten years earlier, in 1931, and the atrocities they committed against the Chinese were well known, often splashed across American movie screens during the newsreel segments. Americans were concerned and prompted their elected representatives to do something for the Chinese. While America was almost 90% white and anti-Asian sentiment and racism were widespread, there was still care for suffering people. At this point, nobody in America even thought about war. It was happening "over there," thousands of miles away. How could that possibly affect the United States? Outside of political, diplomatic, and charitable avenues, how was the Sino-Japanese conflict any of America's business?

In 1933, a highly popular but controversial politician by the name of Adolf Hitler came to power in Germany. Much of the western world was concerned by his rhetoric, especially his stated intention to expand Germany's military in violation of the Treaty of Versailles and to expand Germany's borders to pre-treaty lines to give German citizens "Lebensraum" (living space). Of course, most people in the United States felt that German citizens had been treated unfairly in the Treaty of Versailles, so maybe Hitler was right?

That same year, both Germany and Japan withdrew from the League of Nations. In 1934, Japan ended its disarmament policies and instituted a policy of armament and militarization without limitation.

In 1935, Italy invaded Ethiopia. Hitler assuaged the western world's fears about Germany's intentions by providing military aid and equipment to the Ethiopians. Maybe Hitler wasn't such a bad guy after all. Americans again asked, "How does this affect us?" and the answer was usually, "It doesn't."

In 1936, Hitler openly violated the Treaty of Versailles by moving troops into the demilitarized Rhineland. Neither France nor Britain was really expecting Germany to do anything like that. Both countries had reduced the size and scope of their militaries after World War I, so they weren't in a position to respond even if they had wanted to. To top it all off, many politicians had begun rethinking the Treaty of Versailles, and so neither France nor England did anything.

In 1937, Japan invaded China. Italy withdrew from the League of Nations. The United States had a heavy presence in the Pacific and the China Sea. The Japanese attacked an American gunboat, the USS Panay, prompting national outrage. But Japan apologized, insisting the attack had been unintentional. Declassified intelligence has since revealed that was a lie, but Americans were isolationist and anti-war and were willing to take Japan at its word.

In 1938, Germany invaded Austria. English, French, and American politicians and ambassadors began reaching out to Germany and Japan for guarantees that their objectives did not involve further expansion, in hopes of ending the reported atrocities abroad. There were talks of war, but Neville Chamberlain went to Germany and came back with a signed agreement of peace from Adolf Hitler. Chamberlain declared it "peace in our time." Though Chamberlain was the Prime Minister of Britain, most Americans were happy to believe in "peace in our time."

In 1939, Germany invaded Poland. Germany had already threatened Great Britain, and at this point, despite Chamberlain's piece of paper, Britain was willing to take Germany seriously and declared war.

The whole world seemed to be at war. At this point, many American citizens began to wonder if the United States would be drawn in. While most Americans believed that England, France, the Netherlands, and the other Allied countries were justified in going to war with Germany, there was no popular support for the United States entering the war. Some American servicemen and other Americans who wanted to fight left to volunteer in Great Britain.

In 1940, concerned by Japan's continuing atrocities and expansionist policies, the United States and several other countries stopped supplying Japan with the materials it needed to wage war. The United States passed the Export Control Act, which restricted US exports of oil, iron, and steel to Japan.

That set the stage. Japan reacted by forming an alliance with Germany, and Italy joined in as well, forming the Tripartite Pact. In July of 1941, the United States, the United Kingdom, and the Netherlands froze Japanese assets and cut off all exports. Japan had taken control of most of the Chinese coast, but Franklin Roosevelt, despite there having been no declaration of war, established a volunteer fighter group known as "The Flying Tigers," made up entirely of volunteer pilots, to fly against the Japanese in China.

Americans were still firmly committed isolationists where Europe was concerned, but the continued reports of atrocities and the danger of Japanese expansion, seen on newsreel screens and heard on the radio every day, made Americans accept the aggressive policies toward Japan. US-Japanese diplomatic relations broke down in the summer of 1941. Unfortunately, there was a disconnect between the Japanese Army, which made all military decisions, and the Japanese diplomatic corps, which was still trying to find a diplomatic solution to Japan's relationship with the United States.

Germany had broken its pact with the Soviet Union and invaded the country. Hitler asked Japan to wait before committing to any action against the United States. However, the Japanese Army knew it needed oil to continue and expand its control of China, and the only way to get it was to take over the Pacific oil fields controlled by the UK and the Netherlands. Japan knew it could not take the South Pacific islands as long as the United States had a naval force in the Pacific, and it knew it was no match for the American Pacific Fleet. The Japanese Army hoped a decisive, crippling attack on the US Pacific Fleet would bring the United States back to the negotiating table and result in a favorable outcome for Japan.

December 7th, 1941 was a Sunday, and the military was on holiday routine. Servicemen who weren't on duty had the day off. Most were either in bed, getting ready for church, or getting ready for other Sunday leisure activities. On Battleship Row in Pearl Harbor, Hawaii, ships' bands were assembling on the fantails of their ships and the color guards were getting ready to raise the American flag for morning colors.

The quiet of that mild and sunny Sunday morning was broken by the drone of aircraft engines. Sailors, soldiers, Marines, and civilians all across Oahu looked up to see planes flying low and in formation over the island. Most thought it was some kind of exercise. Even when the explosions started, those not on the exploding ships thought there was no way this could be anything other than an exercise. But a few saw the red "meatballs" on the undersides of the wings and knew this was an attack. Within seconds, servicemen broke from their Sunday reverie and engaged the enemy.

An hour and fifteen minutes later, 2,403 people were dead. The dead included both military and civilian casualties: men, women, and children, the youngest of whom was three months old. Without a doubt, Japan was at war with the United States.

The next day, President Franklin Delano Roosevelt spoke to a joint session of Congress, giving a speech broadcast over the radio to the entire nation. He called "December 7th, 1941" "a date which will live in infamy" and called on Congress to declare war against the Empire of Japan. The Senate voted unanimously, and only one U.S. Representative, Jeannette Rankin (R-MT), voted against.

Hitler was frustrated because he knew he couldn't fight a war on two fronts, and it would only be a matter of time before the United States joined its allies in Europe. On December 11th, 1941, Hitler and Mussolini joined Japan in declaring war on the United States. Congress wasted no time and declared war on Germany and Italy just a few hours later.

Men lined up at recruiting offices to volunteer for the fight against Japan. Some boys even lied about their age to get into action; the youngest American to serve in World War II, Calvin Graham, was 12 when he shipped to San Diego for Navy recruit training in 1942. In spite of the way they were treated, minorities stepped up to serve their country, too. Even though minorities continued, for the most part, to be mistreated throughout the war and were usually relegated to the most mundane tasks regardless of their training or experience, they distinguished themselves even in those tasks.

Women also served. They were not allowed to serve on "the front lines," and yet they died at the hands of the enemy in spite of that. Women earned the Bronze Star, the Distinguished Flying Cross, Purple Hearts, and other awards for heroism.

World War II was before my time. The youngest veteran of World War II was 42 the year that I was born. Most were in their 50s or older. In spite of the gap between the end of the war and my birth, I've learned a lot about it. Some I learned because it was taught to me in school. Much of it I've learned through choice because the men and women of that generation had a great deal of influence on my life both directly and indirectly.

December 7th was one of America's defining moments. World War II is, in my mind, one of the most important conflicts in American history, third in importance after the Revolutionary War (#1) and the US Civil War (#2). It was important for both good (the liberation of France and of the concentration camps, the beginning of major changes in attitudes towards women and minorities back home) and bad (continued segregation, relegation of minorities to menial jobs, internment of the Japanese). In college, I took a course on World War II, and in the last meeting of the class, the professor, World War II historian Dr. Timothy Orr, said, "At 25 years after a war, we are just beginning to make sense of it. At 50 years after a war, we thank our veterans for what they did. At 75 years after a war, we begin to say goodbye to our veterans, and we are in that period now. At 100 years, it becomes the responsibility of historians to interpret a war without the shadow of the veterans looming over them." There are several things to unpack in that statement, but one thing really stands out: the responsibility of historians, both amateur and professional, to record the memories and document the experiences of the veterans of any war.

In spite of the fact that there were many problems with the wars in Iraq and Afghanistan (and I believe the latter was and still is fully justified), and the Global War on Terror (GWOT) as a whole, I truly believe that September 11th, 2001, and the wars in Iraq and Afghanistan will be historically defining for my generation, and for good or bad, that the GWOT will probably rank with the three conflicts I mentioned above as one of the most important conflicts in history. Whether the US is remembered for good or bad because of the GWOT, it's important to remember so we don't repeat the mistakes that got us there and to build on the successes.

The people of the greatest generation entered the last phase Dr. Orr described (75 years after the war) last year. Only two survivors of the USS Arizona remain, and we lost two famous WWII veterans this past weekend. By 2045, barring major advancements in geriatric medicine, there will be no survivors of World War II, and interpretation of the conflict will be left to the opinions of historians. The greatest generation vowed to "Never Forget" December 7th, 1941, and ensured that their children, and at least six generations to date, would not forget its meaning. In spite of that, it has already become academically popular to question America's involvement in World War II, despite the fact that it was Japan's attack on Pearl Harbor that drew us into the war.

I posted the long narrative leading up to Japan's attack on Pearl Harbor because there is a popular argument in some parts of academia that Japan was justified in its attack due to the United States and its allies cutting off Japan's imports of oil, iron, and steel. This argument was popularized in Howard Zinn's pseudohistorical anti-American polemic "A People's History of the United States." According to Zinn and his proponents, a combination of American imperialist corporate greed and anti-Asian racism caused the United States to cut off Japan. For that argument to work, you have to ignore a decade of Japanese imperialist expansion, military atrocities committed against the Chinese for which Japan made excuses, and numerous attempts at diplomatic solutions, which Japan ignored. In fact, even my summary was reductive, because the Japanese nationalism that led to the occupation of China and expansion in the Pacific can be traced back to the Meiji Restoration beginning in 1868!

There are young adults today who have no memory of September 11th, 2001. They have no memory of it being a quiet Tuesday morning, with much of the United States still asleep or just beginning the day. They have no memory of the hundreds of lives lost in just a few minutes or the thousands that died in the hours after. They have no memory of the stories of heroism that likely prevented further death and destruction. The greatest generation had Dorie Miller, who, even though he was untrained on anti-aircraft guns, manned a gun and fired on Japanese attack aircraft. We have Todd Beamer, who, though unarmed and untrained, led a resistance against the hijackers of his plane that kept it from reaching its target. I've heard first-hand stories from people who were at Pearl Harbor, including my great-uncle Jack: stories of horror, heroism, and miracle in the aftermath. Likewise, my generation has stories of the same.

It's up to this generation, my generation, to make sure those stories and everything that led up to 9/11/01 continue to be told. Why? Because in 2046, Americans will just be beginning to make sense of the wars in Iraq and Afghanistan. If I'm still around, I'll already be in the final phase of my life. There's still work to be done for GWOT vets, but America has already done a tremendous job of thanking and caring for its GWOT vets compared with previous wars. In 2071, many vets will still be alive, maybe even me, and the US will begin the process of properly thanking them. In 2096, I'll probably be long gone, but American veterans of Iraq and Afghanistan will begin their final phase as the citizens of the United States begin to say goodbye. That's 75 years from now. There will be GWOT vets alive long after 2096, but historians have 75 years left to properly record and memorialize what we did and what really happened. Again, it's up to us to make sure that the Howard Zinns of the future are not able to get away with polemic denigrations, distortions, misrepresentations, and flat-out lies.

I don't want to steal the thunder from the World War II vets who remain with us. I've had the chance to meet several, talk to them, and really get to know them. Most importantly, I've been able to hear first-hand accounts from people who served at Pearl Harbor, at Normandy, and in support positions back home, as well as civilian memories. We have very little time left with our World War II veterans, so for those who can: cherish them, talk to them, and remember what they tell you so that you can pass it on to the next generations.

Never Forget.

Tuesday, November 30, 2021

Free Speech Means More Speech, Not Less

I had a conversation with a person a few days ago who is anti-free speech. In fact, we couldn't seem to find much common ground in our conversation and there is a lot from that conversation that I may eventually post on, but for now, I want to focus on the free speech issue.

The conversation began with her bemoaning the fact that Facebook is somehow responsible for all the misinformation that was spread over the past couple of years regarding COVID and other things. I know there is a story out there about the whistleblower saying Facebook is talking out of both sides of its mouth regarding the suppression of misinformation.

The problem is that there is very rarely a real person blocking posts, suspending users, putting warnings up in front of pictures, etc. Most of the time, there are algorithms that determine if a picture, meme, video, or other kinds of posts contain content that breaks Facebook's community rules, requires a warning, or requires a user to be suspended or warned.

Two personal examples illustrate the lack of people monitoring the information being posted on Facebook.

1. On May 26th of last year, my son and a friend of his rescued a dog. While we were trying to figure out what to do with the dog, we kept him at our house. On May 27th, I took a video of the dog and shared it on Instagram. I also selected the option to share the video to Facebook, but Facebook rejected it, saying it contained sexually explicit content, and a warning was put on my account. I disputed the warning, but nobody at Facebook ever got back to me, and there was no recourse because I'm a nobody. Meanwhile, because Instagram apparently uses different algorithms, the video is still up there, even though Instagram is owned by Facebook.

2. Ever since I started using Facebook, I have made it a personal challenge to debunk fake news. Within the last few years, an old meme started making the rounds again, repeating the easily disproven claim that the Irish were sent to the United States as chattel slaves. I shared the meme with a rebuttal of the claim. More than a week went by before Facebook's fact-checkers finally caught up with me and decided to fact-check the meme as well. So, anyone who saw my picture after that saw a message that it contained misinformation. Evidently, that drew Facebook's attention to my page again.

In September, I shared a meme questioning the safety and efficacy of the COVID vaccine, along with a response of my own. A few days later, I received a warning from Facebook. I forget the exact language and wish I had taken a screenshot, but essentially it said that because I had shared misinformation memes in the past, Facebook was going to limit the visibility of my posts. Frankly, I don't care; I don't make a living with Facebook. But I did notice an immediate drop-off in interaction with my page. Recently, a friend of mine saw a post and said it had been a while since he had seen anything from me. So, Facebook obviously throttled back the visibility of my page.

In its rush to suppress misinformation, Facebook created an algorithm that also suppresses speech meant to counter misinformation. And as I noted above, when Facebook or Twitter accuses you of spreading misinformation based on a picture rather than on the content of your comments, or of doing anything else that violates their community standards, there is often very little recourse for the average user. By "very little," I mean "none."

I told these stories to the woman who was anti-free speech, and she said something to the effect of, "Well, something has to be done to combat misinformation." I agreed with her, but I quoted Supreme Court Justice Louis Brandeis, who wrote in his concurrence in Whitney v. California (1927), "... the remedy to be applied is more speech, not enforced silence." And of course, she responded with a good old-fashioned "yeah, but...": "Yeah, but because of the internet, misinformation spreads so much faster."

I told her those same kinds of arguments were made in regards to the telegraph, telephone, radio, and television. Even in the early days of the internet when it was mostly bulletin boards and chat rooms, and e-mail was in its infancy, politicians worried about how the information superhighway would be abused to spread misinformation. Level heads prevailed, at least to some extent, because Brandeis was right: the remedy is more speech, not less.

My debate opponent persisted that the internet is a different animal. I conceded that, but I also said that it is a much harder animal to control. I made the argument that when Facebook and Twitter started cracking down on free speech because of what they considered to be hate speech and misinformation, people didn't just stop spreading misinformation because they were locked out of Facebook, they went to MeWe and Parler.

Instead of creating an open forum for the free exchange of ideas, Facebook tried to turn itself into an echo chamber of mostly progressive and leftist ideas. Those on the right went to MeWe and Parler, and those platforms became echo chambers of mostly conservative and right-wing ideas. Now, Donald Trump is going to create his own social media platform that, I guarantee, will be completely avoided by moderates and the left, except for media types and politicians, and it will become an echo chamber of far-right ideas.

Today, John Stossel posted a video that brings something to light that is even more concerning. Facebook isn't just suppressing right-wing ideas. It is suppressing any information that conflicts with the societal narrative it is trying to create. Thus, even left-wing writers, scientists, politicians, and entertainers are getting "fact-checked," censored, and blocked, because the information they are sharing, even if true, conflicts with Facebook's narrative.

This brings me back to my conversation/debate a few days ago. "When you suppress free speech..." I began, but she interrupted me to inform me that she doesn't like the word suppress. She said what Facebook and Twitter are doing is a corrective to misinformation, which as of today we know is not true. I told her she can call it whatever she wants to, but the reality is that a "corrective" doesn't make misinformation or speech you don't like disappear. Because some of the information that Facebook is censoring is coming from legitimate sources, think tanks, educational institutions, research organizations, and so forth, that information will not be suppressed at all. It simply won't be available to the masses on social media. When you suppress misinformation, it also doesn't go away. It goes into an echo chamber where the misinformation will grow, fester, and likely turn into something worse.

This is an argument I've been making for years, so it was very easy for me to make this argument when I had this conversation. I think that she realized that she wasn't going to change my mind, so she changed the subject, which she did on a number of occasions. She even tried to end the conversation by saying she needed to go eat lunch because she had low blood sugar, but then wanted to get the last word in, which meant our conversation continued and meandered through a number of topics.

What I did not get to tell her, because of the conversational course changes, is that what Louis Brandeis called "enforced silence" and what I call "suppression of free speech" is not the action of a person who loves liberty. Advocating for the suppression of speech you don't like is intellectually lazy when it is espoused by individuals and it is tyranny when it is espoused or actually done by politicians in power.

Noam Chomsky, with whom I disagree on a variety of subjects politically, economically, religiously, and many more philosophical subjects, has a great quote about free speech. He said, "Goebbels was in favor of free speech for views he liked. So was Stalin. If you’re really in favor of free speech, then you’re in favor of freedom of speech for precisely the views you despise. Otherwise, you’re not in favor of free speech."

Another point I have been making for years is that when the precedent is set that the government can suppress speech, the party in power gets to decide what is acceptable speech and what is not. Democrats, do you really want Republicans to decide what is and is not acceptable speech? Republicans, do you really want Democrats to decide what is and is not acceptable speech? I'm independent, so I don't want any political party, politician, pundit, or individual telling me what is acceptable speech.

If someone is saying something you don't like, you need to figure out a good way to counter those words, and shouting them down, or worse, reacting with violence is not the remedy or productive in any way. In fact, those are the kinds of tactics that fascists and tyrants actually use. If you say you're anti-fascist but you shout down and attack people that you disagree with, you are actually a fascist.

Study the things you're passionate about. Learn the arguments for and against. Have rational conversations, even if you think your opponent is irrational. You may find that you change your mind about some things. Of course, most people are subject to confirmation bias, so study may not change your mind. Whether or not study changes your mind, you'll win more people to your side with polite conversation than you will by lazily shouting them down. Sure, study takes work and it's going to be harder, but if you want to win debates, and you won't win them all, you're going to need to work at it. That's why I said advocating for or simply being in favor of suppressing free speech is intellectually lazy.

There are also those who say that speech should be suppressed because words themselves can be violence. If words are violence, well, to quote Bill Maher, who strangely seems to be one of the few people who is still making sense these days, "Sticks and stones may break my bones, but words? Well, if you don't know how the second part goes, you need to go back to kindergarten."

Thursday, July 2, 2020

Independence Day—July 2nd or July 4th?

John Trumbull. 1818. “Declaration of Independence.” U.S. Capitol. Washington, D.C.

From the end of the French and Indian War until 1770, the conflict between Great Britain and the American Colonies intensified into full-blown violence, culminating with the Boston Massacre in March 1770. For the next five years, small skirmishes took place throughout the Colonies. Then, in April of 1775, British troops were ordered to confiscate weapons from town arsenals and other military equipment caches and to arrest leaders of organizations who were calling for rebellion against Great Britain. On April 19th, 1775, in the Battles of Lexington and Concord, the Colonial militia drove the British military back to Boston.

The Colonies were not fully ready or willing to go to war against what was at the time the most powerful military in the world. On July 5th, 1775, the Second Continental Congress adopted the Olive Branch Petition in hopes that the conflict could be brought to an end. Many members of Congress were upset that any attempt to appease Great Britain was being made, but Congress realized that the Olive Branch Petition could fail and on July 6th adopted the Declaration of the Causes and Necessity of Taking Up Arms. The next month, Great Britain issued the Proclamation of Rebellion after King George III learned of the Battle of Bunker Hill, which had taken place on June 17th, 1775.

In spite of that, the Colonies were still willing to negotiate peace and continued to make attempts until the summer of 1776. In June, "the Committee of Five" was selected to draft a declaration. The committee consisted of John Adams (Mass.), Benjamin Franklin (Pa.), Thomas Jefferson (Va.), Robert R. Livingston (N.Y.), and Roger Sherman (Conn.).

On July 2nd, 1776, Richard Henry Lee of Virginia introduced the following resolution to the Second Continental Congress: "That these United Colonies are, and of right ought to be, free and independent States, that they are absolved from all allegiance to the British Crown, and that all political connection between them and the State of Great Britain is, and ought to be, totally dissolved." Thus, July 2nd, 1776 is the date that the Second Continental Congress declared American independence from Great Britain.

John Adams, who had been frustrated with the attempts at peace negotiations over the previous year, was elated and wrote to his wife on July 3rd, 1776, telling her, "The Second Day of July 1776, will be the most memorable Epocha, in the History of America. I am apt to believe that it will be celebrated, by succeeding Generations, as the great anniversary Festival. It ought to be commemorated, as the Day of Deliverance by solemn Acts of Devotion to God Almighty. It ought to be solemnized with Pomp and Parade, with Shews, Games, Sports, Guns, Bells, Bonfires and Illuminations from one End of this Continent to the other from this Time forward forever more."
John Adams and Thomas Jefferson

If America officially declared independence from Great Britain on July 2nd, why do we celebrate on July 4th?

After the measure passed, a committee was tasked with creating a document that would explain the resolution and its reasoning to the general public. The Committee of Five, already at work on the declaration, went back to work in earnest, and by July 4th they had all the elements they believed would be required. Thomas Jefferson, recognized as the most eloquent writer among them, was assigned the final draft, which was taken to a printer that day. Thus, the header of the Declaration of Independence reads, "In Congress, July 4, 1776."

About 200 broadsides were printed and sent out throughout the colonies; 26 of them still exist. The original handwritten version wasn't signed until August 2nd, 1776. Since members of Congress were tied up with the war, they didn't think about the Declaration again until July 3rd, 1777, and July 4th seemed to make sense as the day to celebrate independence. John Adams and many of the Federalists still believed that July 2nd should be the date, but Thomas Jefferson and the Democratic-Republicans believed it should be July 4th, and the argument continued until the Federalists faded away as a party after the War of 1812. After that, July 4th was cemented as Independence Day.

In his final public letter, written on June 24th, 1826, Thomas Jefferson commended future generations to remember and celebrate Independence Day, not just as a day to recall America's independence, but as the day a government first formally recognized the rights of all people. He wrote, "For ourselves, let the annual return of this day forever refresh our recollections of these rights, and an undiminished devotion to them."

Both John Adams and Thomas Jefferson died on July 4th, 1826, fifty years to the day after the Declaration was adopted.

Friday, June 12, 2020

History is revised all the time and that's okay


The term "historical revisionism" is too often misused. Revising history is why historians exist. When most people talk about "historical revisionism" what they really mean is denialism, negationism, or distortion. Denialism and negationism are essentially the same things. Denialists say historical facts did not actually happen. For instance, there is a growing number of people who adhere to the negationist belief that the Holocaust did not take place.

A frequent kissing cousin of denialism/negationism is historical distortionism. Distortionists take well-known events, people, and places and change their history to fit a philosophical agenda. A mild example, using the Holocaust once again, would be those who say, "Well, yeah, the Nazis did kill six million Jews, but they killed a lot of other people too," in an attempt to minimize the effect the Holocaust had on the Jews as a people.

A more common type of distortionism happens when advocates of a specific philosophical agenda pick and choose which facts to believe, teach, or emphasize when teaching history. In history and especially Latin American history, we refer to this as either the White Legend or the Black Legend.

The White Legend is a version of history that focuses on a specific group of people as heroes of history. They were all great, they were all godly, they were all brilliant, and their lives should be emulated. In the study of Western Civilization, that would be like focusing on the fact that Greeks developed democracy while ignoring the fact that most of the population of Greece were slaves and were completely disfranchised. Scholarly adherents of the White Legend might concede the existence of slavery while qualifying that concession with "yeah but."

The Black Legend, on the other hand, vilifies the heroes of the White Legend, and even when discussing their achievements, it shines the brightest light on their flaws and misdeeds. The Black Legend also casts oppressed peoples as the heroes of the story, often ignoring in the oppressed the very flaws it criticizes in the oppressors. An example would be the growing narrative that the colonization of the New World was the worst thing ever to happen in the history of mankind, and that the Old World, especially Europe, should apologize for ever setting foot in the New World. Black Legend historians also frequently commit the fallacy of presentism: they judge the past and the people of the past by today's standards rather than by the standards of their own time, failing to recognize that those people were products of their environment and upbringing in the society and culture of the past.

The reality, where good history is involved, lies between the two. We can look at the achievements of people of the past and say, "Wow! That's really something." We can also look at their flaws and the things we would consider evil in our time and say, "Wow. That's really bad." The trick is to be both unbiased and nuanced, something that is woefully missing from the public narrative and from the teaching of history in too many schools. Frankly, I think I was lucky in college to have a majority of professors who at least attempted to remain unbiased and present a nuanced version of history. That said, one of my favorite professors was openly biased about some things in history. He and I disagreed about how Thomas Jefferson should be approached on day one of the first class I took with him. I took his classes twice, loved them both times, and even got As without sacrificing my approach to history.

The worst kind of distortionism, albeit also the rarest, is when "historians" just make up history.

A few years ago, Virginia came out with a new history textbook for elementary schools. In the 4th-grade text, in the section on the Civil War, there was a claim that free blacks in the South, as well as slaves, served as soldiers in the Confederacy. No such thing happened. When the author, Joy Masoff, was asked why she included the section, she said she wanted to "add a little something extra."

A few months ago, I read an article on the website We Are the Mighty, titled "5 cringeworthy military slang terms (that we should actually retire)," that suggested the term "in country" was a shortened form of "Indian Country" and was used in the military to mean enemy territory. Having been in the military, the only use of "in country" I have ever heard referred to actually being in a country while deployed. The author of the article, Blake Stilwell, was suggesting that the term no longer be used because of its racist heritage. He provided a link to order Roxanne Dunbar-Ortiz's book, An Indigenous Peoples' History of the United States, where he found the claim.

I had never heard this claim before, so first, I searched to see if it had been made by anyone else. It had not. Second, I looked up the etymology of the phrase "in country." According to the Oxford English Dictionary, the phrase was first used in 1560 in England as a shortening of "interior country," meaning the interior regions of any country. Its first use in the 20th century was in 1953, in Dylan Thomas's A Prospect of the Sea, where it meant being in a specific country.

So, then, I got a hold of Dunbar-Ortiz's book and looked up the sections where she refers to "in country." One section claims that the phrase was a shortening of "Indian Country" and originated in the Vietnam War, which, according to the Oxford English Dictionary, is incorrect. I then looked up her sources in the footnotes and bibliography. She had one footnote for both instances of the claim, but that footnote merely describes what qualifies as "Indian Country" according to political scientist Sharon O'Brien in her 1993 book, American Indian Tribal Governments. The term "in country" does not appear anywhere in O'Brien's work either. Without a source to back up her claim, and with scholarly sources that actually counter it, one can only assume that Dunbar-Ortiz made it up.

Often, fabrications like Masoff's claim about black Confederate soldiers are easy to spot and debunk. Fabrications like Dunbar-Ortiz's are more insidious, because she has a Ph.D. in history from a respected university and most people will simply take her claim at face value and repeat it, even in academic settings. I took a humanities course in college in which the professor, Ph.D. and all, repeated the "rule of thumb" urban legend as fact. This particular urban legend states that English common law stipulated that a man could beat his wife as long as the rod he used was no wider than his thumb. It is easily disproved, but because it is often repeated as factual in feminist academic circles, it continues to be treated as fact in many other corners of academia.

Teaching history is difficult enough as it is because there is a lot of it. History teachers and curriculum developers have to balance teaching good history along with trying to determine the most important topics to cover in the time allotted. On top of that, there are political forces at work, pulling from all directions that demand their important topics are covered as well. When the curriculum is finalized and shows up in school districts, teachers have to figure out how to teach the curriculum in a way that also meets the demands placed on them by standardized testing objectives created at the state and federal levels.

Primary school teachers are rarely history majors and get an awful lot wrong. I first learned about Christopher Columbus in second grade and was taught that Columbus set out on his 1492 voyage to prove that the world is round when most of the people of his day still believed it was flat. Neither of those claims is true. Most people in 15th-century Europe believed the world was spherical and had for centuries. Aristotle is often credited with being the first to claim the earth was a sphere, but for centuries before Aristotle, peoples such as the Egyptians, Greeks, Hebrews, Mesopotamians, and Phoenicians had an understanding of a spherical earth, especially mariners. I was also taught that Columbus discovered America, but the fact that there were humans in the Americas for thousands of years makes that claim clearly incorrect. Even the claim that he was the first European to set foot in the New World is wrong, because the Vikings accomplished that hundreds of years before Columbus.

In secondary schools (middle and high schools), history teachers are just as often majors in other social sciences as they are history majors. One history teacher I had the opportunity to observe was a political science major with a J.D. She told me that she knew a lot about history and the facts behind most of the laws she teaches in her government classes, but she was at a loss as to how to teach straight history. She still sends me notes from time to time asking for advice on certain subjects.

The point I am trying to make is that teachers in public schools, regardless of what their biases may be (and yes, they have biases, and yes, they sneak them into their curriculum), have an incredibly demanding job just trying to meet the standards. People often tell me, "So much has been erased from history books." I have to respond, "False." The information is still there, but there is not enough time to teach everything that everyone wants to teach. In Western Civilization, when teaching about the development of democracy in Greece, I really wish more time were spent on Cleisthenes, the man who essentially invented democracy: how he came up with the idea, why he came up with it, and a little more of his background to give real context for his invention of democracy. Yet I never heard of Cleisthenes before I graduated from high school. In my Western Civilization course in college, Cleisthenes was mentioned one time, in one sentence, in the one chapter on Greece, which covered everything from Greece's pre-history through the Hellenistic period. My professor, who spent three lectures on Greece, never once mentioned him. I watched a documentary on Ancient Greece last week, a documentary I thought was fabulous by the way, yet Cleisthenes was only mentioned in passing at the end of the second episode. I think Cleisthenes is the biggest hero of Greece's Golden Age, but to most historians, he's a footnote.

Some people cry, "I can't stand the way history has been changed." History is going to change. It has to. If we have all the documentation on a person, place, or event, history may not change much, if at all. However, because of human nature, historical evidence gets lost, destroyed, misplaced, or hidden, and it sometimes takes years for that evidence to come to light. When it does, it can completely change the way historians understand and interpret history, and history has to change with it. I'm writing this essay because so many people recently have been making this very complaint.

Just the other day, I saw a meme on Facebook claiming that the Pyramids of Giza were built by slaves. I made a simple response challenging that idea, since archaeologists and historians no longer believe it. Between the absence of slave burials at or near the site, the care with which the people who worked on the pyramids were treated when they died, the records the workers left behind, and numerous other archaeological findings, it became clear to archaeologists, historians, and Egyptologists that slaves were not involved. My response was intended as a light-hearted comment that I hoped would prompt people to look up the information; I even included a line that the pyramids were also not built by aliens or with spaceships and ended it with a winking emoji. My claim was met with abuse. I'm not easily offended, but I figured the replies were just based on ignorance, so I responded by posting an article by Zahi Hawass, the world's leading Egyptologist. It was not a scholarly, peer-reviewed work, but I made the mistake of believing that Hawass was well enough known to be respected. I was wrong. One commenter accused Hawass of being a racist elitist and dismissed the article I posted as merely his opinion. I gave up, because willful ignorance is hard to overcome. Many people came to the defense of me and Hawass' article, replying with comments about archaeological findings and mentioning articles and books to read, but at some point, you're just feeding trolls. It's best to just let them starve.

“They’re not teaching history the way I was taught history.” The way history is taught is going to change over time. It has to. Too many people contributed to history that have been left out of the story. American history, for instance, is usually taught as a line of progress, led by white men. Yes, white men enslaved Africans, but who freed the slaves? Abraham Lincoln who was white. Women fought for the right to vote, but who gave them the right to vote? The mostly white, male Congress when they passed the 19th amendment. African-Americans fought for their very Civil Rights, but who gave it to them? The white President Lyndon Baines Johnson.

I must concede that my statement in the above paragraph was slightly hyperbolic. Obviously, some non-white, non-male people have been included in the teaching of history, but has history really been all that inclusive? Let me use the Revolutionary era and the Revolutionary War as one example. Most Americans are familiar with the Boston Massacre, which took place on March 5th, 1770. Many Americans might even know that the first man to die in the event was Crispus Attucks. However, most people don't know that he was black, and fewer still know that he was also part Native American. How about the famous Midnight Ride of Paul Revere? Most people don't know that there was another midnight rider that night: while Revere rode west, another man, Wentworth Cheswell, rode north to warn other communities. Wentworth Cheswell's mother was white; his father was black. There were other midnight riders as well. One of them was a woman, Sybil Ludington, who made her midnight ride on April 26th, 1777. She rode 40 miles, twice the distance of Paul Revere, to warn militiamen in Putnam County, New York, that the British were going to attack a Continental Army supply depot in Danbury, Connecticut. Deborah Sampson was a woman who dressed as a man to fight in the Revolutionary War and received a pension after the war, even after her secret was discovered and in spite of her having broken the law concerning women in military service.

There are even more stories of brave men and women of all ethnicities who fought and died for what would become the United States of America. We can't tell all their stories in a single curriculum, but we can tell more than we have, which is why history isn't taught the way I was taught or the way you were taught, nor should it be. When we bump the untold stories up the priority list, someone whose story was previously told gets bumped off the curriculum. Their stories still exist, but people who are genuinely interested in and concerned about history will have to do some extra reading.

While most historians attempt to remain unbiased, history is still a subject of some interpretation. I don't mean that historians interpret whether something happened or not, because the evidence provides the answers to the questions of who, what, and where. But historians often have to answer the questions of why and how, and there it is impossible to prevent all bias from slipping in. Even so, historians can still come to a consensus about some interpretations, simply because the subjects of history often leave behind answers to all of those questions.

History is going to change and that's okay. It changes every day. As long as people exist in places and do things, more and more will be added to history, which makes it change. History also changes as new evidence is discovered, which also happens every day. Those discoveries affect not only our knowledge of history but our interpretations as well. Attitudes, ideas, and cultures change, and as long as they continue to change, interpretations of history, where there is room for interpretation, will change along with them. If history didn't change, there would be no purpose for historians. It's also important for non-historians to know history, because, as George Santayana correctly observed, "Those who cannot remember the past are condemned to repeat it." If remembering the past were not important, that would be the final nail in the coffin for historians. Knowing history also means knowing that change happens not only in history itself but in how history is done. That's what revision is, and that's why revision is actually a good thing.

Friday, April 10, 2020

Azulão

Composer: Jayme Ovalle (1894-1955)
Lyrics: Manuel Bandeira (1886-1968)

In early March, cellist Yo-Yo Ma encouraged musicians around the world to create music during the COVID-19 global pandemic and attach the hashtag #SongsOfComfort. I first fell in love with this song in 1990 and it has always brought me comfort. So, using my very limited recording capabilities, I recorded this song in my home on March 19. While I uploaded the video on Facebook and Instagram, I completely forgot to upload it here. I hope people will enjoy it and take comfort during their time at home.

#SongsOfComfort #StayHome #WithMe #QuarantineAndChill #FlattenTheCurve

Sunday, March 15, 2020

The Coronavirus, Giving Feet to Your Prayers, and Not Foolishly Tempting God


“Then saith he unto his disciples, The harvest truly is plenteous, but the labourers are few; Pray ye therefore the Lord of the harvest, that he will send forth labourers into his harvest.” (Matthew 9:37-38)

There is an old saying: “Put feet to your faith,” or alternatively, “Put feet to your prayers.” In other words, prayer is necessary and important. As Hank Hanegraaff has famously said, “Prayer is firing the winning shot before the battle ever takes place.” However, it’s also important to understand that God usually uses people—His people—to accomplish His purposes. In the verse quoted above, Christ tells his disciples that laborers are needed to accomplish God’s purposes. Then, in the very next chapter, Christ sends his disciples out to accomplish His purposes (Matthew 10:5). Christ, in those passages, sets the example that we need to pray and then we need to go do something, when we are able, to accomplish what we are praying for.

There was a man who lived alone in an area prone to flooding. One year, a storm came through that threatened to breach dikes and the potential for flooding was extremely high. Officials warned people living in the area where the man lived that the whole area could be submerged and ordered an evacuation. The man laughed at the television broadcast and said, “The Lord will provide.”

As the storm began to roll in, local government officials began going door-to-door to warn people to evacuate. A police officer in a truck came to the man’s house, knocked on the door, and warned him that the river levels were rising, could breach the dikes any minute, and told him he needed to evacuate. The officer even offered to assist the gentleman in gathering any necessities. The man simply waved the officer away saying, “The Lord will provide.”

As predicted, the river breached the dikes and soon the whole area was flooded. At first, the man simply went up to the second floor of his house. A man in a boat drove by and noticed the man in his home. The boatman told the man to get in the boat and the man waved him away with cries of, “The Lord will provide.”

Hours later, the man was forced to the roof of his home as the river waters overwhelmed his second floor. A helicopter from the Coast Guard flew over and noticed the man. They lowered a ladder and told him to climb in, but the man waved them away with the same cries of, “The Lord will provide.”

Finally, the water overwhelmed the house, and the man was swept away and drowned. As he stood before God, he said, “I thought you would provide.” In response, God said, “I provided a news report, a truck, a boat, and a helicopter.”

When God told Moses that He would deliver Israel, He also told Moses that he would be the instrument of delivery. James tells us to be “doers of the word and not hearers only” (James 1:22). James also shows us what that means:

“If a brother or sister be naked, and destitute of daily food, And one of you say unto them, Depart in peace, be ye warmed and filled; notwithstanding ye give them not those things which are needful to the body; what doth it profit? Even so faith, if it hath not works, is dead, being alone” (James 2:15-17).

A person can’t depend on faith that food and clothing will just magically appear. Something needs to be an instrument of provision. Usually, the instrument of provision is a job, but when people fall on hard times, they turn to charity and God’s people should then become that instrument of provision.

We may pray for the poor, but we also need to provide for the poor. We may pray for change in the world, but perhaps we should take note of Mahatma Gandhi’s advice to “Be the change you want to see in the world.”

Often, when tragedy strikes, you will hear Christians say they are sending their thoughts and prayers. They are then immediately ridiculed by unbelievers who say, “You can pray all day long, but what good will it do? Maybe you should go and do something!” Good point, and James makes this very same point:

“Yea, a man may say, Thou hast faith, and I have works: shew me thy faith without thy works, and I will shew thee my faith by my works. Thou believest that there is one God; thou doest well: the devils also believe, and tremble. But wilt thou know, O vain man, that faith without works is dead?” (James 2:18-20)

This is the same conversation—an unbeliever is ridiculing a believer who sits around and does nothing while the unbelievers are out working and making changes in the world. James says faith is not enough. Even worse, faith without works is dead. You can believe all the right things and even be sure of your own salvation, but what good does that do if you keep it to yourself? The word “dead” is correctly translated, but it is used here as a euphemism for being worthless, good for nothing.

As the song “This Little Light of Mine” says: 


Hide it under a bushel?
No!


It’s not enough to say, “I have faith.” It’s not enough to say, “I’m thinking of you and praying for you.” One must show their faith through action. As James wrote, “I will show thee my faith by my works” (James 2:18).

Now, if a family’s house burns down, you may not be able to provide a new house or the long-term housing for a displaced family by yourself, but that’s where the community of faith comes in. The whole church can come together and help those in need. The church I go to has a winter clothing drive in the fall and provides clothing to those in need throughout the year. The church also has a food pantry to help people in need of food. My church is not unique in these ministries. When people are in need, the church can provide all kinds of help to its members and I can speak from personal experience because I have been both the recipient and benefactor of charity through my local church.

To those who mock believers for sending thoughts and prayers, keep in mind that when tragedy strikes, it is usually believers, as part of the community of faith, who are the first on the scene, providing aid and comfort and helping in whatever way they can. When hurricanes hit, churches are usually already assisting in the recovery before FEMA and other government agencies can even get mobilized.

Along with the strange idea that God will magically provide through prayer, as if God has a Star Trek transporter in heaven that He uses to beam down our needs, is the idea that we can put ourselves in unnecessary danger and, if we pray to God, He will deliver us from all harm. Granted, if it’s God’s will, He can do some amazing things and even deliver us from all harm, but that doesn’t mean we should tempt God—that is, we should not put Him to the test.

In the early 16th century, the plague hit Saxony (part of modern-day Germany). Wittenberg, the home of one of the most famous men in Church history, Martin Luther, was especially hard hit. COVID-19, the novel coronavirus, is unlikely to affect humanity on the same scale as the plague, but Martin Luther’s response seems relevant both to the discussion of putting feet to our prayers and to not foolishly tempting God by putting ourselves at unnecessary risk. One pastor, Dr. John Hess, wrote to Luther asking for advice on how to deal with the plague. Martin Luther wrote this in response:

“I shall ask God mercifully to protect us. Then I shall fumigate, help purify the air, administer medicine and take it. I shall avoid places and persons where my presence is not needed in order not to become contaminated and thus perchance inflict and pollute others and so cause their death as a result of my negligence. If God should wish to take me, he will surely find me and I have done what he has expected of me and so I am not responsible for either my own death or the death of others. If my neighbor needs me, however, I shall not avoid place or person but will go freely as stated above. See, this is such a God-fearing faith because it is neither brash nor foolhardy and does not tempt God.”

Christ told his followers to pray with the faith that they would receive what they prayed for: “Therefore I say unto you, What things soever ye desire, when ye pray, believe that ye receive them, and ye shall have them” (Mark 11:24). However, James also tells us that people often do not have their prayers answered in the way they expect because they were praying selfishly: “Ye ask, and receive not, because ye ask amiss, that ye may consume it upon your lusts” (James 4:3).

Praying selfishly is just another way of putting God to the test. Praying something like, “God, I’ll believe you exist if you give me a million dollars,” or “God, if you make my debt go away, I’ll become a missionary,” are hypothetical examples of how people tempt God—put God to the test—through prayer.

There are four lessons to take away from all this:

1. Pray selflessly. It’s okay to ask for your needs and your health, but try to remember that it’s not just about you.
2. Pray with faith that you will receive what you ask for. Don’t just go through the motions of prayer thinking it’s meaningless. Prayer has power. But…
3. Pray with action. God usually uses people to achieve His purposes.
4. Do not pray in a way that puts God to the test. Don’t foolishly put yourself in harm’s way thinking that God is going to protect you every time, and don’t pray selfishly, especially when you know you’re asking for something that God does not want you to have.

Think about it.