Willful Moral Ignorance


Biblical Text: Mark 9:30-37
Full Sermon Text

This text is the second passion prediction and a saying of Jesus unique to Mark. The saying is mirrored in a couple of other places in very similar-sounding ways, but the setting and the vocabulary of this text are unique. Unique enough to support this sermon. The central theme, or problem, is what you call it, or what happens, when you know the moral path but are afraid to take it. And this is tied to a very specific living example.

Sermon Grist File

Instead of ghoulishly watching the pictures from Boston, I’m cleaning out my “Sermon Grist” file.

Jonathan Haidt, who recently moved to NYU’s Stern School of Business, on the ethics of MBA students vs. psychology students. Sometimes I wonder to what degree the people we spend most of our time with shape our beliefs. By that I mean, if you spend your time surrounded by “feeling” people who are generally well adjusted, to what extent does that encourage you to generalize that to all of humanity? How has my history (vs. what I would say is biblically gleaned) shaped my rather bleak anthropology?

Two great articles by Eve Tushnet. This is a look at a Pre-Raphaelite exhibit; if nothing else, the Rossetti Annunciation is worth viewing. This is a reflection on a beauty of another sort: the fact that we all trust or are obedient to something, and the danger in picking (or refusing to pick).

Knot Yet looks at how modernity, by modifying and refusing the strictures of marriage, has often ended up in places much more sordid.

Whether or not they realize it, today’s twentysomethings are entering wayside stations that, as the “Knot Yet” report makes clear, lessen their chances of ever entering the promised land of stable marriage. The marriage passport fee seems too expensive, and they can’t give up other choices. So instead they opt for locations that, according to Wharton and implied by Austen, are “smaller and dingier and more promiscuous.” It seems by not choosing to give up some things, it’s possible to give up everything.

This is Hunter Baker making an insightful comment about sanctification, based on a Tim Keller comment in response to his question.

When asked about obstacles to revival, Keller pointed to fornication. In other words, it is difficult to spiritually awaken people who have hard-wired a particular sin into their lives and have essentially committed to it. If repentance means a large structural change, such as ending a co-habiting, sexual relationship, then it becomes that much less likely.

What I would say runs throughout all those things is the idea of submission. From the very general, what ethics do MBAs submit to (basically, is it legal?), to the specific, a submission to the will of God that offers a witness to the world.

Ethical Subroutines

The picture nearby is the old philosophy-of-ethics standby, the trolley problem. Do you pull the switch? What right do you have to pull, or not pull, the switch?

This article from the New Yorker takes a look at a modern twist on that story. I think I’ve linked to the Google driverless car before. It’s a neat project, and further along than we might think. Driverless cars will most likely become standard within my lifetime. What ethical subroutine do you program in? A cat jumps out in front of the car. Does the robot driver swerve, at risk to the passengers, or just go bump? What do you do? Make it harder. A child runs out into the street. There is no way to stop the car. Does the robot driver crash the car, injuring the passenger? What if there are three passengers? Remember that the robot driver’s decision-making is much faster than a human’s. The human might not have time to react. The robot does. What kind of ethical subroutine do you program? Who gets to choose? GM? Toyota? IPAB?

And those might be easy compared to the real endgame. All those drones that the US is using to kill foreigners and create all kinds of collateral damage are just the first wave. The drones have humans controlling them at all times. We can argue about the ethics of drones, but if we don’t like how they are used, we can vote in new administrations. What happens when the US replaces the Big Red One with a droid army? A robot soldier in an insurgent environment like, say, Afghanistan. What are the ethics to be programmed for a situation where a jihadist and a child could both be coming around the corner?

How do you teach ethics to a machine? How do we teach them to our kids? How were they taught in the past? Right now the default switch on all of that is utilitarianism. And machines can be strict utilitarians, unlike most humans, who are only so in the abstract. Religion doesn’t boil down to ethics, but ethics had, until the Enlightenment project, been seen as a subset of religion. We are entering a world where religionless ethics are being encoded. Are you hopeful about that?