Wednesday, March 28, 2012

A Programmer's View of Morals

Here's an age-old question.  Are humans inherently moral?  That is, are they born with a sense of compassion and respect for others?

Conventional theories are as follows:

Yes, humans are inherently moral.  Emotions are not learned.  When a baby is born, it can instantly recognize its mother.  It is encoded in the human psyche that treating someone poorly will result in being treated poorly in return.

No, on the other side of the spectrum, humans are not inherently moral.  This is why early parenting is so crucial in making a child fit for society.  This is why children with unhealthy home environments tend to develop social disorders.  Morals, and consequently, the ability to assimilate into society, are learned.

Here's the problem with both theories:

Genetic makeup is not hard-coded like that.  As far as science tells us, genes tend to be much more abstract than just "the gene that makes you respect people" or "the gene that makes you love people."  If that were the case, and things like respect and love could be hard-coded into the psyche, then morals could be, too.  But that doesn't seem to be how genes work.

And sure, there's a correlation between good parenting and social assimilation.  But this doesn't tell the whole story.  Once again, inherent morals would have to be directly integrated into human biology.

My purely theoretical take on this:

My theory depends on the human brain working like a neural network, with people born with a randomized genetic makeup and filtered out by natural selection.

Quick overview.  Neural networks are made up of neurons that receive inputs and fire outputs.  If the weighted sum of a neuron's inputs surpasses a certain threshold value, then the neuron fires its outputs into other neurons.  Otherwise, it stays silent.  Think computer chips.
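
If you want to see that in code, here's a minimal sketch of a single neuron in Python.  The weights and the threshold are numbers I made up purely for illustration:

    def neuron_fires(inputs, weights, threshold):
        # Weighted sum of the inputs; the neuron fires only if it crosses the threshold.
        weighted_sum = sum(x * w for x, w in zip(inputs, weights))
        return weighted_sum >= threshold

    # Two inputs; the first one is weighted much more heavily than the second.
    print(neuron_fires([1.0, 0.2], [0.8, 0.3], 0.5))   # True -- fires
    print(neuron_fires([0.1, 0.2], [0.8, 0.3], 0.5))   # False -- stays silent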


Now, these neurons (we'll call them "nodes") are separated into three types: Input, Hidden, and Output.  The Input layer contains nodes connected directly to the senses (sight, sound, etc.).  The Hidden layer contains all the intermediate nodes that analyze the Input signals.  Lastly, the Output layer links the analyzed reactions directly to the body's motor functions.

Each connection between nodes is given a weight for how important it is.  Say, for example, an animal trusts its sight much more than it trusts its hearing, so the sight input carries a heavier weight.  Similarly, fear plays a bigger part than fatigue in deciding whether or not to run from a perceived threat.
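
Here's a toy version of that flight decision.  All of the weights are invented to illustrate the idea, not anything biologically meaningful:

    def flight_response(sight, hearing, tiredness):
        # Input layer: how strongly each sense detects a threat (0.0 to 1.0),
        # plus how tired the animal is.
        # Hidden layer: sight is trusted much more than hearing.
        fear = 0.9 * sight + 0.2 * hearing
        fatigue = 0.5 * tiredness
        # Output layer: fear carries more weight than fatigue in the final call.
        run_signal = 1.0 * fear - 0.4 * fatigue
        return run_signal > 0.5

    print(flight_response(sight=0.8, hearing=0.1, tiredness=0.9))  # True -- it runs
    print(flight_response(sight=0.1, hearing=0.9, tiredness=0.9))  # False -- hearing alone isn't trusted enough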

To get the question of inherent morals out of the way, notice that emotions (and hence, morals) have no place in the Input or Output layers.  They live only in the Hidden layer, if they exist at all.  Wouldn't that make them hard-coded?  Yes and no.  In a very complex neural network, it's safe to assume that a great many neurons are connected to a great many other neurons.  That means the node that represents "fear" right now is probably only labeled that way because its weight is very heavy in determining the flight response.  If that weight were low, it may as well just be an intermediate node feeding another hidden node.  Although this is highly arguable, it's safe to say that while the Input and Output nodes are inherent, since they're linked to biological stimuli, the Hidden nodes depend on their weights (determined by genetic makeup) and therefore may or may not even exist.

And now the main question: At the start of mankind, were people born with a sense of morals?

Imagine 100 lions on an island.  Ten lions are downright faulty and have no affinity for food.  Another ten are also faulty and have no affinity for reproduction.  That leaves us with 80 lions.

The simplest way to "solve" this situation is to have each lion act as an individual entity and prioritize its own survival far above anything else.  That means the weights associated with "find food" and "find a mate" are very high, and will easily trigger responses such as "attack enemy lion" and "guard food" to secure those priorities.

In a perfect world, this will lead to 80 dead lions.

Now let's assume God wants to try again and drops in 100 more lions, scrambling their parameters once again.  Maybe he messes up a little, and this time, 20 die of starvation and 15 die without children.  5 succumb to alcohol poisoning, I dunno.  That leaves us with 60 lions.

Of course, after an infinite number of iterations of scrambling weights and dropping in lions, one is bound to get smart.  One lion realizes that, as a team, the whole is greater than the sum of its parts, and that the optimal strategy for survival is to form a pact.  Of course, this requires the lions to sacrifice their food/mate priorities.  They'll have to share food and protect each other's kin.  This clearly goes against each lion's individual interests, so 45 lions stay lone wolves.  15 join the pact, because their genetic makeup provides less resistance to it (maybe their food/mate weights happen to be lower than the others', by random chance).

Success shows through and the 15 lions in the pact triumph over the other 45.  But soon, civil war erupts in the Lion Kingdom and 10 are killed.  The remaining 5 reproduce, and their children's genes closely match their own.  Hence, the children's food/mate weights are also low, making them more likely to join a pact, which in turn makes them more likely to survive and pass on their genes.
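
Here's a rough sketch of that whole selection loop in Python.  The population size, the pact threshold, and the survival rules are all invented for illustration; the only point is that the average "selfishness" weight drifts down over the generations without anyone ever programming morality in:

    import random

    POP_SIZE = 100
    PACT_THRESHOLD = 0.5   # lions with "selfishness" below this join the pact
    GENERATIONS = 20

    def next_generation(population):
        # Lions with low food/mate weights (low "selfishness") join the pact.
        pact = [s for s in population if s < PACT_THRESHOLD]
        loners = [s for s in population if s >= PACT_THRESHOLD]
        # The pact out-competes the loners; most loners don't make it.
        survivors = pact + random.sample(loners, len(loners) // 4)
        # Survivors reproduce; children inherit the parent's weight, plus some noise.
        children = []
        while len(children) < POP_SIZE:
            parent = random.choice(survivors)
            children.append(min(1.0, max(0.0, parent + random.gauss(0, 0.05))))
        return children

    # Start with 100 lions whose selfishness weights are scrambled at random.
    population = [random.random() for _ in range(POP_SIZE)]
    for gen in range(GENERATIONS):
        population = next_generation(population)
        print(f"generation {gen + 1}: average selfishness = {sum(population) / POP_SIZE:.2f}")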

If we define morality as "the ability to work symbiotically with a society," then a society whose membership is filtered by genetics produces morals that line up with genetics.  But never were things like "sharing food" or "treating others fairly" embedded into the lion psyche.  They were emergent solutions to the common problem of survival; the most cordial lion wins.

So what does this mean?

Morals are what you'd call an emergent solution -- one that's never explicitly specified but works anyway.  The only reason we consider morality to be fundamental today is that all of the really immoral people are dead.

But that means humans are... robots?

You could call it that, yes.  As a side note, that also means sentient AI is fully possible.

Hope you enjoyed this, goodnight.
