• finitebanjo@lemmy.world
    edited 12 hours ago

    My morality is built on the furtherment of mankind technologically, with weights assigned to satisfaction and an aversion to harm. Here are some examples of how to apply this logically and without any emotion, empathy included:

    • It’s kind of like not really believing in human rights but supporting them anyway, because the people who oppose human rights are destructive and inefficient.
    • Humans are animals. We must act according to our basic wants and needs in a way that maximizes our satisfaction, or else we are acting against our own nature. However, we must do this in a way that causes no harm, or we have failed as a collective species.
    • Diversity is a must, because exclusivity is a system which consistently fails every time it has ever been tested.
    • The death penalty is taboo not because life is sacred but because one person deciding the importance of another’s life is intellectually bankrupt and only leads to a spiral of violence.
    • All life is meaningless, full stop, which gives us the right to assign whatever meaning we like. Having more technology, with equal control over it by each individual person, gives us the collective power to make more choices.

    I will not be taking any questions, meatbags.

    • themeatbridge@lemmy.world
      55 minutes ago

      So, empathy like I said.

      Why do you value the technological advancement of the human race? How do you determine what is advancement, and what is regression?

      Why place emphasis on satisfaction and aversion to harm? How do you determine the relative levels of satisfaction and harm except through empathy?