• Tudsamfa@lemmy.world
    arrow-up
    1
    ·
    7 minutes ago

    You can “simulate” life inside your brain, too.


    [Alt text: this is Bob. Bob is a figment of your imagination. When you leave, Bob will leave too. “Don’t leave” says Bob]

    The Bob in your head is intelligent; it can communicate in English. Is it unethical to stop thinking about Bob? Was it unethical of me to show you this picture, creating a “Bob” in your head? Is any story unethical to tell?

  • MousePotatoDoesStuff@lemmy.world
    arrow-up
    1
    ·
    38 minutes ago

    I’d imagine there could be an ethical way to do so through a sunset protocol similar to the concept of rapture (the religious kind, not the Bioshock city) - freeze simulation, move all the beings’ minds to “heaven”, shut down physical universe simulation (lowering operation costs by at least five orders of magnitude, I’d imagine), and let them enjoy afterlife until they get tired of existing, reach nirvana, or something like that.

    That reminds me, I should really get back into AI research.

  • Labna@lemmy.world
    arrow-up
    5
    ·
    3 hours ago

    There is a TV film, I don’t know the title, related to this topic.

    The plot was:

    A group of scientists made a living simulation, and went into the simulation to apply fixes and to prevent the inhabitants from making a simulation of their own. One day, one of the scientists was killed, and left a message in the simulation for their coworkers. The message was: “take a road and follow no direction”. A guy in the simulation followed the instruction and discovered that he was in a simulation, but the message was for the scientists, who are in a simulation too.

    If someone can find the movie, that would be great.

  • JayDee@lemmy.sdf.org
    arrow-up
    3
    ·
    5 hours ago

    The ethics which we use today evolved out of practical ethics - that is to say, out of a need for a set of rules meant to be applied in order to dictate the conduct of humans amongst one another. Because of this, I think most ethical frames of reference are ill-suited for trying to answer this question soundly.

    It seems analogous to trying to apply traditional physics to a quantum reference frame. It’s outside traditional physics’s wheelhouse. A different set of tools, with a different starting paradigm, likely needs to be applied.

    That being said, your answer is really going to depend on what this new ethical paradigm is, which is arguably completely arbitrary in this specific case.

  • reksas@sopuli.xyz
    arrow-up
    1
    ·
    4 hours ago

    only way to know would be to enter the simulation and see for yourself… wait a minute…

  • Cethin@lemmy.zip
    English
    arrow-up
    6
    ·
    6 hours ago

    The fun thing about ethics is that not everyone shares the same rules. Personally, I would probably say it is. (Though is it worse than what we do to cows? Or what we do to other humans in war?) However, others may say the simulated beings aren’t real, only an illusion manufactured by the simulation, so it’s fine. There are other arguments I’m sure someone could make too. It’s up to you to decide what your ethics are, not others. There is no universal code of ethics, just as there is no universal morality.

  • Jankatarch@lemmy.world
    arrow-up
    3
    ·
    edit-2
    8 hours ago

    If you are a human, the human ethic of not killing “alive” stuff still applies to you, no?

    Thinking more about rules of ethics: if those simulated beings came up with their own morals, like “don’t try calculating all the digits of pi in large groups because it causes lag”, that would not really apply to you.

    Basically, different beings have different rules of ethics IMO, and you can’t simply end the simulation, more because you are a human than anything else.

    The answer could change in the same exact scenario if you were some kind of eldritch being instead of a human.

  • you_are_dust@lemmy.world
    arrow-up
    9
    ·
    13 hours ago

    If this is a way for our simulation creator to decide to pull the plug without guilt, I guess just go ahead and do it. I was holding out hope that this was all real, but it has been getting more clear that it’s not.

  • OwOarchist@pawb.social
    English
    arrow-up
    20
    ·
    16 hours ago

    Just turn down the simulation speed real low and run it at one tick per 20 years, then you can technically keep it going without such great expense. The people inside won’t notice the difference.
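    A toy back-of-the-envelope sketch of that trade-off (every number and name here is invented for illustration): the host’s running cost scales with the tick rate, while subjective time inside the simulation depends only on how many ticks elapse, so the inhabitants can’t tell the difference.

    ```python
    # Toy model: host energy cost scales with tick rate; subjective
    # time inside the simulation depends only on the tick count.

    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    def host_kwh_per_year(ticks_per_second, joules_per_tick):
        """Host-side energy per wall-clock year at a given tick rate."""
        return ticks_per_second * joules_per_tick * SECONDS_PER_YEAR / 3.6e6

    # Made-up baseline: 100 ticks/s at 1 J per tick.
    normal = host_kwh_per_year(ticks_per_second=100, joules_per_tick=1.0)

    # One tick per 20 wall-clock years, as the comment suggests.
    slowed = host_kwh_per_year(ticks_per_second=1 / (20 * SECONDS_PER_YEAR),
                               joules_per_tick=1.0)

    print(f"cost reduction factor: {normal / slowed:.3g}")
    ```

    With these invented numbers the yearly cost drops by a factor of 100 × 20 × (seconds per year), roughly 6 × 10¹⁰, while each inhabitant still experiences one tick after the next.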