"I hope it is clear to you that you will eventually press the button."
nope, disagree. the person i am now maybe would press the button (but maybe not!), but given eternity? i strive to be better than that!
there are a bunch of cases where i feel like society says to people "of course you will fail," and it's a self-fulfilling prophecy. and then i read fictional descriptions of societies that are the opposite of that: societies that think Buttons (not in your current meaning here) are serious business, that want to avoid them, that consider dealing with them an important social role, and that TEACH PEOPLE NOT TO DO THAT.
I strive to be the sort of person who does not damn themself to Hell, and i expect that, given the information, i will manage to do that.
and i consider sentences like the first one to be active sabotage.
I think that you're missing quite how long infinity is.
In general, I agree that it is important to become the sort of person who doesn't press buttons. But I do not believe that a human being can drive the probability of pressing that button all the way down to zero, and if you drive it down to a mere 0.00000000001 chance per year (a level of reliability significantly greater than that of essentially every human-created system ever), then you still eventually press the button with probability 1.
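The arithmetic behind this claim can be sketched quickly (my own illustration, not from the comment): with any constant per-year chance p, the probability of having pressed at least once within n years is 1 - (1 - p)^n, which tends to 1 as n grows, however tiny p is.

```python
import math

def pressed_by(p: float, years: float) -> float:
    """Probability of pressing at least once in `years` years,
    given a constant per-year chance `p` of pressing.
    log1p/expm1 keep the result accurate for very small p."""
    return -math.expm1(years * math.log1p(-p))

p = 1e-11  # the "0.00000000001 chance per year" from the comment
print(pressed_by(p, 1e12))  # after a trillion years: ~0.99995
print(pressed_by(p, 1e15))  # after a quadrillion years: 1.0 to float precision
```

Even at that extraordinary level of reliability, a trillion years already makes pressing all but certain, and eternity is much longer than a trillion years.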
see, i find it really hard to think about Hell without thinking about the glowfic version of Pathfinder, where people become outsiders. would i, as i currently am, eventually press the button? probably yes, but it's a nonsense hypothetical: i will CHANGE in such a scenario. over any long enough period, i will change, because i am human and humans change. and becoming the sort of being that is capable of True Commitment is one of the changes i can make.
also, i really don't think it makes sense to say "human" and "eternity" in the same sentence. we don't know what sort of being a 10,000-year-old human will be. we haven't tried it! i can accept, though disagree with, believing that 90% or maybe even 99% of people will press the button (although that looks unjustifiably pessimistic to me), but EVERY PERSON? that is an astonishing amount of certainty for a situation so far out of distribution.
also, the chance-per-year is not an independent variable. i expect that if someone managed for 10,000 years, their chance will by then be very low. maybe close to zero. and by that point they have had so much time! they could have invented so many things, optimized so hard on "don't press the button". they may by that point have an AI they created with the sole goal of preventing people from pressing buttons.
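This objection has mathematical teeth (a hedged sketch of my own, with an assumed improvement schedule, not anything from the comment): if the per-year chance shrinks fast enough as you get better at not pressing, say p_n = c / n^2 in year n, the survival probability, the product of (1 - p_n) over all years, converges to a positive limit instead of going to 0, so the "probability 1 with infinite time" conclusion no longer follows.

```python
import math

def survival(c: float, years: int) -> float:
    """Probability of never pressing through `years` years, assuming the
    per-year chance in year n is c / n**2 (a hypothetical improvement
    schedule). Computed as exp of the summed log-survival terms."""
    log_surv = sum(math.log1p(-c / n**2) for n in range(1, years + 1))
    return math.exp(log_surv)

# since sum(1/n^2) converges (to pi^2/6), the survival probability
# stays bounded above 1 - 2e-11 forever, rather than decaying to 0
print(survival(1e-11, 10_000))
print(survival(1e-11, 1_000_000))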
Are you at all familiar with the work of John Vervaeke? Your metaphor about the sirens and lotuses (loti?) is a good one. In particular, the last sections about how you sometimes benefit from listening to the sirens, together with your emphasis that we have to walk through the lotus fields every moment of every day, sound startlingly convergent with a phrase that John often repeats: "the very things that make you adaptively intelligent, also make you perennially prone to self-deception and self-destruction". There's a great deal going on in this phrase, to do with the complexity of reality and our finite ability to cope with it. But I can't help but note the similarities here. The lotus field is a powerful metaphor.
"I hope it is clear to you that you will eventually press the button."
nope, disagree. the person i am now maybe will press the button (but maybe not!), but with eternity? i strive to be better then that!
there are a bunch of things when i feel like society say to people "of course you will fail" and it's self-fulfilling prophecy. and when i read fictional description with society that is the opposite of that. that think that Buttons (not in your current meaning here) are serious business, they both desire society to avoid them, consider dealing with them important social role, and TEACH PEOPLE NOT TO DO THAT.
I strive to be the sort of person who do not damn themself to Hell, and expect that if i have the information, will manage to do that.
and i consider sentence like the first one as active sabotage.
I think that you're missing quite how long infinity is.
In general, I agree that it is important to become the sort of person who doesn't press buttons. But I do not believe that a human being can drive the probability of pressing that button all the way down to zero, and if you drive it down to a mere 0.00000000001 chance per year, a level of reliability significantly greater than essentially every human created system ever, then you still eventually press the button with probability 1.
see, i find it really hard to think about Hell without thinking about the glowfic version of Pathfinder. and there people become outsiders. i, as i currently am? probably yes, but it's nonsense hypothetical, i will CHANGE in such scenario. for every long enough period, i will change, because i am human and humans change. and being the sort of being that capable of True Commitment is one of the things i can do.
also, i really don't think it make sense to say "human" and "eternity" in the same sentence. we don't know what sort of being 10,000 hundred years old human will be. we didn't try it! i can accept, though disagree, believing that 90% or maybe even 99% of the people will press the button (although it look unjustifiably pessimistic to me), but EVERY PERSON? this is astonishing amount of certainty for situation so out of distribution.
also, chance-per-year is not independent variable. i expect that if someone managed for 10,000 years their chance will be very low. maybe close to zero. and by that point they have so much time! they could have invented so many things, optimized so hard on "don't press the button". they may at that point have AI they created with the sole goal of preventing people to press on buttons.
Are you at all familiar with the work of John Vervaeke? Your metaphor about the sirens and lotuses (loti?) is a good one. However, particularly the last sections about how sometimes you benefit from listening to the sirens, as well as your emphasis that we have to walk through the lotus fields every moment of every day, sounds startlingly convergent with a phrase that John often repeats - "the very things that make you adaptively intelligent, also make you perennially prone to self-deception and self-destruction". There's a great deal going on in this phrase, to do with the complexity of reality and our finite ability to cope with it. But I can't help but note the similarities here. The lotus field is a powerful metaphor.