@Discipulus_Didicit
Is there an example of a different prescriptive position drawn from longtermism that you could point to? One that maybe has a little more real-world application?
Most of it is going to be avoiding existential threats to society. The theory is that future people number 10^45, which has a lot more zeros than the 7 billion current humans, so almost all policy should be about avoiding existential threats to mankind's continued existence, as I argue in the debate where I show the EV calculation. This ideology would make it acceptable to kill 80 million Germans if there were a mere 1-in-a-million chance that somebody in the country could successfully create a doomsday device.
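To make that concrete, here is a rough sketch of the EV calculation in Python, using the numbers from the comment above (the 10^45 figure and the 1-in-a-million risk are the illustrative assumptions from this thread, not established facts):

```python
# Rough sketch of the naive expected-value argument described above.
# All numbers are the hypotheticals from the comment, not real estimates.

future_people = 10**45          # assumed count of potential future humans
p_doomsday = 1e-6               # assumed chance someone builds a doomsday device
cost_in_lives = 80_000_000      # the 80 million people killed to remove the risk

# Expected future lives saved by eliminating the risk: 1e-6 * 1e45 = 1e39
expected_lives_saved = p_doomsday * future_people

# Naive EV says act whenever expected lives saved exceeds lives lost.
# Here the benefit exceeds the cost by roughly 31 orders of magnitude.
print(expected_lives_saved > cost_in_lives)  # True
```

The point of the sketch is that once the future-population figure is astronomically large, it swamps any present-day cost in a naive EV comparison, which is exactly the objection being raised.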
The existential threats longtermists are most concerned about are runaway superintelligent AI and things that would explain the Fermi paradox. So no, if one took control of the country they would not have policy positions on things like gay rights, economic policy, etc. If they did, those would come after herculean efforts to stop unlikely existential threats to mankind, particularly ones proposed as answers to the Fermi paradox.
Maybe. Sounds like a bunch of unfuckable tech bros to me, but I admit I am going off of limited information.
This is accurate other than the part where you claim tech bros are unfuckable.