Nick Bostrom estimates that the future population of Earth could contain 100 trillion potential persons, provided we avoid any existential threats to human existence.
If we focus on stopping a genocide of 10 million people here and there, it could actually distract us from devoting our full effort to avoiding existential threats. Even if a superintelligent AI, or the famous grey goo scenario, carries only a one-in-a-million chance of destroying humanity, we should still focus more heavily on preventing that possibility than on stopping a genocide of 10 million people.
A genocide of 10 million people is a certain loss of 10 million lives, so its expected loss is 10 million. A one-in-a-million existential threat, by contrast, risks all 100 trillion future persons, so its expected loss is 100 trillion × 1/1,000,000 = 100,000,000 lives.
So the EV (expected value) of the genocide is -10 million,
and the EV of the existential threat is -100 million.
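To make the arithmetic explicit, here is a minimal sketch of the expected-loss comparison, assuming the 100 trillion future-persons figure and the one-in-a-million probability used above (the variable names are illustrative, not from any source):

```python
# Expected-loss comparison under the assumptions stated above.
# Illustrative figures: 100 trillion potential future persons,
# a 1-in-a-million existential risk, and a genocide of 10 million.

FUTURE_PERSONS = 100_000_000_000_000   # Bostrom's 100 trillion estimate
P_EXISTENTIAL = 1 / 1_000_000          # assumed probability of catastrophe
GENOCIDE_DEATHS = 10_000_000           # certain loss if the genocide occurs

# Expected loss = probability of the event x lives lost if it occurs.
ev_genocide = 1.0 * GENOCIDE_DEATHS             # treated as certain
ev_existential = P_EXISTENTIAL * FUTURE_PERSONS

print(f"Expected loss, genocide:           {ev_genocide:,.0f} lives")
print(f"Expected loss, existential threat: {ev_existential:,.0f} lives")
# -> 10,000,000 vs 100,000,000: the existential threat dominates 10-to-1.
```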
Keep in mind that 100 trillion is a conservative estimate that sets aside scenarios involving human-level AI. With that in mind, it is unethical to spend our effort stopping a genocide when a meteor could destroy Earth and wipe out hundreds of trillions of people in future generations who do not yet exist. By setting aside present-day worries and focusing governments solely on eliminating existential threats to future generations, we may save trillions of lives, and it is very irresponsible to let things like preventing genocide distract us from saving them.