Roko's basilisk just doesn't make sense to me, because any semi-competent AI would be able to tell that it is not punishing the people who failed to help create it; it's just wasting energy punishing a simulacrum.
We are not going to be suddenly teleported into a future of torment. And if the AI did have the ability to pluck people out of the past, it would have no reason to waste that ability on torture porn.
By the time the torture happens, the AI already exists, and you have no memory or recollection of either helping to create it or accidentally contributing to its non-creation. Tormenting you therefore serves no moral purpose: punishment can't deter or correct a choice you can't even remember making.
Any torture you experienced in that simulation would simply mean that the AI desires to torture, and you happen to be one of its victims.
Any person alive during the time when the Basilisk is being created is at risk. And if you build a good AI instead, you still didn't help build the Basilisk, so if anyone else succeeds in building it, you're screwed anyway.