No, that's a version of the Basilisk that makes sense (almost – you don't need an AI for that). The original formulation was that the AI, built with the goal of [something good], would decide to torture people who hadn't helped build it, so that the threat of torture would encourage people in the past to build it. (Yes, this is as nonsensical as it sounds; such acausal threats only work in specific scenarios, and this isn't one of them.)
But yes, even if the Basilisk could make the threat credible (perhaps with a time machine), your strategy would still work. You can't be blackmailed by something that doesn't exist yet unless you want to be.