Perhaps you're missing the context? In the incidents which led to this proposal, no judgment of similarity was necessary: the sexualized images were posted in the replies to non-sexualized images of the same person.
This isn't really a novel dimension in the first place, I don't think. It's just rarely an issue in practice, because most people who post these images do so to shame and embarrass the depicted person. No doubt there will be edge cases where a sexualized image of consenting person A gets taken down because they look similar to non-consenting person B - but is that really a big problem?
The law doesn't stipulate that the offending images have to be posted with the intent to shame or embarrass, nor that the images have to be sent directly to the person who's supposedly depicted in the image. If that's the justification, then the legislators ought to have put wording to that effect into the law.
As you point out, this is a proposed amendment to an already existing law.
> The government said: "Plans are currently being considered by Ofcom for these kinds of images to be treated with the same severity as child sexual abuse and terrorism content, digitally marking them so that any time someone tries to repost them, they will be automatically taken down."
Unless I'm mistaken, CSAM is prohibited entirely in the UK, not just in replies to the child depicted in the abusive imagery. They explicitly say that they intend for fictional intimate content allegedly depicting a real person to be treated the same way as CSAM.
There's nothing that suggests that this new amendment prohibiting fictional content is going to be narrowly scoped to replies to the people allegedly being depicted.