Then these websites should block search engine crawlers from accessing their paywalled content, which effectively moves that content off the clear web and onto the deep web (which is not the dark web; there's a difference).
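A minimal sketch of what that crawler block looks like in `robots.txt`, assuming the paywalled articles live under a hypothetical `/premium/` path:

```
# robots.txt — keep all crawlers out of the paywalled section
User-agent: *
Disallow: /premium/
```

One caveat: `Disallow` only stops crawling; a URL Google already knows about can still appear in results. To keep pages out of the index entirely, a `noindex` robots meta tag on the article pages themselves is the more reliable signal.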
If they can't accept that compromise and live with these paywalled articles going unindexed, then the problem is on them.
They can even design the paywall so that a summary or introductory paragraph gets indexed. What they can't have is search engines reading the whole article while everyone else can't.
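That summary-plus-paywall approach is roughly what Google's structured data for paywalled content supports: the page declares itself paywalled and points at the gated portion, so showing the full text only to the crawler isn't treated as cloaking. A hedged sketch of that markup (the `.paywalled` class name is an assumption; the property names are from schema.org):

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example article",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywalled"
  }
}
</script>
```

The free intro sits outside the `.paywalled` element; everything inside it is what the publisher is declaring as gated.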
I think this is fair, and so do the search engines. Google calls serving the full article to crawlers but not to users "cloaking" and says it penalizes the ranking of sites that do it. Perhaps it's just not enforcing that effectively enough.