It’s basically the same as this topic. Google is the only judge of what should or shouldn’t be indexed, so relying on tricks to keep its sometimes-dumb crawler from crawling and/or indexing the wrong content is a must in this day and age.
From my experience, stripping S&D parameters from URLs and reintroducing them client-side works pretty well. Is it ideal? Not really. But Google’s crawlers really can be dumb sometimes, and once the damage is done, it can hurt.
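To make the idea concrete, here’s a minimal sketch of the client-side half. It assumes hypothetical parameter names (`sort`, `display`) and that internal links are rendered parameter-free and tagged with a `data-keep-params` attribute; adjust all of that to your own setup:

```ts
// Hypothetical parameter names to keep out of crawlable URLs; adjust to your site.
const STRIPPED_PARAMS = ["sort", "display"];

// The server renders internal links without these parameters. On click, we
// re-append the user's current choices so navigation preserves them, while
// the HTML that crawlers see stays parameter-free.
document.addEventListener("click", (event) => {
  const link = (event.target as HTMLElement)
    .closest<HTMLAnchorElement>("a[data-keep-params]");
  if (!link) return;

  const current = new URLSearchParams(window.location.search);
  const url = new URL(link.href, window.location.origin);

  // Carry forward whichever stripped parameters are active on the current page.
  for (const name of STRIPPED_PARAMS) {
    const value = current.get(name);
    if (value !== null) url.searchParams.set(name, value);
  }

  event.preventDefault();
  window.location.assign(url.toString());
});
```

Rewriting on click (rather than rewriting every `href` on page load) is a deliberate choice here: since Googlebot renders JavaScript, links rewritten at load time could still end up crawled with the parameters attached.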