I feel like every fiction book I pick up that's been published in the last 5-10 years and deemed "feminist" has a very... underwhelming, surface-level approach to feminist themes. Pretty much any book with a female protagonist and a male antagonist, or a "sapphic" subplot, ends up labeled "feminist" these days. Not to mention, if it has a historical setting at all, it's riddled with anachronisms, because the author would rather appear feminist than give any real thought to what an actual feminist of that time period might have been like. I think this is mainly a holdover from the mid-2010s, when "feminism" was treated less as a political movement and more as a pop culture trend, plus the fact that most of the authors writing these books are libfems.
What I want to read more of is fiction that isn't necessarily "feminist" outright or by marketing, but that has even subtle themes that could be considered feminist, approached with depth and nuance rather than the simultaneously heavy-handed and shallow version these books spoon-feed their audience. Those books feel like they're insulting my intelligence.
I'd love to hear recommendations for books and authors! They don't have to be recent, and I'm willing to read from any genre!