I have been reading all my life, and I even read some Harlequin Romances as a teenager. I tend to lean toward mysteries or true crime, and there isn't much sex in those, so maybe this has been going on for a long time and I just didn't notice. I've read a few romances over the past few years, though, and I've noticed the authors seem to spell everything out when it comes to sex. I think it's an overall trend in society, because I've noticed the same thing with movies, and I don't like it there, either. I have a perfectly good imagination and don't need or want someone detailing every position and every bodily response. Has anyone else noticed this? I don't read many books I'd consider romances, and I'm not squeamish or a prude, but I find this to be a turn-off.