This is probably fruitless since no one's reading this anyway, but I'd like to put a note by the Posting Tree. If Neil Gaiman can do this, why can't I? (Oh, because I suck. That's right.) I'm looking for a book about today's drug culture - specifically, people taking medicines for diseases they don't have, direct-to-consumer brand-name drug marketing, the sudden wave of antidepressants everyone's taking, things like that. It'd be especially bubbly if it had a 'what's wrong with you people?!' tone to it. It's research for the new novel I'm working on.