9 Hollywood Memoirs That Reveal The Dark Side Of The Entertainment Industry

In the two and a half months since news broke of Harvey Weinstein's decades of sexual harassment, assault, and abuse, dozens and dozens of stories about sexual misconduct in Hollywood have come to light. While there is plenty of new reporting about the immorality of Tinseltown to read every day, there are even more insightful memoirs that reveal the dark side of Hollywood and its influence on pop culture.

For decades, the Hollywood elite have rested comfortably on pedestals above the legions of fans who obsess over every little thing stars do, from what they wear to how they eat to who they date and beyond. Recently, however, attention has shifted from the trivial details of a famous person's life to their dirtiest secrets as more and more A-listers are revealed to have troubling histories of sexual harassment, assault, and abuse. From Kevin Spacey and Louis C.K. to Nick Carter and Danny Masterson, scores of powerful men have been accused of misconduct, a trend that reveals a more disturbing truth about the industry's systematic abuse of women, and our culture's acceptance of it.

Although the public seems to just be waking up to the dangerous reality of our pervasive rape culture, Hollywood stars have been sharing the real details about the dark side of Tinseltown since the publication of the earliest celebrity memoirs. In their sometimes juicy, sometimes trivial, sometimes eye-opening autobiographies, famous figures aren't afraid to dish about what really goes on behind the scenes of your favorite movies and TV shows, right down to the racism, sexism, bigotry, and abuse.

If the recent news cycle has inspired you to learn what Hollywood is really like, check out these nine memoirs that reveal the dark side of fame and fortune.