Jodie Foster on sexism in Hollywood: I believe we're getting better

Actress Jodie Foster speaks with CNN's Christiane Amanpour about sexism in Hollywood and how she chooses her film roles.