A new book - Superintelligence: Paths, Dangers, Strategies - says yes, or at least says it's very possible. The book's author, Nick Bostrom, is a professor at Oxford University and Director of its Future of Humanity Institute.
Bostrom's area of study is existential threats to humanity, and he spends his time studying "human extinction scenarios and related hazards".
This is either the coolest or the most depressing job around - maybe both.
Unlike most science fiction stories, Bostrom doesn't suggest evil superintelligent machines will destroy mankind out of malice or ill intent.
Instead, mankind will just get in the way and use resources the machines want for other purposes. So the machines will take us out.
This is kind of the ultimate version of "it's nothing personal, just business" with the superintelligent AI just going about its business.
A milder man-versus-machines debate is also going on these days: whether or not machines are going to replace more human jobs than they create.
The Economist weighs in on this debate with a special section called The Third Great Wave. After reading about machines wiping out mankind, just stealing our jobs doesn't sound so bad.