From the 1980s movie Firefox to the more recent Avatar, popular science fiction has speculated about the possibility of a person's thoughts being read directly from his or her brain. Such brain-computer interfaces (BCIs) [1] might allow people who are paralyzed to communicate with and control their environment, and there might also be applications in military situations where silent user-to-user communication is desirable [2]. Previous studies have shown that BCI systems can use brain signals related to movements and movement imagery [3] or attention-based character selection [4]. Although these systems have successfully demonstrated that devices can be controlled using brain function, directly inferring which word a person intends to communicate has remained elusive. A BCI based on imagined speech might provide such a practical, intuitive device. Toward this goal, our studies to date have addressed two scientific questions: 1) Can brain signals accurately characterize different aspects of speech? 2) Is it possible to predict spoken or imagined words or their components using brain signals?