Multimodal inputs are becoming increasingly popular in pervasive applications, driven by the demand for highly responsive and intuitive human control interfaces beyond the traditional keyboard and mouse. However, the heterogeneous nature of novel multimodal input devices and the tight coupling between input devices and applications complicate deployment, making dynamic integration with the intended applications difficult. i∗Chameleon exploits device abstraction in a web services-based framework to alleviate these problems. Developers can dynamically register new devices with the i∗Chameleon framework and efficiently map device-specific inputs to keyboard and mouse events. Several input modalities, including tangible devices, speech, and finger gestures, have been implemented to validate the feasibility of the i∗Chameleon framework in supporting multimodal input for pervasive applications.
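To make the device-abstraction idea concrete, the following is a minimal sketch of a registry that decouples concrete input devices from applications by translating device-specific inputs into generic keyboard and mouse events. All names here (`DeviceRegistry`, `Event`, `register`, `translate`) are hypothetical illustrations of the concept, not the actual i∗Chameleon API.

```python
# Hypothetical sketch: a registry that maps device-specific inputs to
# generic keyboard/mouse events, so applications never bind directly to
# any one input device. Names are illustrative, not the i*Chameleon API.

from dataclasses import dataclass, field
from typing import Dict, Tuple


@dataclass
class Event:
    """Generic keyboard/mouse event delivered to the application."""
    kind: str      # e.g. "key" or "mouse"
    detail: str    # e.g. "PageDown" or "left_click"


@dataclass
class DeviceRegistry:
    """Maps (device, raw input) pairs to generic events at runtime."""
    mappings: Dict[Tuple[str, str], Event] = field(default_factory=dict)

    def register(self, device: str, raw_input: str, event: Event) -> None:
        # Dynamic registration: new devices and mappings can be added
        # while the system is running, without changing the application.
        self.mappings[(device, raw_input)] = event

    def translate(self, device: str, raw_input: str) -> Event:
        # Look up the generic event for a device-specific input.
        return self.mappings[(device, raw_input)]


registry = DeviceRegistry()
# Map a speech command and a finger gesture to conventional events.
registry.register("speech", "next slide", Event("key", "PageDown"))
registry.register("gesture", "tap", Event("mouse", "left_click"))

print(registry.translate("speech", "next slide").detail)  # PageDown
print(registry.translate("gesture", "tap").detail)        # left_click
```

In a web services-based realization of this pattern, `register` and `translate` would be exposed as service endpoints so that devices can join the framework remotely at runtime.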