We propose a new framework for extracting information from the extrinsic muscles of the forearm that allows continuous, natural and intuitive control of neuroprosthetic devices and robotic hands. This is achieved through a continuous mapping between muscle activity and joint angles rather than a prior discretisation into hand gestures. We instructed six able-bodied subjects to perform everyday object manipulation tasks. We recorded the electromyographic (EMG) and mechanomyographic (MMG) activity of five extrinsic hand muscles in the forearm, while simultaneously monitoring 11 joints of the hand and fingers using a sensorised glove. We used these signals to train a Gaussian Process (GP) model and a Vector AutoRegressive Moving Average model with Exogenous inputs (VARMAX) to learn the mapping from current muscle activity and current joint state to future hand configurations. We investigated the performance of both models across tasks, subjects and joints for varying time-lags, finding that both models generalise well and achieve high correlation even for time-lags on the order of hundreds of milliseconds. Our results suggest that regression is a very appealing tool for natural, intuitive and continuous control of robotic devices, with particular relevance to prosthetic replacements where high dexterity is required for complex movements.