Human manual dexterity relies critically on touch. Robotic and prosthetic hands are far less dexterous and make little use of the many tactile sensors available to them. We propose a framework modeled on the hierarchical sensorimotor controllers of the nervous system to link sensing to action in human-in-the-loop, haptically enabled artificial hands.