Robotic agents that interact with humans and perform complex, everyday tasks in natural environments will require a system that autonomously organizes their behavior. Current systems for robotic behavioral organization typically abstract away from the low-level sensory-motor embodiment of the robot, leaving a gap between the level at which a sequence of actions is planned and the levels of perception and motor control. This gap is a major bottleneck for the autonomy of such systems in complex, dynamic environments. To address this issue, we present a neural-dynamic framework for behavioral organization in which the action selection mechanism is tightly coupled to the agent's sensory-motor systems. The elementary behaviors (EBs) of the robot are dynamically organized into sequences based on task-specific behavioral constraints and online perceptual information. We demonstrate the viability of our approach by implementing a neural-dynamic architecture on the humanoid robot NAO. The system produces sequences of EBs that are directed at objects (e.g., grasping and pointing). The sequences are flexible in that the robot autonomously adapts both the individual EBs and their sequential order in response to changes in the sensed environment. The architecture can accommodate different tasks and can be instantiated on different robotic platforms. Its neural-dynamic substrate is particularly well-suited for learning and adaptation.
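To give a flavor of the kind of mechanism involved, the sketch below simulates a minimal neural-dynamic sequencing scheme of the sort described above: each elementary behavior is represented by an intention node (active while the behavior runs) and a condition-of-satisfaction (CoS) node (detects completion from perceptual input), and the CoS of one EB gates the task input to the next. This is an illustrative toy model, not the paper's architecture; all parameter values, the two-EB "reach then grasp" scenario, and the `completion_signal` timings are assumptions standing in for real sensory input.

```python
import math

def sigmoid(u, beta=4.0):
    """Sigmoidal output function of a neural-dynamic node."""
    return 1.0 / (1.0 + math.exp(-beta * u))

tau, h, dt = 20.0, -1.0, 1.0   # time constant, resting level, Euler step
steps = 600

u_int = [h, h]                 # intention nodes for EB 0 (reach), EB 1 (grasp)
u_cos = [h, h]                 # condition-of-satisfaction (CoS) nodes
done = [0.0, 0.0]              # sustained inhibition once an EB has finished

def completion_signal(t, eb):
    """Stand-in for online perceptual input signaling that an EB has
    achieved its goal (assumed timings, not real sensor data)."""
    return 3.0 if t > (150 if eb == 0 else 400) else 0.0

history = []
for t in range(steps):
    s_int = [sigmoid(u) for u in u_int]
    s_cos = [sigmoid(u) for u in u_cos]
    # EB 1 receives task input only after EB 0 has finished:
    # a sequential "precondition" constraint between behaviors.
    task = [2.0, 2.0 * done[0]]
    for eb in range(2):
        # Intention: driven by task input and self-excitation,
        # shut down by its own CoS and the sustained 'done' inhibition.
        du_int = (-u_int[eb] + h + task[eb] + 2.0 * s_int[eb]
                  - 4.0 * s_cos[eb] - 4.0 * done[eb])
        # CoS: becomes active only while the EB is intended AND the
        # perceptual completion signal is present; self-sustains after.
        du_cos = (-u_cos[eb] + h + 2.0 * s_cos[eb]
                  + s_int[eb] * completion_signal(t, eb))
        u_int[eb] += dt / tau * du_int
        u_cos[eb] += dt / tau * du_cos
        if sigmoid(u_cos[eb]) > 0.9:
            done[eb] = 1.0
    history.append([sigmoid(u) for u in u_int])

# Early on only the reach is active; after its CoS fires, the
# dynamics suppress it and release the grasp.
print("t=100 intention outputs (reach, grasp):", history[100])
print("t=300 intention outputs (reach, grasp):", history[300])
```

Because the switching is driven by the perceptual completion signal rather than a fixed schedule, delaying or advancing that signal shifts the transition accordingly, which is the sense in which such sequences adapt online to the sensed environment.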