Spartacus is our robot entry in the 2005 AAAI Mobile Robot Challenge, whose goal is to have a robot attend the National Conference on Artificial Intelligence. Designing robots capable of interacting with humans in real-life settings can be considered the ultimate challenge for intelligent autonomous systems. One key issue is the integration of multiple modalities (e.g., mobility, physical structure, navigation, vision, audition, dialogue, reasoning). Such integration increases both the diversity and the complexity of the interactions the robot can generate. It also makes it difficult to monitor how these increased capabilities are used in unconstrained conditions, whether while the robot is in operation or afterwards. This paper reports solutions and findings resulting from our hardware, software and decisional integration work on Spartacus. It also outlines perspectives on making the intelligence and interaction capabilities of autonomous robots evolve.