Several studies have investigated human walking assistance using exoskeleton robots. To achieve effective walking assistance across a variety of user motions, the robot's behavior needs to be coordinated spatiotemporally with both the predicted user motion and the environment. In this paper, we study how movement prediction and temporal synchronization can benefit walking-assist exoskeletons using the framework of style-phase adaptive pattern generation [1]. In particular, we empirically investigate the following two issues: i) mutual synchronization between a human subject and a humanoid model through style-phase adaptation, and ii) the use of style-phase adaptation for walking assistance. We developed an experimental platform for each investigation and conducted experiments with human subjects. The experimental results suggest that visual feedback of the humanoid model's state can enhance mutual synchronization through style-phase adaptation, and that the estimated style and phase can be useful for assisting the walking movement of a human subject.