The paper introduces an electroencephalography (EEG) driven online position control scheme for a robot arm, using motor imagery to activate and the error-related potential (ErrP) to stop the movement of individual links, following a fixed (pre-defined) order of link selection. Right (left) hand motor imagery turns a link clockwise (counterclockwise), and foot imagery moves a link forward. The occurrence of an ErrP indicates that the link in motion has crossed the visually fixed target position, which is typically a plane/line/point depending on whether the desired transition of the link is across 3D planes, around 2D lines, or along 2D lines, respectively. The imagined movement of an individual link is decoded by a classifier into three possible class labels: clockwise, counterclockwise, and no movement for rotational movements, and forward, backward, and no movement for translational movements. One additional classifier is required to detect the occurrence of the ErrP signal, elicited by a visually perceived positional error of the link with respect to a geometrically selected target position. Wavelet coefficients and adaptive autoregressive parameters are extracted as features for the motor imagery and ErrP signals, respectively. Support vector machine classifiers decode motor imagination and ErrP with classification accuracy above 80%. The average time taken by the proposed scheme to decode and execute control intentions for the complete movement of three links of the robot is approximately 33 seconds. The steady-state error and peak overshoot of the proposed controller are experimentally obtained as 1.1% and 4.6%, respectively.
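The decoding pipeline described above (wavelet-domain features fed to a multiclass SVM) can be sketched as follows. This is an illustrative stand-in, not the paper's implementation: it uses synthetic single-channel signals with hypothetical class-specific rhythms, simple Haar wavelet band energies in place of the paper's wavelet coefficients, and a from-scratch one-vs-rest linear SVM trained with Pegasos-style subgradient descent instead of the authors' trained classifiers.

```python
import numpy as np

def haar_features(signal, levels=4):
    """Haar DWT band energies: a compact, scale-aware feature vector
    (a stand-in for the wavelet-coefficient features in the paper)."""
    a = np.asarray(signal, dtype=float)
    parts = []
    for _ in range(levels):
        if a.size % 2:                                    # pad to even length
            a = np.append(a, a[-1])
        parts.append((a[0::2] - a[1::2]) / np.sqrt(2.0))  # detail band
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)            # approximation
    parts.append(a)
    return np.array([np.mean(p ** 2) for p in parts])     # one energy per band

def train_ovr_svm(X, y, n_classes, lam=1e-2, steps=3000, seed=1):
    """One-vs-rest linear SVM via Pegasos subgradient descent on hinge loss."""
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])         # append bias term
    W = np.zeros((n_classes, Xb.shape[1]))
    for c in range(n_classes):
        yc = np.where(y == c, 1.0, -1.0)
        w = np.zeros(Xb.shape[1])
        for t in range(1, steps + 1):
            i = rng.integers(len(yc))
            eta = 1.0 / (lam * t)
            margin = yc[i] * (w @ Xb[i])
            w *= (1.0 - eta * lam)                        # regularization shrink
            if margin < 1.0:                              # hinge-loss subgradient
                w += eta * yc[i] * Xb[i]
        W[c] = w
    return W

def predict(W, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.argmax(Xb @ W.T, axis=1)                    # highest OvR score wins

# Synthetic epochs: three classes with distinct (hypothetical) dominant rhythms.
rng = np.random.default_rng(0)
fs, n = 128, 128
freqs = [5.0, 12.0, 25.0]                                 # Hz, one per class
X, y = [], []
for label, f in enumerate(freqs):
    for _ in range(40):
        t = np.arange(n) / fs
        sig = np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
        sig += 0.3 * rng.standard_normal(n)               # additive noise
        X.append(haar_features(sig))
        y.append(label)
X, y = np.array(X), np.array(y)
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)        # standardize features

W = train_ovr_svm(X, y, n_classes=3)
acc = np.mean(predict(W, X) == y)
```

Because each synthetic class concentrates its energy in a different Haar band, the standardized band-energy features are nearly linearly separable, so the linear SVM reaches well above the 80% accuracy threshold reported in the paper; real EEG motor-imagery data is far noisier and would require the multichannel features and tuned classifiers the authors describe.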