This paper explores a reduced-complexity physical implementation of self-organizing map (SOM) and LAMSTAR (Large Scale Memory Storage and Retrieval) neural networks. The unique Gaussian IDS-VGS characteristic of the emerging gate/source-overlapped heterojunction tunnel FET (SO-HTFET) is utilized to reduce the complexity of a SOM. For a given input pattern, the SO-HTFET-based SOM performs associative processing between the applied pattern features and the stored neuron states. The SO-HTFET reduces the SOM computing cell to just a single transistor, which is remarkable considering that a conventional digital SOM cell requires more than 100 transistors. The IDS-VGS variance of the SO-HTFET is modulated by varying its drain-to-source voltage (VDS), enabling dynamic adaptation of the distance measure in the SO-HTFET-based SOM. Multiple SOM modules are combined in a LAMSTAR network through link weights to facilitate deep learning and to integrate the various features of the applied pattern in the decision-making process. Electroencephalogram (EEG) classification is studied using the SO-HTFET-based LAMSTAR. By reducing SOM complexity, the SO-HTFET enables a larger number of hidden neurons in the LAMSTAR and thereby improves classification accuracy over a conventional design. EEG classification accuracy is specifically evaluated for fixed-neuron and dynamic-neuron approaches, and the optimal IDS-VGS variance of the SO-HTFET is extracted for each approach.
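As a rough software illustration (not the paper's circuit model), the SOM matching step described above can be sketched as follows: each neuron responds with a Gaussian function of the distance between the input feature and its stored state, and the variance, analogous to the VDS-tuned IDS-VGS variance of the SO-HTFET, controls the sharpness of the distance measure. All function names and values here are illustrative assumptions.

```python
import math

def gaussian_response(x, w, sigma):
    """Neuron response as a Gaussian of the input-weight distance.

    Loosely analogous to an SO-HTFET whose drain current peaks when
    the gate voltage (input feature x) matches the stored neuron
    state w; sigma plays the role of the VDS-tuned IDS-VGS variance.
    """
    return math.exp(-((x - w) ** 2) / (2.0 * sigma ** 2))

def som_winner(x, weights, sigma):
    """Return the index of the best-matching (winner) neuron."""
    responses = [gaussian_response(x, w, sigma) for w in weights]
    return max(range(len(responses)), key=responses.__getitem__)

# Illustrative stored neuron states and an input feature.
weights = [0.2, 0.5, 0.8]
print(som_winner(0.55, weights, sigma=0.1))  # → 1 (neuron nearest 0.55)
```

A smaller sigma makes the matching more selective (only near-exact matches respond strongly), which mirrors how modulating VDS adapts the distance measure in the SO-HTFET-based SOM.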