With the recent introduction of High Dynamic Range (HDR) and Wide Color Gamut (WCG) technologies, viewers' quality of experience is greatly enriched. To distribute HDR videos over a transmission pipeline, color pixels need to be quantized into integer code-words. Linear quantization is not optimal, since the Human Visual System (HVS) does not perceive light linearly. Perceptual transfer functions (PTFs) and color pixel representations are therefore used to convert linear light and color values into a non-linear domain that corresponds more closely to the response of the human eye. In this work, we measure the visual color differences caused by different PTFs and color representations under 10-bit quantization. Our study encompasses all visible colors of the BT.2020 gamut at several representative luminance levels. Visual color differences are predicted using a perceptual color error metric (CIE ΔE2000). The results show that visible color distortion can occur even before any video compression is applied to the signal, and that choosing the right PTF and color representation can greatly reduce these distortions and thereby enhance the quality of experience.
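As an illustration of the kind of per-pixel processing evaluated above, the sketch below applies one commonly used PTF, the SMPTE ST 2084 (PQ) transfer function, followed by 10-bit quantization, and then inverts both steps. This is a minimal sketch, not the paper's experimental code: it uses full-range code-words (0–1023) for simplicity, whereas broadcast systems typically use a limited range, and it operates on a single luminance channel rather than a full color representation.

```python
import math

# SMPTE ST 2084 (PQ) constants, as defined in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_oetf(l):
    """Linear light l in [0, 1] (1.0 = 10,000 cd/m^2) -> non-linear PQ signal in [0, 1]."""
    lp = l ** M1
    return ((C1 + C2 * lp) / (1 + C3 * lp)) ** M2

def pq_eotf(v):
    """Non-linear PQ signal v in [0, 1] -> linear light in [0, 1]."""
    vp = v ** (1 / M2)
    return (max(vp - C1, 0.0) / (C2 - C3 * vp)) ** (1 / M1)

def quantize_10bit(v):
    """Simplified full-range 10-bit quantization of a [0, 1] signal."""
    return round(v * 1023)

def dequantize_10bit(code):
    return code / 1023

# Round trip: linear light -> PQ -> 10-bit code-word -> PQ inverse -> linear light.
# The residual error of this round trip is the pre-compression distortion the
# study quantifies (there with CIE dE2000 over the BT.2020 gamut).
luminance = 100 / 10000                       # 100 cd/m^2, normalized
code = quantize_10bit(pq_oetf(luminance))
recovered = pq_eotf(dequantize_10bit(code))
```

Because PQ spends code-words roughly according to HVS contrast sensitivity, the round-trip luminance error stays small across the dynamic range, which is the motivation for using a PTF instead of linear quantization.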