I can understand the OP's original question and (potential) confusion, and I share much of it despite my attempts to research and understand this topic. The link to the wiki isn't exactly helpful, especially if you lack any background on how color mapping works, what HDR is actually doing, or how these things are affected by everything from the original source material to chroma subsampling, your system's hardware, the HDMI (or other video output) version and cables in use, etc. Needless to say, there's a steep learning curve, and the wiki doesn't help you climb it.
I am still very much learning about this topic myself, but to answer the OP's question: the setting simply toggles your display's (or system's) HDR mode on and off depending on whether the source material is HDR or not, which is more or less exactly what the parenthetical in the setting's description says. For example, if you are running Kodi on Windows and your Windows display settings have "Use HDR" set to Off, then enabling the Kodi option will effectively switch the Windows setting to On when you play an HDR video (and back to Off when you stop playback). If you leave the Kodi setting disabled, then either:
(a) If your Windows setting has HDR on, Kodi will play the HDR content and simply pass the HDR color information through to your display as-is (if your display is not HDR-capable, it will probably do its own tone mapping to convert the HDR colors to SDR); or
(b) If your Windows setting has HDR off, Kodi will attempt to tone-map each HDR color value to the closest SDR value, which is then sent to your display (a rough sketch of that idea follows this list). There are additional settings within Kodi that let you customize the tone mapping. That part is above my head.
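To give a concrete (and heavily simplified) picture of what option (b) means, here is a rough Python sketch of the idea: decode the PQ-encoded HDR value to absolute brightness, squeeze that brightness into the SDR range, and re-encode it. This is not Kodi's actual code; the ST 2084 constants are the published ones, but the Reinhard-style tone curve, the 203-nit reference white, and the omission of any BT.2020-to-BT.709 gamut conversion are my own simplifications for illustration.

```python
# Rough illustration of what "tone mapping HDR to SDR" means for a single
# PQ-encoded channel value. Real players (Kodi, the GPU driver, or the TV)
# do this per pixel, in linear light, with a gamut conversion omitted here.

def pq_to_nits(code: float) -> float:
    """Decode an SMPTE ST 2084 (PQ) code value in [0, 1] to absolute nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = code ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y          # PQ is defined on an absolute 0-10000 nit scale

def tone_map_to_sdr(nits: float, sdr_white: float = 203.0) -> float:
    """Very naive Reinhard-style curve: compress brightness into 0..1 SDR light."""
    x = nits / sdr_white        # 1.0 = "SDR reference white" (my assumption)
    return x / (1.0 + x)        # bright highlights roll off instead of clipping

def linear_to_srgb(x: float) -> float:
    """Encode linear light back to an sRGB/Rec.709-style code value."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

# Example: a fairly bright HDR highlight (PQ code 0.75 is roughly 1000 nits)
nits = pq_to_nits(0.75)
sdr_code = linear_to_srgb(tone_map_to_sdr(nits))
print(f"{nits:.0f} nits -> SDR code value {sdr_code:.2f}")
```

The point of the sketch is just that very bright HDR highlights get rolled off rather than clipped, which is why badly chosen tone-mapping settings can make a picture look either dim or washed out.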
As hinted at in my intro, there is more that can trip you up here, depending on your setup. For instance, even if you have an HDR-capable display (e.g., a 4K HDR TV), every point along the chain needs to be compatible. So, if your system has an HDMI 2.0 or earlier port (or if you are using an HDMI 2.0 or earlier cable), your bandwidth will be capped at 18 Gbps, or at whatever the weakest HDMI link in the chain supports, and Windows might not actually be able to output what Kodi asks for. 4K resolution at 60 Hz with HDR, 10- or 12-bit color depth, and no chroma subsampling requires roughly 20-24 Gbps of bandwidth (see a Bandwidth Calculator), so unless you have HDMI 2.1 across your entire chain, something will have to be compressed somewhere: Windows might default to a lower color bit depth or a lower refresh rate.
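If you want to sanity-check those numbers yourself, here is a small back-of-the-envelope calculation. The 4400 x 2250 total frame size (active pixels plus blanking, per the standard 4K60 timing) and the 8b/10b TMDS line-coding overhead are my assumptions about how HDMI carries the signal; an online calculator will be more precise.

```python
# Back-of-the-envelope check of the bandwidth claim above, assuming the
# standard 3840x2160 @ 60 Hz timing (4400 x 2250 total pixels including
# blanking) and HDMI's 8b/10b TMDS encoding overhead.

def hdmi_tmds_gbps(h_total: int, v_total: int, fps: float,
                   bits_per_channel: int, channels: int = 3) -> float:
    data_rate = h_total * v_total * fps * bits_per_channel * channels
    return data_rate * 10 / 8 / 1e9   # 8b/10b line coding adds 25% overhead

for depth in (8, 10, 12):
    gbps = hdmi_tmds_gbps(4400, 2250, 60, depth)
    fits = "fits" if gbps <= 18.0 else "exceeds"
    print(f"4K60 4:4:4 {depth}-bit: ~{gbps:.1f} Gbps ({fits} HDMI 2.0's 18 Gbps)")
```

Under those assumptions, 8-bit 4K60 comes in around 17.8 Gbps (just inside HDMI 2.0), while 10- and 12-bit land around 22-27 Gbps, which is why HDR at full chroma needs HDMI 2.1 or some concession elsewhere.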
I've also run into issues with some sources where enabling this Kodi setting actually produces horribly washed-out results for what I believe is HDR content (based on the metadata and codecs, as far as I can tell). Other HDR content plays fine, so maybe there are specific encodings or other ways to identify the issue, but I haven't figured it out yet. Enabling "Use HDR" from within the Windows display settings makes all SDR content look washed out (and makes text on white backgrounds look especially bad), and it doesn't help that particular HDR content either, so if someone more knowledgeable than me can shed some light, I'd appreciate it!
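For what it's worth, the way I try to confirm whether a file is actually flagged as HDR is to look at its color metadata with ffprobe (part of FFmpeg). Here is a small Python wrapper I use as a sketch; the file name is just a placeholder. A color transfer of smpte2084 (PQ, used by HDR10) or arib-std-b67 (HLG) is the usual tell-tale, while bt709 means plain SDR.

```python
# Inspect a video's color signalling with ffprobe, assuming ffprobe is on PATH.
import json
import subprocess

def probe_color_info(path: str) -> dict:
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
         "-show_entries", "stream=color_transfer,color_primaries,color_space",
         "-of", "json", path],
        capture_output=True, text=True, check=True)
    return json.loads(out.stdout)["streams"][0]

info = probe_color_info("movie.mkv")   # placeholder file name
print(info)   # e.g. {'color_space': 'bt2020nc', 'color_transfer': 'smpte2084', ...}
```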
Hope this helps.