These shouldn’t be enums, as refresh rates can be floating-point, and in practice there is also a lot of weirdness out there, like 59.94 Hz.
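A sketch of why rates like 59.94 Hz exist: the refresh rate falls out of the pixel timings, and NTSC-legacy clocks are scaled by 1000/1001. The timing numbers below are the commonly cited CEA-861 1080p values; the function name is just for illustration.

```python
# Refresh rate is derived from pixel timings, so it is not generally an
# integer -- an enum of "allowed rates" cannot represent real hardware.
from fractions import Fraction

def refresh_hz(pixel_clock_hz: int, h_total: int, v_total: int) -> Fraction:
    """Vertical refresh rate = pixel clock / total pixels per frame."""
    return Fraction(pixel_clock_hz, h_total * v_total)

# 1080p60: 148.5 MHz clock, 2200x1125 total raster -> exactly 60 Hz
print(float(refresh_hz(148_500_000, 2200, 1125)))  # 60.0

# "59.94 Hz" variant: same raster, clock scaled by roughly 1000/1001
print(float(refresh_hz(148_351_648, 2200, 1125)))  # ≈ 59.94
```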
The timing really needs to be matched to the monitor; you don’t want a 60 Hz monitor using the resources of a 1000 Hz monitor at any point. It should also be handled by the GPU and GPU driver more than by the OS.
I don’t think it’s that easy, and I struggle to think of a legitimate reason. To me it seems more like an arbitrary bounds check on monitor info received via HDMI/DisplayPort. Bad coding for sure, but also potentially a point where people are pushed to newer, more problematic versions of Windows because the older ones “don’t support new hardware”.
AnyOldName3@lemmy.world 1 hour ago
If there are potentially buggy or broken monitors that sometimes report the wrong value, then a bounds check that enforces sane values makes sense. If the range of sane values changes decades later, then you’ll have to update things, but you’ll likely need to update other things on that timescale anyway, e.g. to support newer display connectors that support the new limits.
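A minimal sketch of the kind of fixed bounds check being described here. The cap is a hypothetical constant, not Windows’ actual value; raising it decades later is exactly the kind of update being discussed.

```python
# Assumed cap on plausible refresh rates reported by a monitor.
# Hypothetical value for illustration -- not the real Windows limit.
MAX_SANE_HZ = 500.0

def sane_refresh(reported_hz: float) -> bool:
    """Reject reported rates outside an assumed plausible range."""
    return 1.0 <= reported_hz <= MAX_SANE_HZ

print(sane_refresh(59.94))   # True
print(sane_refresh(10_000))  # False: likely garbage monitor info
```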
Redjard@reddthat.com 1 hour ago
I’d expect any current DisplayPort port to handle very high refresh rates when the resolution is reduced correspondingly. The limit, to my knowledge, is bitrate.
I’d also expect connector support to sit in the gpu driver.
A basic sanity check might be the answer, though. Still, why not improve it instead of just increasing the number? You could check whether the rate is an outlier, or whether the monitor offers many profiles climbing up to that rate, for example.
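One way the outlier idea could look, as a hedged sketch: instead of a hard-coded cap, flag a reported rate only when it is wildly out of line with the other modes the monitor advertises. The function name and the slack factor are assumptions for illustration.

```python
# Hypothetical outlier filter: accept each rate only if it is within
# `slack` times the highest rate already accepted, so a lone absurd
# value is dropped while a plausible ladder of modes is kept.
def plausible_rates(rates: list[float], slack: float = 2.0) -> list[float]:
    ordered = sorted(rates)
    keep = [ordered[0]]
    for r in ordered[1:]:
        if r <= keep[-1] * slack:
            keep.append(r)
    return keep

modes = [24.0, 30.0, 60.0, 120.0, 10000.0]  # 10000 Hz is likely garbage
print(plausible_rates(modes))  # [24.0, 30.0, 60.0, 120.0]
```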
AnyOldName3@lemmy.world 41 minutes ago
Either you’d be accessing the internet to query which monitor parameters are sensible each time a monitor connects, or you’d be periodically updating a list of sensible monitor parameters, which is exactly what this update was.