I don’t think there’s a need to collect this information, because you made the product and should already know the specs of the chips you used. It might be helpful to the user, though, to know those specs when choosing which adapter to buy. Sometimes I suspect there’s a pact between the chip makers and the adapter makers to keep the reported specs as vague as possible, or perhaps that’s the fault of the online stores…
The 1080p or 1200p displays you refer to are probably LCDs or similar fixed-resolution displays. I guess you want to test with those to ensure the pixels from the VGA adapter line up with the pixels on the LCD panel, with minimal blurring into neighbouring pixels. But if you’re going to make a VGA adapter, then testing it on a CRT, which allows variable resolutions and refresh rates, might be a good idea as well.
I’ve done some further testing with the USB-C to VGA adapter. The Intel Graphics Control Panel in Windows allows creating custom resolutions without restarting the computer. Pixel clocks as high as 330 MHz worked without issue; there is no signal at all when attempting pixel clocks higher than that. My CRT displayed those resolutions with very little interference. The intensity of a vertical-line test pattern varied (without movement) from left to right, but that may be because the pixel width shrinks at higher pixel clocks and approaches the dot pitch of the phosphors. I don’t know what the dot pitch of the phosphors is. The EDID of the CRT says 370 MHz max for the pixel clock, but I don’t think a pixel clock range is meaningful for CRTs unless it relates to the dot pitch of the phosphors?
In Ubuntu, I looked at the DPCD info from the adapter by dumping the contents of /sys/kernel/debug/dri/0/DP-1/i915_dpcd
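For anyone curious how I read the dump: the meanings of the first few DPCD registers come from the DisplayPort spec (0x000 = DPCD revision, 0x001 = max link rate in units of 0.27 Gbit/s, 0x002 = max lane count in the low 5 bits). A minimal sketch of decoding those bytes (the example input bytes are illustrative, matching what I believe this adapter reports):

```python
def decode_dpcd(dpcd: bytes) -> dict:
    """Decode the first three DPCD receiver-capability registers.

    Per the DisplayPort spec:
      dpcd[0] (0x000) = DPCD revision, major nibble . minor nibble
      dpcd[1] (0x001) = max link rate, in units of 0.27 Gbit/s per lane
      dpcd[2] (0x002) = max lane count in bits 4:0 (upper bits are flags)
    """
    rev = dpcd[0]
    return {
        "dpcd_rev": f"{rev >> 4}.{rev & 0xF}",
        "max_link_rate_gbps": round(dpcd[1] * 0.27, 2),
        "max_lane_count": dpcd[2] & 0x1F,
    }

# Illustrative bytes: 0x12 = DPCD 1.2, 0x14 = 20 * 0.27 = 5.4 Gbit/s
# (HBR2), 0x82 = 2 lanes with an upper flag bit set.
print(decode_dpcd(bytes([0x12, 0x14, 0x82])))
```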
I believe it says the adapter uses two DisplayPort 1.2 lanes, each at the HBR2 data rate. This would allow a max pixel clock of 360 MHz (only a little higher than the DAC allows) for 24-bit pixels, and 288 MHz for 30-bit pixels, though I didn’t try 30-bit pixels. I don’t know how many bits per pixel the adapter will accept as input, or how many bits per pixel are actually converted to analog.
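The 360 MHz and 288 MHz figures fall out of a back-of-the-envelope bandwidth calculation, assuming 2 lanes at HBR2 (5.4 Gbit/s raw per lane) with 8b/10b encoding (80% efficiency) and treating every transported bit as pixel data, so it’s an upper bound:

```python
LANES = 2
HBR2_RAW_GBPS = 5.4           # raw line rate per lane, Gbit/s
ENCODING_EFFICIENCY = 8 / 10  # 8b/10b encoding overhead

# Effective payload bandwidth across both lanes: 8.64 Gbit/s
effective_gbps = LANES * HBR2_RAW_GBPS * ENCODING_EFFICIENCY

for bpp in (24, 30):
    max_pclk_mhz = effective_gbps * 1000 / bpp
    print(f"{bpp} bpp: max pixel clock = {max_pclk_mhz:.0f} MHz")
# prints 360 MHz for 24 bpp and 288 MHz for 30 bpp
```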
Anyway, I think 330 MHz is quite nice, and you should mention it in your product specs (once you confirm that it’s the limit). Max pixel clock is a very concise way to state the limit. It’s much simpler than choosing among arbitrary GTF, CVT, or CVT-RB timings at 5:4, 4:3, 16:9, or 16:10 aspect ratios and 30 Hz, 60 Hz, 75 Hz, 85 Hz, or 120 Hz refresh rates. But I guess you could list a good representative set of those as examples, including some with large resolutions and others with high refresh rates.
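A max-pixel-clock spec is also easy for a buyer to check against any mode, since pixel clock is just htotal × vtotal × refresh rate. A quick sketch using two modes whose totals I’m confident of (the standard CTA-861 timings); the 330 MHz cap is the measured limit above:

```python
MAX_PCLK_HZ = 330_000_000  # measured limit of this adapter

# mode name -> (htotal, vtotal, refresh Hz), standard CTA-861 totals
modes = {
    "1920x1080 @ 60 Hz": (2200, 1125, 60),  # 148.5 MHz
    "3840x2160 @ 30 Hz": (4400, 2250, 30),  # 297 MHz
}

for name, (htotal, vtotal, hz) in modes.items():
    pclk = htotal * vtotal * hz
    verdict = "fits" if pclk <= MAX_PCLK_HZ else "too fast"
    print(f"{name}: {pclk / 1e6:.1f} MHz ({verdict})")
```

So even a 4K30 mode squeaks under the limit, bandwidth-wise, as long as the display accepts it over VGA.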