I have an AMD RX6600XT 8GB GPU, which is connected to an old Samsung SyncMaster 2343NWX monitor.
The monitor's input is VGA and the card's output is HDMI, so I'm using an HDMI splitter (powered through its optional 5 V input, which I feed from a motherboard USB port) with an HDMI-to-VGA adapter connected to its output.
The monitor's native resolution is 2048x1152, and it works quite well with this setup, except for the following:
The problem I am facing is that when I shut down the computer and turn it on again, the resolution falls back to 1920x1080, and 2048x1152 is no longer offered as an option.
If I unplug the HDMI cable and plug it back in, the monitor is re-detected and the resolution is automatically set to 2048x1152 again.
What I would like is to force Windows to set 2048x1152 after each boot, or to learn some other way to troubleshoot this situation.
Once I get the desired resolution, the monitor works perfectly for hours. Any ideas?
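For reference, the kind of workaround I've been considering is a small script that re-applies the mode at logon (e.g. via Task Scheduler). Below is a minimal Python sketch using ctypes and the Win32 ChangeDisplaySettingsW API, with my resolution hard-coded; it's only a guess at an approach, and it presumably only helps if Windows actually lists the 2048x1152 mode after boot, which right now it doesn't until I re-plug the cable.

```python
# Minimal sketch of a logon-script workaround. Assumption: the 2048x1152 mode
# is exposed by the driver at logon time (which is the part I'm not sure about).
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1
DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
CDS_UPDATEREGISTRY = 0x00000001  # persist the mode in the registry

class DEVMODEW(ctypes.Structure):
    # Display-mode variant of the DEVMODEW structure from wingdi.h.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32

# Start from the current mode so only width/height get changed.
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))

dm.dmPelsWidth = 2048
dm.dmPelsHeight = 1152
dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT

# Returns DISP_CHANGE_SUCCESSFUL (0) on success; a non-zero value would mean
# the mode isn't available, which is exactly my after-boot symptom.
result = user32.ChangeDisplaySettingsW(ctypes.byref(dm), CDS_UPDATEREGISTRY)
print("ChangeDisplaySettingsW returned", result)
```

If there is a more standard way to force this mode after boot, or to make Windows re-detect the monitor without physically re-plugging the cable, I'd prefer that.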