Second monitor not recognized after boot on new laptop
I just bought a new Asus laptop, model N750JK. I've tried connecting it to my external monitor via the HDMI port (with an HDMI-to-DVI adapter). The second monitor shows an image from the moment I start the PC until the Windows desktop is about to appear (when the Windows logo disappears). After that I can't get it to work no matter what; I've tried the Detect button in the display properties, without success.
Then I downloaded the latest Intel HD 4600 drivers and it worked: the image was showing on the laptop AND the external monitor. But when I restarted the laptop, the situation was the same as before; the second monitor wasn't recognized once Windows was running.
I'm using Windows 8.1, and I've also tried Windows 7.
Model: Asus N series N750JK
Laptop graphics: Intel HD 4600 (this one is primary; can't be changed) and a GeForce GTX 850M. I also tried an HDMI-to-HDMI cable with another monitor, and the behaviour was the same as described above.
Any help will be much appreciated.
EDIT: I would like to hear first-hand from anybody with the same laptop (or at least the same integrated graphics card) whether they experience the same problem.
EDIT 2: I've tried a Linux distribution (Fedora), and it WORKS like a charm right from startup!
So what could be wrong in Windows? It seems this is not a hardware issue.
From your description it sounds like a software/driver issue, since the display receives a signal right up until Windows loads the display driver.
I would suggest the following:
- Try Win + P and set it to extend the desktop, to see if that has any effect; Windows may simply not be sending any output over HDMI (a command-line equivalent is sketched after this list).
- Try reverting to the default driver: uninstall the HD 4600 driver, remove the GPU device from Device Manager, and reboot. Test whether anything has changed, then install the latest driver.
- Try disabling the Intel HD 4600 in the BIOS, if possible, to see whether that helps.
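For the first bullet, there's a scriptable equivalent of Win + P: `DisplaySwitch.exe`, which ships with Windows 7 and later. A minimal sketch:

```
rem Force Windows into "Extend" projection mode, the same as
rem pressing Win + P and choosing Extend.
%windir%\System32\DisplaySwitch.exe /extend
```

If the external monitor lights up after running this, the hardware path is fine and Windows simply wasn't sending output to it.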
I'll describe two solutions here.
First, I'll propose an idea that may give you some instant relief. I'm proposing it because you said that disabling the device in Device Manager and then re-enabling it made things work better for you.
Check out NirSoft's DevManView (Device Manager View); it should be able to automate the process I'm about to describe.
Another alternative is Microsoft's DevCon; at least in theory, it should be able to accomplish the same thing, so I'm offering it as a possible resource. Here are a few hyperlinks: Info: MS Q311272. Download: part of the DDK: DDK. Possible direct download: devcon.exe. More info: TechNet. More info: Rob.
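If you go the DevCon route, the invocation would look roughly like this. The hardware ID below is only an example pattern (VEN_8086 is Intel's PCI vendor ID; DEV_0416 is just a guess for a mobile HD 4600), so substitute whatever `devcon find` actually reports:

```
rem List Intel PCI devices so you can find the GPU's hardware ID.
devcon find "PCI\VEN_8086*"

rem Disable, pause, and re-enable the GPU by hardware ID.
devcon disable "PCI\VEN_8086&DEV_0416*"
timeout /t 5 /nobreak >nul
devcon enable "PCI\VEN_8086&DEV_0416*"
```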
DevManView can disable and re-enable a device from the command line. Try disabling the device, waiting a sufficient length of time, and then re-enabling it. (The right length of wait is effectively voodoo magic: try a second or two; if that fails, try a minute or two, and fine-tune as needed. The point of the wait is simply to let Windows fully register that the device is disabled and respond to the change in whatever way it thinks it needs to. Possibly no wait is required at all.)
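As a concrete sketch of that sequence, assuming the device's display name is "Intel(R) HD Graphics 4600" (check the exact name in Device Manager or in DevManView itself, since it varies):

```
rem Disable the GPU by its Device Manager display name.
DevManView.exe /disable "Intel(R) HD Graphics 4600"

rem Give Windows a few seconds to notice the change; tune as needed.
timeout /t 5 /nobreak >nul

rem Re-enable it.
DevManView.exe /enable "Intel(R) HD Graphics 4600"
```

DevManView also documents a /disable_enable option that combines both steps in one call, though separate calls let you control the delay.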
If this works manually, see if it also works from a batch file. (Windows Script Host could be used to create an artificial delay between commands, if needed, though the built-in timeout command will often do.) Once it works from a batch file, you can give it an icon on the desktop, which is surely more convenient than opening the graphical Device Manager, waiting for it to load fully, and then interacting with it by hand. Even better, you could run that batch file every time the system starts up (or every time a user logs in, whatever is needed), so the workaround becomes automatic; a sketch of that follows below.
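One way to make it automatic is to register the batch file as a scheduled task that runs at logon. The task name and path here are just placeholders; note that the task must run elevated, since disabling devices requires administrator rights:

```
rem Run the workaround script at every logon, elevated.
schtasks /create /tn "ReenableHD4600" ^
    /tr "C:\Scripts\fix-monitor.bat" ^
    /sc onlogon /rl highest
```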
Note: since you said this is a laptop, before you celebrate success, try suspending and hibernating. Sleep and hibernation can be a bit fragile with these sorts of things, so check whether the video still works after the system wakes up. If so, great; if not, you may have some further work to do. Better to know that now rather than later.
Now, in the back of your mind, you'll be thinking: this workaround seems to work great, but it's rather unfortunate that it's even needed...
Right. So let's look at that briefly. The problem is clearly caused by Windows, as evidenced by the screen working in the BIOS and even during the early part of Windows startup. Note that when I blame "Windows", I'm using the term quite generically, to refer to the entire experience: the issue might be in the Windows display driver model, or in the drivers written by the vendor of the graphics chip (Intel).
In the long term, the nicest solution (the second one I referred to earlier) is to replace whichever part is broken. In practice, I've found that takes a bit of experimentation. The Windows display driver model can only really be replaced by replacing Windows itself (upgrading from XP to Vista, for instance, changed that technology substantially), which is often impractical or undesirable. The other option is to replace the drivers with another version that works better. Unfortunately, finding those drivers (by testing different versions) can be an unpleasant pain; worse, they may not exist yet. But if you keep checking for updates, you may find them someday, and then you'll have the actual best solution: software where things simply work as they ought to. In the meantime, you may just need to live with the workaround for as long as you keep the same laptop, operating system, and drivers.