Integrated graphics not working with GPU

I recently purchased an ASUS GTX 1050 and installed it. When plugged into the graphics card I get a blank screen, which is to be expected, as I have yet to install my drivers. To install them I thought, "I'll just plug into the onboard graphics on my motherboard." This, however, did not go as planned. The onboard graphics also result in a black screen. It should also be mentioned that this black screen is not only present in the operating system (Ubuntu 16.04); the BIOS screen is blank as well. I will admit that up to this point I have never used the onboard graphics, as my old graphics card needed no drivers installed, but I don't see why they would not work. While my PC is on, every fan and light indicates that it is working correctly, as does the motherboard speaker. I've already checked, and it is neither a cord nor a monitor problem, so I don't understand why I can't use the onboard graphics while I install the drivers.

In summary: my graphics card gets a black screen, which is normal, but so does my onboard graphics, which is not. Right now I just want to get the onboard graphics working so that I can install drivers, or so that my PC is at least functional.

Update: after removing the graphics card, the onboard graphics are working. I no longer have a problem; I will install the drivers and put the card back in.

Things I can think to try:

  • Check all the cables to make sure everything is connected.
  • Try updating all the drivers (install Windows with onboard graphics if need be).
  • Try different outputs on the GPU.
  • Make sure you're plugging the monitor into the GPU and not the mobo (sounds stupid, but it happens a lot).
  • Try the GPU in different PCIe slots, assuming you have them.
  • Maybe some other things, but that's off the top of my head.

BUT, all that being said, the sad truth is that this sounds exactly like what happened with my first GPU. I got a GPU a bit after building my PC and was using integrated graphics until then. I tried installing the 280X and got absolutely no output. With the GPU still installed, plugging into the mobo did nothing either. The fans were spinning and it seemed to be working, but not until the GPU was removed could I get any video output. In the end I came to the conclusion that the card was just DOA, so I returned it for a 760. Plugged that in and it worked immediately. So what I'm saying is, my guess is your card is dead. Try all the things I and others suggested (there's a quick sanity check sketched below), but realize that more likely than not you're going to have to return it.
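One quick sanity check once you can boot on the integrated graphics: see whether the card even enumerates on the PCI bus. A card that shows up there is at least electrically alive, even with no driver installed; one that still doesn't after reseating is a strong DOA candidate. A minimal sketch for Linux (the OP is on Ubuntu 16.04), assuming lspci is available:

    import subprocess

    # lspci lists every PCI device; display controllers show up as
    # "VGA compatible controller" or "3D controller" entries.
    out = subprocess.run(["lspci"], capture_output=True, text=True,
                         check=True).stdout

    gpus = [line for line in out.splitlines()
            if "VGA" in line or "3D controller" in line]
    for gpu in gpus:
        print(gpu)

    if not any("NVIDIA" in g for g in gpus):
        print("Discrete NVIDIA card not visible on the PCI bus -- reseat "
              "it, try another slot, or suspect a dead card.")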

That's not entirely accurate; you can if you want.

It doesn't plug-and-play like that, but by changing some settings you can use your integrated graphics as a passthrough for your discrete GPU and keep your display cable plugged into the mobo.

You lose a bit of performance, but it's only like 2-5%.

Some people prefer the simplicity of it, and you can seamlessly set your profile to switch between the integrated graphics for regular use and the discrete graphics for gaming and specific apps, just like a gaming laptop does.
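On Linux, this kind of hybrid setup is exposed as PRIME render offload: the desktop runs on the iGPU and individual programs are pushed to the discrete card. One rough way to see which GPU actually renders is to compare OpenGL renderer strings with and without Mesa's DRI_PRIME variable. A sketch, assuming glxinfo (from mesa-utils) is installed; with NVIDIA's proprietary driver the mechanism differs, so treat DRI_PRIME as a Mesa-specific assumption:

    import os
    import subprocess

    def renderer(env=None):
        # glxinfo reports the renderer string of whichever GPU ends up
        # servicing the OpenGL context.
        out = subprocess.run(["glxinfo"], capture_output=True, text=True,
                             env=env, check=True).stdout
        for line in out.splitlines():
            if "OpenGL renderer string" in line:
                return line.split(":", 1)[1].strip()
        return "unknown"

    # Default context: normally the GPU your display cable is plugged into.
    print("default :", renderer())

    # DRI_PRIME=1 asks Mesa to offload rendering to the secondary GPU.
    print("offload :", renderer(dict(os.environ, DRI_PRIME="1")))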

When a monitor isn’t working even though it’s connected to the motherboard, it can be a distressing and frustrating experience. For those without much technical knowledge on the subject, it can get confusing very fast as to why things aren’t working the way they should.

Connecting a dedicated GPU to your computer will usually disable the integrated graphics card automatically, which is why a monitor connected to the motherboard’s video output isn’t working. You have to re-enable the iGPU via the BIOS settings. The problem could also be caused by faulty ports on the motherboard or monitor, faulty cables, outdated drivers, or other hardware issues.

If you’re interested in learning more about why and how to fix your connectivity issues, you’ll want to stay tuned to the rest of this article.

Why Does My Monitor Not Work When Connected to The Motherboard?

There are many different causes for a monitor not working when connected to the motherboard. From the start, there are a few things to check before assuming the monitor itself is defective.

You’ll find information stating that the problem is caused by connecting multiple monitors to two different sources, the integrated and dedicated GPU. This used to be the case when the iGPU was installed on the motherboard, which it isn’t anymore.

These days, the iGPU is found on the CPU, and it allows you to connect your primary monitor to your dedicated graphics card and your secondary monitor to the integrated graphics card without issue.

I’m not completely sure this holds for every brand, but it does for the majority of the components out there.
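If you want to verify this on a Linux machine, the display server can tell you how many GPUs it sees. A small sketch, assuming an X session with xrandr available:

    import subprocess

    # Each "Provider N:" line is one GPU the X server can drive outputs
    # from; with both the iGPU and the dedicated card active, expect two.
    out = subprocess.run(["xrandr", "--listproviders"],
                         capture_output=True, text=True, check=True).stdout
    print(out)
    count = sum(1 for l in out.splitlines() if l.startswith("Provider "))
    print(count, "GPU provider(s) visible to the display server")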

1. The Integrated Graphic Card Has Been Disabled

Older motherboards and newer CPUs include integrated graphics that can be used to display video, just like a dedicated graphics card. Most integrated GPUs have nowhere near the resources of a mid-to-top-tier dedicated GPU but can still be used successfully for less resource-demanding work.

However, when a dedicated GPU is connected in the motherboard’s slot, the integrated graphics will typically be disabled automatically. They must then be re-enabled in the BIOS in order to function. Scroll further down to find out how.

2. Faulty Ports And Cables

It’s always best to consider the most common causes of issues before looking into less common causes. The most likely cause of the connectivity issue between monitors and motherboards is that you are using the wrong cable to connect your monitor to the motherboard.

If you have an older motherboard that has a VGA port and you’re trying to use an HDMI cable, that’s your problem. You should be using a VGA cable to connect your monitor to it. 

If you have a cable that appears to fit the port, but the computer isn’t displaying anything when connected, double-check that the connector and the port are actually the same standard.

DVI connectors in particular come in several variants (DVI-D, DVI-I) that look nearly identical but are not interchangeable, and the D-shaped DVI shell can be mistaken for a VGA port at first glance.

3. Hardware Issues

If this is not the case, or if all of these connections are correct and still aren’t working properly, there may be an issue with hardware compatibility between your monitor and motherboard. 

Motherboards vary in how well they work with different monitors; compatibility problems are rarer than cable issues, but an older board’s video output may not support a given monitor’s resolution or refresh rate.

Try doing some research on your specific model of motherboard (both its brand and model number) and see what comes up. 

Another issue that could affect connectivity is that the motherboard (or CPU) does not have an integrated graphics chip, or that the chip is defective. It’s also possible that the monitor cable itself is defective.

You can try disabling the integrated graphics card in your BIOS and seeing if it detects your other graphics card. If not, it may be that your motherboard has been damaged somehow.

The easiest way to test this would be to connect a different monitor to your motherboard using a different cable.

If you still get no display, then it’s probably an issue with the motherboard itself. If another monitor works, then it’s most likely something wrong with your previous one; either the cable itself or something inside the monitor (an internal power supply or even just a blown capacitor).

4. Your CPU Doesn’t Have Onboard Graphics

Not all CPUs come with onboard graphics. The iGPU used to be located on the motherboard, but more recently it has moved onto the CPU, and not every CPU model includes one. In order to get video out of the motherboard’s ports, you must make sure your components have the required features.

There are a couple of things you can try in order to get a monitor connected successfully to the motherboard. Based on the following criteria, you’ll hopefully have a better understanding of why your monitor isn’t working even though it’s connected, and how to resolve the issue.

1. Make Sure the Monitor Is Powered

First, this may sound redundant, but make sure you’re actually powering the monitor by checking that it’s plugged in and turned on.

If you have an older model that uses an external power supply, check to make sure that the power cord is connected to the power supply and plugged into a working outlet (a lamp or other device can help you test an outlet). 

If your monitor has a built-in power supply (most monitors do), check to make sure that the power cord is connected to the back of the display and plugged into a working outlet.

2. Enable Integrated Graphics In BIOS

If you’ve got a dedicated graphics card connected to the motherboard, the integrated card has most likely been disabled automatically. Re-enabling it should allow you to connect your secondary or primary monitor to the motherboard once again.

This can be done within the computer’s BIOS, and will look a little bit different depending on the make of the motherboard.

Before we look at the steps you need to take once you’re inside the BIOS, let’s find out which key you should press to access the BIOS during bootup. Sometimes these keys will take you to the boot menu instead, but from there you should be able to navigate to the BIOS.

  • Asus: F2 or Del
  • Acer: F2 or Del
  • Dell: F2 or F12
  • HP: F10
  • Lenovo: F2
  • MSI: Del or F2
  • Gigabyte: Del

Try any of these if your motherboard wasn’t mentioned or if the key didn’t work:

  • F1 / F2 / F3 / F10 / F11 / F12 / Esc / Delete

Now to the actual process of enabling the integrated graphics card, which also looks a bit different depending on the make of the motherboard.

  1. Restart your computer.
  2. Press the designated BIOS key repeatedly when the computer starts up and the motherboard’s logo is shown.

The next steps vary by manufacturer:

  • ASUS / MSI: BIOS -> Advanced -> System Agent (SA) Configuration -> Graphics Configuration -> iGPU Multi-Monitor: Enabled
  • Gigabyte: BIOS -> Chipset -> Internal Graphics -> Enable (Auto will disable the onboard graphics when an external graphics card is connected, so Enable is the option we’re looking for).
  • Asrock: BIOS -> Advanced Menu -> Chipset Configuration -> IGPU Multi Monitor -> Enable.
  • Lenovo: BIOS -> Devices -> Video Setup -> Select Active Video: IGD -> Multi-Monitor Support: Enabled.

When finished, press F10 to save and exit.

3. Install Intel Graphics Driver & PCIe Video Card Drivers for Nvidia / AMD

If your monitor is connected to the integrated graphics card’s output but not displaying video, it could be because you haven’t installed the integrated graphics card’s drivers. Check to make sure the device is installed, otherwise download the drivers and install them.

  1. Open Device Manager.
  2. Check under Display Adapters to see whether an integrated graphics adapter (Intel, AMD, etc.) appears alongside your dedicated graphics card.
  3. If so, the driver is installed. You can always look for the most recent version on the manufacturer’s website.

If you didn’t find the device: go to the manufacturer’s website, search for the driver, and install it. The computer may prompt you to restart the PC. Once it’s back up, the device should be visible in Device Manager.
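If you’d rather check from a terminal than click through Device Manager, the same list of adapters can be pulled programmatically. A hedged sketch for Windows, shelling out to the built-in wmic tool (deprecated on recent builds but still present on most installs):

    import subprocess

    # Win32_VideoController enumerates every display adapter Windows knows
    # about -- integrated and dedicated alike -- with its driver version.
    out = subprocess.run(
        ["wmic", "path", "Win32_VideoController",
         "get", "Name,DriverVersion,Status"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(out)

An adapter that is missing here, or that reports an error status, is the one whose driver still needs to be installed.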

4. Connect the Monitor to Another PC

Next, try connecting your monitor to another PC or laptop—if possible, see if you can borrow one from a friend or family member to troubleshoot it. If your monitor works when connected this way, then your original computer may be malfunctioning.

You can also try swapping out the video cables when testing your monitor with another PC or laptop. If this doesn’t work and your monitor has multiple input ports, try each cable and port separately.

5. Check Graphics Card Connections

You also might not have the graphics card’s monitor connector situated in the right place. Make sure the card is seated properly in the right slot on your motherboard, because if it isn’t, things won’t work. You’ll usually hear an audible click when the card locks into place, and maybe feel some resistance. Make sure all of its connectors are firmly pushed into their respective ports, and that the power supply is plugged into both the outlet and the back of the computer and turned on.

Note that on server boards, the onboard video port is often wired to an out-of-band management controller rather than to your graphics hardware; it’s designed for remotely managing servers and other headless devices, not for everyday display output.

If you’ve done all of this and still aren’t seeing a display, you may have a faulty graphics card or monitor, or they could be damaged.

6. Reset BIOS

If you’ve tried some of the other methods, you might also want to consider resetting the BIOS. This requires a little more technical knowledge, but isn’t too difficult if you follow these instructions:

Turn off your computer, unplug it, and then use a screwdriver to open the case. This will give you access to the motherboard. Look for the small coin-cell (CMOS) battery and remove it for about a minute before putting it back.

After that, put everything back together and turn on your computer. You will find that the BIOS has been reset to its defaults, which may well fix the problem.

7. Change Boot Device

The second way to enter the BIOS doesn’t require any tools. As you’re starting up your computer, you’ll see the manufacturer’s logo displayed on the screen.

At this point, you’ll want to press the “Del” or “F2” key (depending on the manufacturer) repeatedly to enter the BIOS Setup Utility.

Next, go to Boot > Boot Device Priority and make sure that Removable Device is ordered as the first boot device; if not, change it accordingly. Now press F10 to save the changes and restart your computer.

A monitor that isn’t working properly can feel frustrating, but trying out these solutions should resolve the issue more often than not.

Can integrated graphics run through GPU?

What is an integrated graphics card? An integrated graphics card is built directly into the computer's processor and shares resources, such as system memory, with the CPU rather than having dedicated hardware of its own.

Why is my integrated GPU not working?

This means the integrated graphics on your system may be disabled. Try rebooting the PC into the BIOS and enabling integrated graphics. We recommend you contact your OEM or motherboard manufacturer for more information on how to access your BIOS settings.

Does a GPU disable integrated graphics?

You don't need to disable your integrated graphics card; it will be disabled automatically if you have a dedicated graphics card.