Computer question - Fixed :D

Mekado

New member
Mar 20, 2009
1,282
0
0
Thandran said:
Nope.

That one is disconnected and the HDMI cable is connected. The DVI cable leads to the monitor (it's the one beneath the green jack, right?) :)


The blue circle is the DVI port on your 560 Ti, where your monitor should be connected, yes, unless the card also has HDMI (I can't see because of the cables hehe)
 

Thandran

New member
Feb 19, 2011
183
0
0
That means I have to disconnect the HDMI from the monitor and plug in the DVI? :)
 

Mekado

New member
Mar 20, 2009
1,282
0
0
Thandran said:
That means I have to disconnect the HDMI from the monitor and plug in the DVI? :)
Unless your 560 Ti also has an HDMI port, yes. The little quality you'll lose switching from HDMI to DVI will be almost unnoticeable, and the performance increase gained by using your "good" card will more than make up for it.

At the moment you're driving an F1 car with a Nissan Sentra engine, with an F1 engine in the trunk, unused ;)

Note: It might not be "hot-swappable", meaning you might not be able to do this while your computer is turned on (switching video outputs is rough for a computer); a reboot should do it. If it does not, you'll have to go into the BIOS and disable your integrated card to force Windows to output on the add-in card.
 

Thandran

New member
Feb 19, 2011
183
0
0
Problem:

I disconnected the HDMI cable and tried connecting the white DVI cable to the monitor, where I discovered that it also has an HDMI-type connector on the other end, which I used to connect it to the monitor (is that an OK DVI cable?).

I did this while my computer was shut off. Then I booted it up and the screen was black. Like it wasn't connected. Then I saw that I could connect both the HDMI cable from the integrated card and the DVI cable from the graphic card into the monitor.

I did that and now I can at least see. :D

Will that help, or is further assistance required? Thanks. :)
 

BeerTent

Resident Furry Pimp
May 8, 2011
1,167
0
0
Thandran said:
[...]- Nvidia GeForce GTX 560 Ti[...]
I'm taking a look at this. From what I understand, you should be at the point where some simple observation and problem solving should resolve your problem. You're going to need to buy something, from the looks of it. According to a bit of research, you've got two DVI connectors and a mini-HDMI connector. You've got two options...

Buy a mini-to-standard HDMI adapter and connect it to your video card. This may or may not send audio along with video.

Switch to DVI. It appears that you already have an audio cable running from your mainboard. You should have a port like THIS [http://www.computertroublesolver.com/image-files/dvi_connector.jpg] on your monitor; unplug and remove the HDMI from the setup, and plug the other end of your DVI cable into that.

Either way, you'll be transmitting digital video at the same speed.

Make these changes to your system while the PC is off! Update your video card drivers (For your Nvidia card) afterward!

I also misread your post.

Check to see if there's an option to change inputs on your monitor. Change it to DVI and disconnect the HDMI cable before going into the BIOS to disable the integrated card. You might not need to disable the card!
 

Mekado

New member
Mar 20, 2009
1,282
0
0
Thandran said:
Problem:

I disconnected the HDMI cable and tried connecting the white DVI cable to the monitor, where I discovered that it also has an HDMI-type connector on the other end, which I used to connect it to the monitor (is that an OK DVI cable?).

I did this while my computer was shut off. Then I booted it up and the screen was black. Like it wasn't connected. Then I saw that I could connect both the HDMI cable from the integrated card and the DVI cable from the graphic card into the monitor.

I did that and now I can at least see. :D

Will that help, or is further assistance required? Thanks. :)
Yeah, that means Windows isn't outputting to your 560 Ti; it's outputting on the integrated card, so at the moment it's exactly the same as it was.

What you would need to do is go into the computer's BIOS and disable integrated graphics, which would force the computer to output on the other card (your GeForce). It's a little more complicated, though. Have you ever gone into the BIOS? (usually F8 at bootup, or something like it)
 

Owyn_Merrilin

New member
May 22, 2010
7,368
0
0
Thandran said:
Problem:

I disconnected the HDMI cable and tried connecting the white DVI cable to the monitor, where I discovered that it also has an HDMI-type connector on the other end, which I used to connect it to the monitor (is that an OK DVI cable?).

I did this while my computer was shut off. Then I booted it up and the screen was black. Like it wasn't connected. Then I saw that I could connect both the HDMI cable from the integrated card and the DVI cable from the graphic card into the monitor.

I did that and now I can at least see. :D

Will that help, or is further assistance required? Thanks. :)

You need an adapter to go from DVI to HDMI. Contrary to what Mekado implied, DVI and HDMI are exactly the same when it comes to basic video output, so you won't even lose any video quality, unlike what he said. The difference between them is that DVI doesn't do audio, it doesn't do HDCP copy protection (which is used by Blu-ray), and there is also a mechanical difference in the shape of the plug. All you need to plug a DVI cable into an HDMI port, or vice versa, is a little adapter that sends the pins of one to the appropriate pins of the other.

Edit: Completely misread that. What you want is to plug the DVI cable into the DVI port on the monitor. There should be a setting on the monitor itself to use that input. You should be able to have both plugged in at once, and select between them with the monitor's settings, but I'm not sure why you would want to. The DVI cable from your GTX 560 is all you need.
 

Thandran

New member
Feb 19, 2011
183
0
0
And I've gotten:

"You are not currently using a display attached to an Nvidia GPU."

Sigh. I need to get some sleep first, and then I'll continue following your advice. I've never tinkered in the BIOS and I'm a bit scared to do so. I don't want to cause damage that isn't needed. :(

I'd still like to thank you from the bottom of my heart for your patience and wisdom.


BeerTent - I'll do what you advise after I get some sleep. :)
 

BeerTent

Resident Furry Pimp
May 8, 2011
1,167
0
0
Keep in mind, I edited that post like a bajillion times too. I think I'm done with it. ;)
 

Owyn_Merrilin

New member
May 22, 2010
7,368
0
0
Thandran said:
And I've gotten:

"You are not currently using a display attached to an Nvidia GPU."

Sigh. I need to get some sleep first, and then I'll continue following your advice. I've never tinkered in the BIOS and I'm a bit scared to do so. I don't want to cause damage that isn't needed. :(

I'd still like to thank you from the bottom of my heart for your patience and wisdom.


BeerTent - I'll do what you advise after I get some sleep. :)
Before you change anything in your BIOS, try going into your Nvidia settings and checking to see if there's anything in there that does what you want. I don't know Nvidia, but AMD's drivers have a spot where you can choose which of your installed display adapters to use; I'd imagine Nvidia has something similar. Also, like I said above, make sure the monitor itself is on the right input: if it's on the HDMI setting, you won't see what's coming from the DVI port, and vice versa.
 

Dryk

New member
Dec 4, 2011
980
0
0
Thandran said:
Sight. I need to get some sleep first and then I'll continue following your advice. I haven't ever tinkered in BIOS and I'm a bit scared to do so. I don't wish to cause damage that is not needed. :(
Just stay away from any option with numbers, don't disable important components, and it'll be fine.
 

Thandran

New member
Feb 19, 2011
183
0
0
Ok, I'm back. :)

I checked on the internet and discovered that I have a DVI to male-HDMI cable. Is that OK, or should I go and buy another?

I tried going to the Nvidia control panel to look for settings, but got a message that said:

'You are not currently using a display attached to an Nvidia GPU.'

I'll check the monitor setting then and update what I've found.

Thanks for the help so far guys. :)
 

Thandran

New member
Feb 19, 2011
183
0
0
Ok... I've found out that on my monitor settings I have two options available. It shows me that my monitor 'can act' as two monitors.

One is connected to the Intel HD 3000.
The other to the GeForce card.

I tried making the monitor with the GeForce card my main one, but the screen always goes black after I make it my main. Thus I'm forced back to the one with the Intel chip so that I can see.

A bit tiresome to say the least. Well at least I'm making a bit of progress... :)

I think I found a thread on the internet that would best describe my problem:

http://www.tomshardware.co.uk/forum/359671-15-solved-monitor-windows-displays

With a picture:

http://imgur.com/mvMuJ

:D
 

Thandran

New member
Feb 19, 2011
183
0
0
mindlesspuppet said:
Goto your Device Manager

You can do so by right-clicking My Computer, going to Properties and then the Hardware tab, or simply pressing Windows Key + Pause/Break, then going to the Hardware tab and clicking Device Manager.

In the "display adapters" tree in your Device Manager, does it show both the Intel and Nvidia card?
Yes it shows both cards:

- Intel(R) HD Graphics 3000
- NVIDIA Geforce GTX 560 TI

:)
 

EHKOS

Madness to my Methods
Feb 28, 2010
4,815
0
0
number2301 said:
Next thing to check would be the amount of RAM you've got; 4GB would be plenty. Windows Task Manager can tell you how much you've got.
I don't know about that. I have 8 gigs and Windows uses about 3.5 at idle, and I only run the bare necessities. Maybe my computer just sucks, though...
 

mindlesspuppet

New member
Jun 16, 2004
780
0
0
Thandran said:
Yes it shows both cards:

- Intel(R) HD Graphics 3000
- NVIDIA Geforce GTX 560 TI

:)
Yeah... sorry, I should have seen that that'd be the case from your previous post.

Have you tried simply reinstalling the Nvidia drivers? You probably have a disc for them laying around, if not you can go to Nvidia's website and get them. The install process should automatically set the Geforce as the default display driver, then you'd simply have to change the input source on the monitor itself.
 

direkiller

New member
Dec 4, 2008
1,655
0
0
Thandran said:
Hope I'll answer some of the questions:

- I have a desktop
- I have 8GB of RAM
- I don't know where I have the monitor plugged in... I'd need a picture that would show me where the motherboard ports are and where the dedicated video card ports are.

Thank you for your help in advance though. :D


Those bottom four slit-looking things (where it says "video card") are where expansion cards go (your computer may have more, may have less). One of those slots on your computer will have a monitor plug.

Odds are your plug will be different from the one in the picture, as that is an older cable type.

Is your monitor plugged into one of those areas, or up higher?

If it's up higher, you're plugged into onboard graphics; if it's down there, you're good.
 

Thandran

New member
Feb 19, 2011
183
0
0
mindlesspuppet:


I've recently downloaded the newest drivers. What would I have to do to completely reinstall them? And how do I change the input source?

direkiller:

At first my monitor was plugged into the onboard graphics, and I tried changing it to the dedicated graphics card. But it was like my monitor wasn't plugged into anything (the screen was black), and I was forced to go back to the previous setup.

Seems like I'll have to visit a computer repair shop.

Thanks for the help guys. :D