The reason I pose this question is that my previous computer recently decided it was done providing me any enjoyment in life (still not sure if it was suicide or murder; it may have been the poisoned mushrooms).
I now have an off-the-shelf Gateway with a second-generation Core i5-2320 CPU @ 3.00GHz, a 500GB hard drive, 16.0GB of DDR3 RAM (it came with 4.0GB), and integrated Intel HD Graphics 2000. Trying to be as intelligent as I could be about setting up the new computer, I was advised to first hook up the keyboard, mouse, and monitor to be sure it operated correctly out of the box. This it did very nicely.

I had a spare video card, but the graphics I was seeing on the monitor were pretty good, in fact as good as I remembered the previous computer being with its GeForce GT 100 1GB DDR2 video card. There are HDMI and VGA connections off the motherboard. I tried the HDMI on the 46in LED TV and it didn't like it much, so I connected the VGA output to it and sent off for an HDMI-to-DVI adapter, as my 24in LED monitor has only a DVI input. I now have two monitors working in clone mode and am very pleased.
Here is a screenshot of the Intel Graphics and Media Control Panel.

As you can see, it gives you a lot of options and functions for setting up your system. I can run a single monitor, or dual monitors in either clone or extended mode.
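If you're ever curious which mode Windows itself thinks it's in, here is a little Python sketch, purely my own illustration and not part of the Intel control panel, that uses the standard Windows GetSystemMetrics call (via ctypes, so Windows only) to report the monitor count and the size of the combined desktop:

```python
import ctypes

# Standard Windows system-metric indexes (see GetSystemMetrics docs)
SM_CMONITORS = 80        # number of display monitors on the desktop
SM_CXVIRTUALSCREEN = 78  # width of the combined (virtual) desktop, in pixels
SM_CYVIRTUALSCREEN = 79  # height of the combined (virtual) desktop, in pixels

user32 = ctypes.windll.user32
monitors = user32.GetSystemMetrics(SM_CMONITORS)
width = user32.GetSystemMetrics(SM_CXVIRTUALSCREEN)
height = user32.GetSystemMetrics(SM_CYVIRTUALSCREEN)

print(f"Monitors on the desktop: {monitors}")
print(f"Virtual desktop size:    {width} x {height}")

# In clone mode Windows typically reports a single screen-sized desktop,
# while extended mode reports two monitors and a wider virtual desktop.
```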
Now, I am in no way an expert when it comes to computers; I'm just one of the many users out there who uses a computer for email, searching the internet, desktop customizing, and the occasional game of solitaire. I don't want folks to interpret what I am saying and showing as meaning there is no need to install a separate video/graphics card. I'm sure that many of you who are serious gamers, or who use your computers to earn income doing 3D rendering and such, couldn't do that kind of work with integrated graphics.
So what do you think, or what have you heard or read, about the newer versions of integrated graphics?