New Video Card

and whole new PC

So for those of you who don't know, I work at Stardock, on the game team.  On July 19th, I will have been here for 2 years.  So far, I've worked on GC2 Dark Avatar, GC2 Twilight of the Arnor, Political Machine 2008, and our unannounced fantasy TBS game.

Our team is pretty small, but I'm neither the newest game developer, nor one of the more senior ones.  I guess I'm somewhat of the middle child.

Anyway, while working on PM2008, I had complained that my video card sucked.  I don't know how much it sucked compared to other team members' cards; I just knew that I wanted to get a better one.

I mentioned it to the right person, and 3 weeks later, I was presented with a completely new work PC - not just a new video card.  Needless to say, I was quite surprised, and very thankful.

Here are the stats:
--------------------
-PROCESSOR: Intel Core 2 Quad (Q6600 @ 2.40 GHz - 4 CPUs)
-RAM: 3 GB
-OS: 32 bit Vista
-VIDEO CARD: NVidia GeForce 9600 GT

:CONGRAT:

Reply #1
Good system. The 9600 is a great card, and the Core 2 CPU rocks.

I'm running a very similar system, except I'm using 64 bit Vista and 6 gigs of RAM.

That new computer should serve you well :). It's a great setup.
Reply #2
Though as a game developer, that computer is lacking! Still, if the experience is better than the old one, good for you :D Then you can continue to make good games :P

Birger :)
Reply #3

Hey, my PC Card sucks too!

+1
Reply #4

Very nice setup. I just updated my video card to an NVIDIA GeForce 9600 GT and WOW, Call of Duty flies now. I can see why everyone was raving about the game now lol.

32-bit Vista
2 GB RAM
dual core 2.40 GHz

Runs very nice, thank you.

Reply #5
May I suggest the NVIDIA GeForce 9800 GTX? In Norway, this card has gotten really cheap.

Cards

Birger :)
Reply #6
Nah, for the resolutions you'll be working at, a 9600 is perfect.
Reply #7
Hey, my PC Card sucks too!


ROFL !!!

Careful, they might give you CrazyC0330's old card :LOL:

Regards
Zy
Reply #8
Curious. Why 32-bit Vista? That sort of caps your RAM at 3 gigs, so you'll probably have to upgrade again in less than a year.

And didn't I see a post a long while back by The Big Cheese himself complaining that Vista should have gone 64-bit only or something?

Curiouser and curiouser....
Reply #9
Depending on what you're using it for, I would personally rather have a dual core with a faster clock speed than a quad (most games don't support multicore - more and more are starting to, but that's a lot of unused power).

If you do a lot of regular multitasking though, it could be worth it.
Reply #10
Dual Core / clock speed

They have hit a wall on practical CPU speed due to heating issues, so they had no choice but to go down the multi-core route to try and keep up overall processing speed. It is far cheaper to provide a second core on silicon than to make the investment currently needed to overcome a virtually impossible conundrum: the physical limits on increasing single-core clock speed due to heating effects.

Multi-core is, though, a known dead end for consumer hardware. On a 2-5 year view, new technologies that will overcome the immovable laws of physics in current CPU designs will be on stream - at that point, watch multi-core numbers drop through the floor.

So if you are at the software end of the pipe, your Return on Investment equation is based on a 2-5 year return, because you know multi-core will go through the floor at the consumer level, but you don't yet know the details of the new CPU designs coming down the pipe. Hence the number of applications that do now, or are in the works to, take advantage of more than two cores - the user base will be too small on a 2-5 year view to make the Return on Investment worth it for most of them.

Now, in the middle of this design/ROI nightmare the hardware guys are struggling with, along come the 64 Bit hardware implications driven by the software end.... add to that a sprinkling on the desert from the software end struggling to get OEMs/Vendors to write applications, 32 bit drivers and compilers for Vista, let alone 64 Bit ones for an as yet unestablished market and unknown numbers of cpu cores as a "norm", and it's not surprising that the guys accountable for a market space of hundreds of billions of dollars are a little cautious.

As always, there are no simple solutions, never mind anyone's favourite bandwagon; if there were, they would be in place by now.... Many things are technically possible but fall over on closer scrutiny at a business ROI level. It's always the latter where favourite bandwagons fall over...

At the end of the day, I'm glad I am just the end user who uses this stuff, and not the one accountable for working out the route through the related business minefield to get it to me :LOL:

Regards
Zy
Reply #11
Does that mean that if I complain I get a better computer as well?? :LOL:
Reply #12
No kidding. My computer sucks, and come to think of it, so does my wife's :) I won't even go into the kids' computers, as they get the hand-me-downs from us.

Hell, I will even shed a few croc tears along with the complaining :SNIFF!:

Reply #13
That's pretty nice - not overkill, but nice. I'm saving for a new computer this summer. It's a lot cheaper if you build it yourself. So it's going to have a 9800 GTX, an Intel Q6600 (2.4 GHz), and other nice stuff. I know it might be a little overkill for the moment, but seeing as how I only get a new computer every 4 years, it's good.
Reply #14
I'm actually glad that Stardock doesn't buy their game developers the very top of the line.

Most of us users don't have that kind of rig, and I think sometimes development houses forget that.

Enjoy the new toy!
Reply #15

Nice. I was gonna get a Q6600 as well but settled on an E6850 instead because it's all you really need for today's games.

Eventually games will use more than 2 cores but when that time comes, it'd be time to upgrade the CPU anyways. X-(

Reply #16
Multi-core is, though, a known dead end for consumer hardware.


How so? The number of transistors that can be placed on a chip continues to increase, and multicore lets them keep increasing the amount of processing power per watt. I would disagree: I think we will see quite a bit of doubling in the next few years - 8, 16, 32, 64, even 128 cores seem possible before they start hitting the next physical barrier. Multicore is not going away from consumer computing, not by a long shot. Especially not as multimedia applications become more common, and I know there are plenty of changes coming in programming languages that will let developers take much better advantage of multicore CPUs.

add to that a sprinkling on the desert from the software end struggling to get OEMs/Vendors to write applications, 32 bit drivers and compilers for Vista, let alone 64 Bit ones for an as yet unestablished market and unknown numbers of cpu cores as a "norm"


Applications can be written as 32 bit if they do not need 64 bit support. Vista is perfectly backwards compatible with 32 bit applications, and I'm sure future OSes will be as well. It will be quite some time before OSes start dropping 32 bit support. I'm thinking there are many years left for 32 bit compatibility. It's hard enough convincing Microsoft that 64 bit should be the new default, even though 4 gigs is becoming more common.

Drivers for Vista are not much of a problem anymore. If the computer meets the minimum requirements, chances are there are drivers for it. That argument is much less valid today than it was at Vista's launch. Ditto for compilers. As far as I know, both Microsoft and GNU's compilers have support for compiling to a 64 bit target.

I'm not sure "struggling" is the term I'd use. Developers make most of their money creating new software and rewriting existing software. Sure, it's a lot of work, but hey, hard work is what drives our economy. Writing software is what pays our bills. It is good that we "struggle" to write new software - it keeps food on our tables.
Reply #17
Mine sucks...

*cough* 512 ram nvidia 5200 *cough*
Reply #18
I gotta go with CobraAI on this one. For a typical commercial desktop application, where devs would normally struggle with the issues of 32 vs. 64-bit, it's generally a non-issue. Think of QuickBooks, Goldmine, and other such event-driven applications here. These apps don't do a whole lot that's terribly CPU-intensive.

And the tools abstract the difficult tasks these days. Visual Basic, Delphi, C++ Builder, C#, etc. The devs usually don't need to worry about the low-level stuff. It's a rare situation that even an application written in Java will have CPU problems if it is just a desktop app. Generally slowness is going to stem from a design problem or disk I/O.

For those apps, there's no reason to worry about faster and faster processors, because on a multi-core system, they will get a full processor's 100% attention most of the time. And with so many libraries already out there for making apps multi-threaded, even these simplest of applications will be able to benefit.

For more low-level applications where CPU really is a problem, writing multi-threaded code has been a good idea (when applicable, of course) for a terribly long time. Seti@home uses multi-core systems very effectively, and has for at least a year or two. Not all apps will be able to make much use of a 256-core system, I'll wager, but many will take advantage of the dual and quad core systems we have today.

And the tools, man! Check out things like Intel's Threading Building Blocks for C++. It makes it dramatically easier to build a multi-threaded application than most old-school devs are used to. And because threaded apps don't need a single core per thread, as long as you can expect a dual-core system, your app will almost certainly perform a great deal better if you use threads everywhere it makes sense.

So when we get past the speed barrier once again, why would we stop with multi-core systems? Even for apps that don't need multiple cores, they allow Windows (or OS X or Linux or BSD or ...) to run more effectively just by being able to dedicate one core to the app in question.

Unless it somehow becomes significantly cheaper to have a single-core system, I just don't see any reason they'd go away. And unless the development tools turn out to be crap (and from my (admittedly limited) experience of TBB, I just don't think that'll happen), there's no reason for devs to avoid multi-threading their apps when the average household computer is at 2 cores today or will be soon.
Reply #19

I used to belong to the crowd of "the developers should have lower end machines so that they will make the game perform better on crappy machines." 

That's all well and good, but we spend a lot more time actually working on the game as opposed to testing, and particularly during crunch time, you don't want your machine to be chugging.  We recently got Incredibuild, which distributes the job of compiling over multiple computers, reducing full rebuild time for Twilight of the Arnor to about 5 minutes.  And that's still too long when you're having to tweak something, load up the game, and try again.  Granted, not everything is going to require a full rebuild and if the change isn't in a header file the rebuild time generally isn't that bad. 

Before anyone asks, we tried using pre-compiled headers at one point, but we change stuff in header files way too frequently for that to be effective, adding member variables, additional function parameters, etc. and if we use the interface method, that makes it harder to debug.

Therefore, most of the developers have a primary machine and a test box, and the test box is always much crappier than the primary system.  Although, for testing multiplayer in TPM2008, I could have really used a test box that could compile faster.   

 

Reply #20
512 ram nvidia 5200


Wow, and I thought I was a bit behind using a 6800 when everybody else was using 8 series and 9 series cards. By the way, the 9600 is a great card for a budget price. It's the card that finally convinced me to upgrade.

And the tools abstract the difficult tasks these days. Visual Basic, Delphi, C++ Builder, C#, etc. The devs usually don't need to worry about the low-level stuff. It's a rare situation that even an application written in Java will have CPU problems if it is just a desktop app. Generally slowness is going to stem from a design problem or disk I/O.


This is very true - the number of bits is largely abstracted away, and it's just a matter of flipping a switch in the compiler: The compiler will happily compile to both targets.

And speaking of compilers - today's languages often aren't even truly compiled to native code anymore. C# (and C++ built as C++/CLI with Microsoft's .NET tools) is converted to bytecode for the CLR rather than compiled straight to machine code, and that bytecode will happily translate itself into either architecture.

And with so many libraries already out there for making apps multi-threaded, even these simplest of applications will be able to benefit.


I'm gonna have to say be careful about multi-threaded libraries: Just because some code in a library is multi-threaded doesn't mean your own code is multi-threaded. Your own code can still be single threaded and can bottleneck a single CPU if you're not careful. The future will need to have a lot more support of threading in the language itself, not just the libraries. If you want to see where threaded programming languages may be headed in the future, take a look at Erlang, Squeak, and F#.

That's all well and good, but we spend a lot more time actually working on the game as opposed to testing, and particularly during crunch time, you don't want your machine to be chugging.


Yeah - the test machines should be separate from the development machines IMHO. The development machines do a lot of compiling and debugging, so you really want them to be good machines.
Reply #21
I'm gonna have to say be careful about multi-threaded libraries: Just because some code in a library is multi-threaded doesn't mean your own code is multi-threaded. Your own code can still be single threaded and can bottleneck a single CPU if you're not careful. The future will need to have a lot more support of threading in the language itself, not just the libraries.


I'm not talking about using somebody else's library as a part of your application, I'm talking about libraries specifically for doing multi-threading, such as Thread Building Blocks. That thing is all about making your code multi-threaded in the really obvious cases, such as having a vector (thread-safe version) with a bunch of values that all need to be processed in some way, but each can be processed independently. You do a call to parallel_for and the library turns a (mostly) normal "for" loop into a multi-threaded loop, processing chunks of the loop in independent threads.

Obviously this requires you to know what you're doing, and is a pretty simplistic example (I don't know enough about threads or TBB to really get into the details), but my point is that libraries like this are going to make programmers start thinking in terms of threads a lot more than we do today. This will mean multi-core systems will become more useful instead of less useful.
Reply #22
One thing I'm a bit worried about is developers just plunging into threads without knowing what they're doing. You don't actually want to split just any loop across threads: the overhead will kill it if you're not doing much in the loop. There is a lot of overhead associated with creating and maintaining threads, so they are best used where a significant amount of computation is being done.

Another thing I'm a bit worried about is that developers will get too bent out of shape with concurrency primitives and the shared-memory model. Erlang is a popular language for concurrent programming, but it uses a message-based model instead of the popular (but difficult to use) shared-memory model. I'm a bit worried that too many developers will be stuck on locks, semaphores, and other low-level primitives rather than handling threads at a high level. Hopefully, future programming languages will make concurrent programming easier, perhaps even invisible, to the developer.
Reply #23

I am thinking of buying the new ATI 4850. It is cheaper than the NVIDIA 9800 GTX but still packs the same punch.

Reply #24

My computer :

Intel E4500 2.2 GHz OC'd at 2.97 GHz, 2 GB Corsair XMS2 PC6400 800 MHz OC'd at 900 MHz, Gigabyte P35-DS3L, and Sapphire ATI 2600XT 512 MB. ;)

Reply #25
Getting my new comp in a month or two, so I can't wait!