dystopic

computer technology chat

this thread originally started with the title "computer upgrade advice" because i was having computer problems, but more recently i've been using it simply to discuss computers in general -- i thought a new title and opening post was in order.

so feel free to post anything you'd like about computers - requests for advice, newly released technology, whatever. i love learning about this stuff, and often one of the best ways to learn is to listen to other people's interests, questions and concerns.

to be clear, i've also been participating in other forums... but honestly, the folks here on the GC2 forums are so much more friendly than the average forum group.

thanks, cheers, and all that other good stuff :)
Reply #301
So, a system would need to be 3600 times faster. Apply Moore's law to that!


okay. i started college as an aerospace engineering major, and decided to change because i didn't want to deal with the math. now i get a strange enjoyment out of working silly math problems in my head.

man, i'm a dork
:D

i actually mis-remembered Moore's law. the number of transistors that can be squeezed into the same die doubles approximately every 2 years, not every 1.5.

i think this would be the formula, with x as the number of doublings needed:

3600 = 2^x

and now a logarithm to solve for x:

x = log2(3600) ≈ 11.81

so, let's call it an even 12 doublings to iron things out, and at one doubling every 2 years that works out to roughly 24 years. of course, Moore's law isn't a law of physics; it's a law of economics, and those laws are subject to change. i think there's more to consider, too. i suspect a lot of scenes get some post-rendering airbrush work to help smooth out the remaining edges in ways a computer can't (yet).
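
if you want to double-check that arithmetic, a throwaway python snippet does the trick (assuming, of course, that the clean doubling-every-2-years rate holds, which it won't forever):

# back-of-the-envelope: how long until computers are 3600x faster,
# assuming one doubling every 2 years (a big simplification)
import math

target = 3600                     # required speedup
doublings = math.log2(target)     # ~11.81 doublings
years = math.ceil(doublings) * 2  # 2 years per doubling

print(f"{doublings:.2f} doublings -> about {years} years")  # ~24 years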

i'm not seriously making a prediction that we'll see movie-quality CGI on home computers in 24 years. i think, like that magazine rls669 linked, this just shows how fast computers develop.
Reply #302

i'm not seriously making a prediction that we'll see movie-quality CGI on home computers in 24 years. i think, like that magazine rls669 linked, this just shows how fast computers develop.

I think we'll be seeing that quality of graphics within 10 years. There's more than raw speed to consider: modern CPUs and GPUs both rely on parallelism now instead of just higher clocks, and hardware and software physics engines are evolving to handle that part of the problem much more efficiently than general-purpose equipment can. Also, real-time graphics use a lot of shortcuts that don't produce a physically accurate result to the nth decimal place the way pre-rendered CGI does, but they're "good enough". High-end games now aren't far behind movie CGI from 10 years ago.

Affordable, decent-sized solid state hard drives just got a little closer to reality:
WWW Link
Reply #303
Industrial Light & Magic (ILM) has the largest renderfarm on the planet, with 1,000 dedicated rendernodes plus 3,000 workstations that turn into rendernodes at night, and rendering runs 24/7. The industry uses Pixar's RenderMan specification for rendering software.

Even hobby-level software uses RenderMan-compliant render engines.

You can do movie-quality animation shots on home computers. It just depends on your skill and experience with the 3D CG software you have. The more you work at it, the better you get.

The CPU is used for rendering, not the GPU.

My hobby is 3D CG.
Reply #304
The CPU is used for rendering, not the GPU.


in that case, what are Quadros and FireGLs for?

also, i'm kinda curious about something else. how do that many computers work together? i mean, presumably the rendering is getting divvied up between the rendernodes and workstations. is each node assigned a frame to render?



and on the subject... i actually like the visual effects from Jim Henson studios more than Industrial Light & Magic. they've done more than just muppets. The Dark Crystal, back in the 80s, was them, and even though it doesn't have a lick of CGI, i still think it's one of the most visually effective films ever. more recently they did Farscape, which is, again, one of the most visually appealing science fiction worlds ever (up there with Giger's artwork from Alien, in my book). to be sure, the CGI scenes from Farscape weren't as good as something ILM could do. but they also didn't try to render characters in CGI. i'd take a Rygel or Pilot over a Jar Jar Binks any day of the week.

personally, i'm wondering how long it's going to be before we see convincing people and characters in CGI. skin is still too perfect, let alone the issues with hair, and lip syncs don't line up closely enough to convince me. unlike regular animation, where lip movement isn't even close to looking real, in CGI animation it's just off enough to hurt my brain.

i think faces will probably remain one of the most elusive goals in CGI. the human brain has the equivalent of a dedicated processor for making sense of faces, part of the inferior temporal cortex (and actually, we have two of them). the point is, under normal circumstances it's very easy for us to detect inconsistencies in a face. whether that means we're looking at a real person in a strange mood or an artificially rendered person is something, i believe, we use other parts of our brain to determine. perhaps CGI would benefit from damaging those areas of the brain :LOL:

anyway, yeah, a bit of a tangent, but at least somewhat related.
Reply #305

The CPU is used for rendering, not the GPU.


in that case, what are Quadros and FireGLs for?

The workstation cards are optimized for rendering large numbers of polygons while working in a 3d application; the gaming cards are optimized for heavy texture and shader work. Once you have the scene set up, it's the cpu that renders out the image or animation frames.

In the past the cards were separate lines; now the nvidia and ATI cards are, for the most part, physically identical to their gaming counterparts. So you can convert a GeForce to an equivalent Quadro with a bios flash and a different driver, and the system will then recognize it as a Quadro. I did this with my Radeon 9800, converting it to a 9800-based FireGL, but changed it back after a while because the performance increase in 3dsmax wasn't enough to make up for the loss of gaming performance.

And now the line between cpu and gpu is blurring, because there are renderers available (nvidia's Gelato being the most common) that will use the gpu to render out cgi.

also, i'm kinda curious about something else. how do that many computers work together? i mean, presumably the rendering is getting divvied up between the rendernodes and workstations. is each node assigned a frame to render?


It depends on the renderer and the work being done. In an animation, the job will be split up by frames. For single images, most renderers can assign different parts of the image to different machines (or cpu cores). mental ray uses bucket rendering, and on my quad I'll have four 64x64 buckets being rendered at one time. Cinema 4d uses scanline rendering, so it will divide the render into 4 horizontal sections.
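
if it helps to picture how the buckets get divvied up, here's a toy sketch in python -- render_bucket() is just a made-up stand-in for the real renderer, and the worker pool stands in for cpu cores or rendernodes:

# carve a frame into 64x64 buckets and let a pool of workers
# each grab the next free one, roughly how a bucket renderer
# spreads work across cores
from multiprocessing import Pool

WIDTH, HEIGHT, BUCKET = 1920, 1200, 64

def render_bucket(origin):
    x, y = origin
    # a real renderer would shade every pixel in this tile;
    # here we just report which tile was handled
    return f"rendered bucket at ({x}, {y})"

if __name__ == "__main__":
    buckets = [(x, y) for y in range(0, HEIGHT, BUCKET)
                      for x in range(0, WIDTH, BUCKET)]
    with Pool(processes=4) as pool:  # four workers, like a quad core
        for message in pool.imap_unordered(render_bucket, buckets):
            print(message)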



Reply #306
i have to admit that kinda sounds like fun, 3D CG as a hobby i mean. i don't know the first thing about it, but i like anything where i can be artistic, and i'm pretty good at learning the manual and technical side of things (i'm a pretty decent painter and an even stronger photographer, and i know photoshop very well).

out of idle curiosity, how does one go about taking up 3D CG as a hobby?
Reply #307
Get a 3d application and then do tutorials until you get proficient enough to focus on the results instead of the process. All the major apps have either demos or free "personal learning" editions; the PLE will usually only output low-res, watermarked renders but be fully featured otherwise. You might want to look at Blender, a free open source 3d app that is continuously evolving. The main problem with Blender is that, like many open source apps, it has an interface that was designed by programmers.
Reply #308
Most 3D CG apps rely on OpenGL for the UI and preview while you work; FireGLs and, in particular, Quadros are optimized for this.

To start, look at:

http://www.e-onsoftware.com/ Vue Easel through XStream

http://www.daz3d.com/ Carrara, Daz Studio, Bryce, Hexagon

Poser changed hands (again!) to Smith Micro.
Reply #311
hehe sweet :) i don't think i could put that on my computer though. it wouldn't match the look i'm going for (solid black with green lighting and hints of purple).

you know i haven't talked much about my case modding ideas and plans, and i don't know if it's something that interests any of you. but my own plans are rather extensive. i found that badge yesterday while i was browsing for parts for a mod i'm really excited about. i'm going to mod my right side panel (because how many modded right panels have you seen?). i guess i'm excited because i had to actually think through this one myself, and as far as i've found, no one's done anything quite like it. the idea is to create a 'hidden' circuit pattern in my right panel that can be made visible by turning on a lighting switch.

basically, i'll start by cutting out a really big window and cutting a panel of acrylic to match. i'm going to spray paint one side of the acrylic black (the inside), and then etch out a circuitry pattern. i might use a custom decal instead of etching (each option has its pros). got the basic idea from this guy:
http://www.hardforum.com/showthread.php?t=1242685

so visually, a circuitry pattern basically consists of lines and dots. for the 'dots' in my circuit pattern, i'll dremel in small indentations and hot glue 3mm LEDs into them; the LEDs will be on one of these sequencers:
http://cgi.ebay.com/Led-Chaser-Sequencer-GREEN-12-Detachable-Cable_W0QQitemZ110198971830QQihZ001QQcategoryZ294QQrdZ1QQssPageNameZWD1VQQ_trksidZp1638.m118.l1247QQcmdZViewItem

since i'm planning to use more than 8 LEDs, i'll either have to splice in extras or purchase more than one of those sequencers. if i splice in extra LEDs, i need to add resistors, and that means i need to figure out whether resistors and multiple LEDs will cause a problem with the sequencer, since it uses PWM to achieve its dimming effects on the LEDs.
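
for what it's worth, the series-resistor math itself is easy to script (the numbers below are assumptions -- a 12V feed, ~3.2V forward voltage for green LEDs, 20mA target current -- so check them against the sequencer's actual output; PWM just switches that same current on and off quickly, so dimming shouldn't change the resistor value):

# rough current-limiting resistor calc for spliced-in LEDs
SUPPLY_V = 12.0    # assumed supply rail
LED_VF = 3.2       # assumed forward voltage of a green LED
TARGET_A = 0.020   # assumed 20mA target current

def series_resistor(leds_in_series):
    # whatever voltage the LEDs don't drop, the resistor must
    v_left = SUPPLY_V - LED_VF * leds_in_series
    if v_left <= 0:
        raise ValueError("too many LEDs in series for this supply")
    return v_left / TARGET_A  # Ohm's law: R = V / I

for n in (1, 2, 3):
    print(f"{n} LED(s) in series: ~{series_resistor(n):.0f} ohms")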

in either regard, next will come the EL wire: 3 strands of 1.3mm-thick wire on a sequencer, spiraled together. i'll lay the spiral down along each circuit path on the panel, securing it in places with electrical tape as needed.

once all the lighting is laid down, the inside of the entire thing will be covered with black electrical tape, to reduce light pollution both inside the case and on the etched surface. with the sequencer on the EL wire, i'm hoping the 3 strands will create the appearance of movement on the visible side (data moving along the circuit).
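
here's a silly little terminal model of why the spiraled strands should read as motion (python, purely illustrative numbers): positions along the path alternate strand 0, 1, 2, 0, 1, 2..., so cycling which strand is lit makes the bright spots appear to crawl along the path.

# toy model of a 3-strand EL chase along a spiraled path
import time

PATH_LEN = 24  # positions along the circuit path
STRANDS = 3    # strands spiraled together

for tick in range(9):     # a few steps of the chase
    lit = tick % STRANDS  # the strand the sequencer has on
    row = "".join("#" if p % STRANDS == lit else "-"
                  for p in range(PATH_LEN))
    print(row)
    time.sleep(0.1)       # slow it down so the crawl is visible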

the goal is for the panel to look solid black, more or less, while the lights are off. but flip on a switch, and the side panel jumps to illuminated life.

i'm sure complications will come up when i do finally start this case mod. the sequencer, for example, has a button that'll let you control the sequence pattern (17 patterns and 3 speeds to choose from!). i'd love it if i could figure out a way to access that button from the outside of my case (if i can't figure something out, at least the sequencer will remember its settings after being powered off). the entire thing also needs to be kept "short" enough for me to still put the panel back on the case! :LOL: (that, and leave room for me to run a few functional cables behind my mobo tray).

don't know if this is the sort of thing that interests any of you guys, but i'm excited about this idea, and i'm also kind of proud that i've been able to find all the pieces i need to implement it. the original idea started when i saw a write-up of someone making a custom PCB with a similar pattern, but used to signal HDD activity. now, if i could figure out a way to tie the speed of both sequencers into my HDD indicator line, but still be able to turn it off with a switch, without losing use of my regular indicator light while the mod was switched off -- that'd be amazing. i don't think i could do that without logic -- i wonder if an undergrad comp sci or electrical engineering major could do it... they'll work for cheap :HOT:

in either regard, i'll definitely be creating a work log when i finally get around to starting my case mods.
Reply #312
hmm... after actually measuring things, i'm not so sure.

well, ideas are never a bad thing. but neither is prudence.

anyway, check this out:

Motorized Madness
Reply #313
enough about case modding, back to actual technology.

the 9600GT hits retailers today.

the good news: they're retailing for $180-210.

the not-surprising news: it's an 8800GT with fewer unlocked stream processors and a "9" slapped in front of it.

here's a review: WWW Link

while this doesn't particularly excite me as a new piece of technology, it is kind of exciting because:

1) the 9800 series should follow soon;

2) the 9k series seems to be oriented towards higher resolutions (and i run at 1920x1200);

3) it's cheap. i'm hoping that'll mean the regular 9800s (not the GX2s) won't be too insanely priced.

in general, i think owners of an 8800GT, GTS 512, GTX or ultra don't have much to worry about. from what i've read, the G92 chip doesn't add any transistors over the G80; the main thing they did with the die shrink was make the chip HDCP compliant (which, to be sure, is a good thing for many people). but with the exception of the 9800 GX2, i'm guessing we can expect the 9800 series to look a lot like the 8800 series, and probably be priced similarly.
Reply #314
anyone hear about intel refusing to license its upcoming CSI or QuickPath technology to nVidia? this would cut nVidia out of building intel chipsets in the next generation of CPU technology. seems like they're trying to strong-arm nVidia into giving them SLi support.

i don't think nVidia will have a choice. while it's true that SLi is found on only a very small fraction of systems, it also sets a precedent and builds company image. i think nVidia will have to give them SLi one way or the other. they could refuse, at which point they'd be out of the intel chipset game (for workstations at least), and then they'd have nothing to gain by keeping SLi from intel. or they just give SLi to intel, get CSI, and implement some other technologies via their own chipsets: the PhysX tech they got from Ageia; triple SLi; or perhaps even something like an "SLi 2.0" that'll scale up a little better. that's one advantage i do see in radeon VGAs and the crossfireX technology (note, crossfireX is much better than standard CF). where you get a 20-30% boost via SLi in most cases, from what i understand you can get anywhere from 50 to 75% better performance via crossfireX... when it works. there goes ATi and their funky driver policy again.

but anyway, specs are starting to leak out about the 9800GTX, and it looks to be little better than an 8800GTS (G92). same core and virtually the same specs. nVidia fanboys everywhere are sh**ing bricks. i think i'll hold off on my judgement. i think nVidia is implementing a few technologies that don't show up on paper. the 9600GT performs impressively for its specs -- too close to the 8800GT, IMO, for it to be purely about shaders and stream processors. i heard somewhere that nVidia focused development on higher resolutions more than anything else -- which will be good for me.

i don't think i'll get an upgrade this month as i'd planned. i do want to wait for the 9k series to roll out.




random question... i know you have to match RAM for best performance. does that include capacity? i mean, i've got 1GB sticks in two like-colored slots right now. if i put two 512MB sticks with the same specs (speed, latency, voltage) in the other slots, it won't hamper the other sticks, right?

random question 2... asus makes an 8800GT with 1GB of VRAM... is it worth it? it costs about as much as (a little more than) an 8800GTS (G92). i'm just wondering if, all other things being equal, it's better to have more VRAM or more stream processors.

though, i'm wondering if i'll see a 1GB GTS (G92) if i wait long enough...
Reply #315
:( very upset!!! :( :( my main puter went down!!! power supply :(

I have a 550 watt PS that went today, and I'm not sure if it took out the motherboard too!!! so I'm done for now :(

I had taken out 1GB of RAM thinking that was the issue; now I'm not so sure. my puter has been acting up since I put my 7600GS 512MB AGP video card in it, which I thought the 550 watt PS should have handled, considering it's not that old!!!

Zyxpsilon, you may want to look into that card more before you buy one... my 5600FX 256MB showed all the ship graphics. not sure if the PS was just going out, or if the video card and CPU were just too much for it, but I thought 550 watts would have been enough!!!

I lost my XP install on this puter while trying to reinstall Vista, and I lost all my mods for TA and DA :(

this setback has done me in!!! till I can get either a new PS or the motherboard for the new PC I wanted to build, I have no computer :(
I'll try to figure out how to get my updates back, but I lost all the ones I did have, about 20GB downloaded through TA, which I was going to burn to CD today.
Take care all, see you when I'm back up!!

Nasty :(
Reply #316
oh man, that sucks. i'm sorry to hear that.
Reply #317
UPDATE

I'm back up, sort of. I'm reformatting and saving as much as I can. I was trying to be ready for tomorrow just in case we got an update, but I'm not going to make it!!!

going for a 600 watt PS if it's still available. I think I'm OK on all my mod stuff; will know more later!!!!

Nasty... I should have burned to CD LOL

Edited

The power supply failed again, so I'm down for now :(

Nasty :(

Edited again April 15 2008

Well, I'm getting some back on taxes that I was not expecting, SO.....
I'll be back up shortly :)
looks like a 700 watt PS for the replacement!!!
Reply #318
motherboard I'm thinking of getting for the parts I have already:

ASUS CROSSHAIR Socket AM2 NVIDIA nForce 590 SLI MCP ATX AMD Motherboard

parts I have:

CPU: AMD Athlon 64 X2 4600+

2 EVGA GeForce 7600 GS 512MB video cards -- I know they're not the fastest, but they're only a year old!!

4GB HyperX 800MHz RAM


Nasty
Reply #319
If you keep losing the PSU, then the motherboard is a likely culprit.
Reply #320
If you keep losing the PSU, then the motherboard is a likely culprit.



no, just the one. i was able to restart and use the same PS till it overheated the second time; then I shut down the PC and stopped using it so I wouldn't burn it up.

after changing the PS, if it does it again, then I'll have to take it down and decide if I'm going to replace the motherboard or build the other new PC!!!

also, after looking at the ASUS CROSSHAIR board, it looks to be the one I'm getting!!!
Reply #321
Overheating indicates a heavy current draw and/or a fan that's no longer functioning, or you need to clean the crud out of the PS. Of course, with a new PS, this does not apply.
Reply #322
Overheating indicates a heavy current draw and/or a fan that's no longer functioning, or you need to clean the crud out of the PS. Of course, with a new PS, this does not apply.


Sarissi, thanks for the help!! but I cleaned out the crud and made sure the fans work (there are 2 fans), and thought it was OK... then it overheated again!! not sure why it doesn't do it every time, but I'm not willing to burn the house down to find out LOL. so for about 50 USD I can change the PS and watch the PC to see if it overheats the new one. I monitor temps and fans and everything on my PC, so I'm not sure why it's doing it!!

Once again Thanks for taking the time to Help.
Nasty
Reply #323
You have to take the cover off the PS for this.

If you still get an overheat problem, then the mobo may be at fault.
Reply #324
You have to take the cover off the PS for this. If you still get an overheat problem, then the mobo may be at fault.


yeah Sarissi, I did get into the PS that way, cleaned it really well, and put it back together. what I found out last night is that the fan is barely turning, so a replacement fan should make the PS OK!!... but I have a replacement on the way anyway!! :)

apparently the fan was working sometimes and not others!!

so, Sarissi and anyone else: what do you think of the ASUS CROSSHAIR mobo???

Nasty
Reply #325
so, Sarissi and anyone else: what do you think of the ASUS CROSSHAIR mobo???
Nasty


nice board, but it has one major design flaw. it doesn't support an intel proc. ;)

i myself went with the maximus formula se. very strong board and the clocking features totally blew me away.

i can take my e6600 up to 5.5ghz (liquid cooled), stable 24/7. but i generally don't run it that fast (just when i feel a need to bench). 3.6ghz is the current setting i run.

i'm not too sure if you are into clocking, but the board you are going to get will change that... i hope :)