The Danger of Open Source

Linux broke my CD-ROM drive, now what?

First of all, I think open source is one of the greatest concepts of our time. I love the fact that a community of dedicated programmers gets together for a common goal. I love that open source technology is developed, for the most part, without capitalist intentions. It’s technology for the people, by the people. Wow, that was overboard. Anyway, with that said, there is one slight problem with open source.



It is my belief, and only my belief, that open source distribution companies do not have the resources to properly test what they are distributing. Some say you get what you pay for. Others say that’s why warnings and lists of tested hardware are posted on the distributions’ web pages. And some say it’s not that big of a problem.



Software testing is a huge concern. Not only do software companies include testing in their budgets, but many corporations that buy software do their own testing of a new product before it ever gets rolled out company-wide. This protects end users from security breaches, data corruption, and hardware failure.



While most open source users would argue that open source handles breaches and corruption better than its proprietary counterparts, hardware failure has always been an issue for open source. Open source programs have broken hard drives, halted CPUs, and, most recently, destroyed CD-ROM drives. These failures continue to arise throughout open source.



Stopping these failures will be one of the biggest hurdles for the open source movement. The main cause of these failures is the lack of testing. But who will step up to the role of tester? Hardware companies would be the obvious choice; however, they barely have enough time to test against Windows, and for most vendors there is not enough of a market to justify the extra testing. There are some exceptions among larger vendors, but even they don’t test all distributions. Distribution companies would seem the next choice, but the odds of them testing all the hardware out there with their level of resources are simply not realistic. The conflict between testing and resources won’t be solved anytime soon. In the meantime, I would suggest only installing open source on hardware you are willing to lose.
44,625 views 7 replies
Reply #1
How do you know it was Linux that broke your CD-ROM drive? Maybe it was destined to break and just happened to break while you were using Linux. Just a thought.
Reply #2
I was skeptical of your blog until I came across this: http://www.newsforge.com/hardware/03/10/27/0218209.shtml?tid=81&tid=87

I assume you were using Mandrake 9.2. A flaw that results in broken hardware is serious. I stopped using Linux on my computer a while back due to lack of features and hardware support. Because of this news, I definitely will not be installing Linux on my main computer again. I will be installing it on an older computer instead. I suppose Linux is just a toy to play around with on older machines, where it won’t be such a big financial loss if they get broken.
Reply #3
I must say that I agree with almost everything in your article. Software testing is extremely important. That said, I think you are relying too much on the distro company for support and not on yourself.

In open source, most of the onus is on the user to find out whether their hardware is supported by the software they are trying to install. There are plenty of resources out there (Freshmeat, The Register, and the user forums at the distro companies themselves). Second, Linux is probably tested on more configurations than Windows could ever be.

Linux betas get downloaded, installed, and tested by users, and over time the distro is improved until it is ready for release. User reports come in telling not only the distro company, but the users as well, whether a certain piece of hardware will or won’t work. I know, you are thinking "I have the hardware necessary." You may, but blaming Linux for busting your hard drive, then saying the distro company you purchased it from didn’t test it adequately, is petulant at best. I agree with the first poster here; it was probably your hard drive’s "time." Remember, hard drives aren’t guaranteed to work, they are guaranteed to fail.

Steve
Reply #4
Just read the article referenced by TechCat. Nowhere in it does it say anything about hard drives, just certain LG CD-ROM drives in certain computers.

Steve
Reply #5
Actually, he said it was his CD-ROM drive, not his hard drive, that got broken. I feel Linux is most likely the cause, given the news I linked to above. It turns out it isn’t just a Mandrake flaw; it is a flaw in the newest build of the Linux kernel, so you can expect to see more CD-ROM drives getting fried by other distros if this problem is not fixed before they ship this build of the kernel.
Reply #6
The problem was with LG drives, and with the Mandrake 9.2 install.

The problem, as it turns out, was Mandrake (silly them) issuing little-used, but still STANDARD, ATAPI commands to the device. (The cache-flushing command, IIRC.)

These are commands which LG, in a 'creative' bit of design laziness and shortcut-taking, used to flash the drive's firmware. Mandrake quickly stepped up and corrected the issue, but the sad thing is, the problem wasn't really with Mandrake at all; it was with LG's shoddy firmware.
Reply #7
Linux blows! Linux killing hardware?... so much for OS maturity. Maybe it will be user ready in 2014, maybe not.