Winrar 3.9 released

32 bit

64 bit

96,936 views 37 replies
Reply #1 Top

I was definitely happy to see them finally get a 64-bit version out.  Seems like a lot of devs are finally talking about releasing 64-bit apps too.

Reply #2 Top

Me too, runs nicely on Vista and Win7.

Reply #3 Top

Is there a reason to use WinRAR over 7zip?

Reply #4 Top

Yeah, I hate 7zip.

 

No really I find WinRAR more helpful and quicker to use, but to be honest I haven't ever given 7zip a chance.

Reply #5 Top

Well, WinRAR occupies a sweet spot at the moment: between the low-compression, high-compatibility ZIP format and the high-compression, low-compatibility 7z format, WinRAR sits in the middle, with above-average compression and above-average compatibility. And by compatibility I mean how many programs support the format and how many home users actually know how to open it. If you want the highest compression, or the highest compatibility, then the corresponding format is the best choice. For everything else, use WinRAR imo.

Reply #6 Top

7zip is only low-compatibility because it's relatively new; the 7z format is better than the RAR format, and 7zip can open both.  7zip is also free.  Being free means it's gaining penetration relatively fast - it took many years for RAR to take off (and it never really pushed ZIP out of the space anyway).  7zip has also had a 64-bit version for some time, and it's extraordinarily lightweight.

Not seeing any real reason to use WinRAR.  We all used it back in the day, but archive software (and container formats) is constantly passing the baton.  I'm not going to shed a tear for PKZIP and ARJ. :)

It's probably just me, but I can't think of WinRAR without being glad that with 7zip I don't have to put up with nag screens. :)

Reply #7 Top

WinRAR is able to unpack 7z files.

I have been using 7zip for years and never had an issue with it. Quite a nice little program.
It is also LGPL - you've got to love that!

 

Reply #8 Top

LOL, I haven't bought WinRAR but I've never seen a nag screen in my life. And no, I don't use a crack :grin:

Reply #9 Top

Different question for you all: who still compresses files anyway?

I only use compression tools to make one big file out of many (it's called Store mode in WinRAR). Whether some files end up being 100 MB or 120 MB, I couldn't care less. Compressing big files like movies and games is useless anyway because they are already internally compressed, and I don't need a couple of MB less when sending them to friends because my internet is fast enough not to care. Who here still really needs it at all?
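For anyone curious what "Store" mode actually amounts to: the archive just concatenates the files with headers, no compression pass at all. A rough sketch using Python's `zipfile` as a stand-in for WinRAR (the filenames here are made up for illustration):

```python
import zipfile

# Bundle a file into one archive without compressing it (ZIP_STORED),
# the same idea as WinRAR's "Store" mode: one container, zero compression work.
with zipfile.ZipFile("bundle.zip", "w", compression=zipfile.ZIP_STORED) as zf:
    zf.writestr("movie_notes.txt", "already-compressed payloads gain nothing here")

with zipfile.ZipFile("bundle.zip") as zf:
    info = zf.getinfo("movie_notes.txt")
    # A stored entry keeps compress_type ZIP_STORED and identical
    # compressed/uncompressed sizes - nothing actually shrank.
    print(info.compress_type == zipfile.ZIP_STORED, info.file_size == info.compress_size)
```

Storing is much faster than compressing precisely because the sizes stay equal - which is the point when the payload is a movie or game that is already internally compressed.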

Reply #10 Top

With such large drives, people hardly use compression anymore.
I do use it for storage of my picture files, but that's about it - just to get that extra space on my burnt DVDs.
But otherwise no, why bother.

Reply #11 Top

Seriously.  What's more annoying is that almost everything available is compressed in some way, which takes xyz seconds/minutes to decompress... and generally isn't actually compressed much at all.  I downloaded the new VPs for FreeSpace the other day and they're compressed... and some of them are seriously two meg smaller, yet it takes an extra step, an extra piece of software and extra time to decompress them.  It's absurd.  I keep an archiver installed for this reason only; if Win7 could handle RARs and 7z there'd be no reason to bother.

 

ALMOST as absurd as people still using the 'break archive into smaller files' function for no reason.  :)

Reply #12 Top
  1. Because the release consists of small parts, you don't have to worry about re-downloading the whole release if something goes wrong and a file gets corrupted.

  2. You can verify that everything has been downloaded correctly by checking against the SFV file. Hence you will always know whether you've got a complete, uncorrupted copy of what you were downloading.

    This means that once you've downloaded and extracted the release, you will have the exact same files on your computer as the person who first created it. This instead of downloading an extracted version of the file which has perhaps been transferred a couple of hundred times from one person to another, with an overwhelming risk of transfer errors. That doesn't mean the file won't work, but it can lead to colour deviations or so-called freeze-frames.

  3. You can download from multiple sources at the same time - convenient, and it maximizes your download speed.

  4. We get a standardized way of sharing.

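For reference, an SFV file is nothing more exotic than a list of filenames with their CRC32 checksums in hex. A minimal sketch of what an SFV check does (the release filename below is hypothetical; the data is a stand-in for real file contents):

```python
import zlib

# What an SFV check boils down to: compare each file's CRC32 against
# the hex value recorded in the .sfv file.
def crc32_of(data: bytes) -> str:
    return format(zlib.crc32(data) & 0xFFFFFFFF, "08X")

sfv_line = "release.part01.rar 414FA339"   # hypothetical .sfv entry
name, expected = sfv_line.rsplit(" ", 1)
data = b"The quick brown fox jumps over the lazy dog"  # stand-in for the file's bytes
print(name, crc32_of(data) == expected)  # -> release.part01.rar True
```

CRC32 only detects corruption, it doesn't locate or repair it - which is why the release is split into parts: a failed check means redownloading one part, not the whole thing.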
Reply #13 Top

Quoting Pnakotus, reply 11

ALMOST as absurd as people still using the 'break archive into smaller files' function for no reason.  

:rofl: the days of the 20-floppy backup are long gone (thank god)

Reply #14 Top

Quoting voidcore, reply 12

Because the release consists of small parts, you don't have to worry about re-downloading the whole release if something goes wrong and a file gets corrupted.


You can verify that everything has been downloaded correctly by checking against the SFV file. Hence you will always know whether you've got a complete, uncorrupted copy of what you were downloading.

This means that once you've downloaded and extracted the release, you will have the exact same files on your computer as the person who first created it. This instead of downloading an extracted version of the file which has perhaps been transferred a couple of hundred times from one person to another, with an overwhelming risk of transfer errors. That doesn't mean the file won't work, but it can lead to colour deviations or so-called freeze-frames.


You can download from multiple sources at the same time - convenient, and it maximizes your download speed.


We get a standardized way of sharing.

Save point 4, a BitTorrent client handles all those things internally, so in reality you are doing all of them twice. I'm never afraid of corrupt downloads: my torrent client will detect them, and the worst-case scenario is that I have to redownload one chunk, which usually varies from 256 KB to 4 MB. Even better than compressing and partitioning files manually first.
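The reason the client can get away with redownloading just one chunk: the .torrent file carries a SHA-1 hash per piece, and each piece is verified independently on arrival. A toy sketch of that per-piece check (tiny pieces here purely for illustration; real clients use the 256 KB - 4 MB sizes mentioned above):

```python
import hashlib

PIECE_SIZE = 4  # tiny for illustration; real torrents use 256 KB - 4 MB pieces

payload = b"somedownloadeddata"
pieces = [payload[i:i + PIECE_SIZE] for i in range(0, len(payload), PIECE_SIZE)]
# Per-piece SHA-1 hashes - in reality these come from the .torrent file.
expected = [hashlib.sha1(p).digest() for p in pieces]

# Simulate one piece getting corrupted in transit.
pieces[2] = b"XXXX"

# The client rehashes each piece and refetches only the ones that fail.
bad = [i for i, p in enumerate(pieces) if hashlib.sha1(p).digest() != expected[i]]
print(bad)  # -> [2]
```

Only piece 2 fails the hash check, so only piece 2 gets redownloaded - which is exactly why a separate SFV pass over RAR parts is redundant when the transport already does this.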

A certain standard for sharing files might be handy, but dividing files into smaller parts isn't useful anymore. Besides, I can redownload a 15 MB file faster than manually CRC-checking it.

Reply #15 Top

Oh we've got a live one.  I bet he checks hashes on everything he downloads too.  OVERWHELMING RISK OF TRANSFER ERRORS! :)

 

EDIT - By 'standardized way of sharing' he means 'the piracy SCENE enforces meaningless rules out of conformity'.  Turns out hashfails on individual blocks are detected and discarded by the client, as already mentioned?  :)

Reply #16 Top

You don't use torrents everywhere, do you now.

Nothing wrong with standardization.

 

But I agree, nowadays we don't need it as much as 5 years ago: we all have greater bandwidth, and the torrent protocol has evolved, and so have DC and whatever other P2P apps.

Reply #17 Top

Sadly 'scenes' doing 'releases' love to enforce their 'rules', regardless of whether it's relevant anymore.

Hell, if you're doing bundling (which is a good application of archiving) there's no reason to apply any actual compression at all, or to use any given format over another.  Windows supports zip natively, so if it's for ease of download, put your pile o' files into an uncompressed zip for single-file download and forget about it.

Reply #18 Top

Quoting voidcore, reply 16
You don't use torrents everywhere, do you now.

Maybe not, but I cannot for the life of me remember the last time I had a corrupted download. Sure, a year or four back my RAM was broken, but then again so were the RAR chunks, SFV files and everything else, so partitioning didn't help either.

The only time I can remember having to deal with it on a regular basis was back when I was on 56k and connections dropped regularly. Then again, downloading a 720p release over that isn't viable anyway.

Cutting up big files is a thing of the past, and compression oftentimes costs you more time than it saves. Time to move on. I don't even think for a second when downloading a 2 GB demo from some site. The worst that can happen is a slow server and I have to wait a while.

Reply #19 Top

I use compression when transferring large files between computers. I have a local gigabit LAN, but when I copied my steamapps folder over to the other desktop, I compressed a 65 GB folder down to 43 GB and transferred that... it still took five minutes to send, but at least I wasn't sending a 65 GB folder at 256 MB/s.

Reply #20 Top

256 MB/s? You wish. That kind of defies the definition of gigabit :p

But it still takes a lot of time to compress 65 GB, so what's the advantage here? Unless you can compress on the fly, but I'm not sure I could do that.
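Compressing on the fly is doable, for what it's worth. As a sketch (not how WinRAR does it - this uses Python's `tarfile` streaming mode, with an in-memory buffer standing in for the network connection):

```python
import io
import tarfile

# "w|gz" is tarfile's streaming mode: data is gzipped as it is written,
# so it could be piped straight to a socket instead of building the whole
# compressed archive on disk first.
buf = io.BytesIO()  # stand-in for a network stream
data = b"steamapps contents " * 1000
with tarfile.open(fileobj=buf, mode="w|gz") as tar:
    info = tarfile.TarInfo(name="payload.bin")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

# The receiver decompresses as it reads ("r|gz"), again without a temp file.
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r|gz") as tar:
    member = tar.next()
    roundtrip_ok = tar.extractfile(member).read() == data
print(member.name, roundtrip_ok)
```

Whether it actually saves time over a raw copy depends on whether the CPU can compress faster than the link can push bytes - for already-compressed game data it usually can't, which is the earlier posters' point.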

Reply #21 Top

I'll stick with WinRAR.  7zip is definitely a good alternative, but I prefer WinRAR's interface.  Already bought three copies and that should last me... forever.

Reply #22 Top

I'll stick to zip files. I hardly ever use any files that need to be zipped or unzipped.

Reply #23 Top

We weren't talking about compression formats, we were talking about applications to compress or uncompress them.  :p

Reply #24 Top

Well then I'll stick to 7zip. I was just pointing out that I don't mess with .zip or .rar files often if at all.

Reply #25 Top

If I were actually compressing, I'd probably use the 7z format.  But yeah, I usually only use it to decompress stuff, same as most people.