<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
	<channel>
		<title><![CDATA[Redump Forum — Merge program for CD and DVD images]]></title>
		<link>http://forum.redump.org/topic/10327/merge-program-for-cd-and-dvd-images/</link>
		<atom:link href="http://forum.redump.org/feed/rss/topic/10327/" rel="self" type="application/rss+xml" />
		<description><![CDATA[The most recent posts in Merge program for CD and DVD images.]]></description>
		<lastBuildDate>Wed, 21 Mar 2012 21:22:23 +0000</lastBuildDate>
		<generator>PunBB 1.4.4</generator>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37665/#p37665</link>
			<description><![CDATA[<p><span class="postimg"><img src="http://i1199.photobucket.com/albums/aa480/tossEAC/ResidentEvil-CDGroupv031vsCDGroupv042.png" alt="http://i1199.photobucket.com/albums/aa480/tossEAC/ResidentEvil-CDGroupv031vsCDGroupv042.png" /></span></p><p><span class="postimg"><img src="http://i1199.photobucket.com/albums/aa480/tossEAC/ResidentEvil4-CDGroupv031vsCDGroupv042.png" alt="http://i1199.photobucket.com/albums/aa480/tossEAC/ResidentEvil4-CDGroupv031vsCDGroupv042.png" /></span></p><p><span class="postimg"><img src="http://i1199.photobucket.com/albums/aa480/tossEAC/StarWars-RogueSquadronII-RogueLeader-CDGroupv031vsCDGroupv042.png" alt="http://i1199.photobucket.com/albums/aa480/tossEAC/StarWars-RogueSquadronII-RogueLeader-CDGroupv031vsCDGroupv042.png" /></span></p><p><span class="postimg"><img src="http://i1199.photobucket.com/albums/aa480/tossEAC/TigerWoodsPGATour2004-CDGroupv031vsCDGroupv042.png" alt="http://i1199.photobucket.com/albums/aa480/tossEAC/TigerWoodsPGATour2004-CDGroupv031vsCDGroupv042.png" /></span></p><p>Note: for Tiger Woods PGA Tour 2004 I tried to diff, but no diffs were created, and as I expected my compression on the old set was identical in size, so nothing gained, nothing lost.</p><p><span class="postimg"><img src="http://i1199.photobucket.com/albums/aa480/tossEAC/TalesofSymphonia-CDGroupv031vsCDGroupv042.png" alt="http://i1199.photobucket.com/albums/aa480/tossEAC/TalesofSymphonia-CDGroupv031vsCDGroupv042.png" /></span></p><p><span class="postimg"><img src="http://i1199.photobucket.com/albums/aa480/tossEAC/StarFoxAdventures.png" alt="http://i1199.photobucket.com/albums/aa480/tossEAC/StarFoxAdventures.png" /></span></p><p><span class="postimg"><img src="http://i1199.photobucket.com/albums/aa480/tossEAC/MarioParty4.png" alt="http://i1199.photobucket.com/albums/aa480/tossEAC/MarioParty4.png" /></span></p><p><span class="postimg"><img src="http://i1199.photobucket.com/albums/aa480/tossEAC/OVERALLSAVINGS.png" alt="http://i1199.photobucket.com/albums/aa480/tossEAC/OVERALLSAVINGS.png" /></span></p>]]></description>
			<author><![CDATA[null@example.com (tossEAC)]]></author>
			<pubDate>Wed, 21 Mar 2012 21:22:23 +0000</pubDate>
			<guid>http://forum.redump.org/post/37665/#p37665</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37664/#p37664</link>
			<description><![CDATA[<p>Also, zip should not be smaller than 7z, unless maybe zip deals with incompressible material better and the stuff being compressed is mostly incompressible (perhaps xbox padding).</p><p>I think you&#039;re right, but since the diffing stage was added it&#039;s actually better your way. Before diffing was introduced you were right that zip handled the garbage better; in ngc, and probably Wii, we would have seen this.</p><p>Your new way is the best so far for maximum savings. I am currently redoing all of those ngc to see how much is saved over a larger field.</p>]]></description>
			<author><![CDATA[null@example.com (tossEAC)]]></author>
			<pubDate>Wed, 21 Mar 2012 20:27:28 +0000</pubDate>
			<guid>http://forum.redump.org/post/37664/#p37664</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37663/#p37663</link>
			<description><![CDATA[<p>Ok. It might be best to only use t7z with both old and new files for comparison purposes (or at least the same compressor for both), otherwise the comparison doesn&#039;t mean much. The new version uses t7z to create the .7z file.</p><p>Also, zip should not be smaller than 7z, unless <em>maybe</em> zip deals with incompressible material better and the stuff being compressed is mostly incompressible (perhaps xbox padding).</p>]]></description>
			<author><![CDATA[null@example.com (jamjam)]]></author>
			<pubDate>Wed, 21 Mar 2012 18:33:14 +0000</pubDate>
			<guid>http://forum.redump.org/post/37663/#p37663</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37661/#p37661</link>
			<description><![CDATA[<p>Resident Evil - CDGroupv0.3.1 vs. CDGroupv0.4.2 - (built-in compression) vs. (my own custom compression)</p><p>---------------------------------------------------------------------</p><p>6 ngc raw isos = <strong>8.15 GB</strong></p><p>6 ngc pakkiso&#039;d = <span style="color: green"><strong>4.79 GB</strong></span> = 3.36 GB saved over raw iso</p><p>6 ngc raw CDGroupv0.3.1 = <span style="color: red"><strong>4.85 GB</strong></span> = nothing saved over pakkiso</p><p>6 ngc compress&#039;d CDGroupv0.3.1 = <span style="color: lime"><strong>3.59 GB</strong></span> = 1.26 GB saved over pakkiso</p><p>6 ngc compress&#039;d (built-in compression) CDGroupv0.4.2 = <span style="color: lime"><strong>2.14 GB</strong></span> = 1.45 GB saved over custom compress&#039;d CDGroupv0.3.1</p><p>6 ngc raw (uncompress&#039;d) CDGroupv0.4.2 = <span style="color: lime"><strong>2.85 GB</strong></span> = 2.00 GB saved over raw CDGroupv0.3.1</p><p>6 ngc compress&#039;d (my own custom compression) CDGroupv0.4.2 = <span style="color: red"><strong>2.21 GB</strong></span> = nothing saved over built-in compression. You win jamjam. The reason I saved a bit more on TimeSplitters was that it wasn&#039;t a good example.
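</p><p>The percentages in the log further down can be sanity-checked from the raw byte counts it prints (a throwaway script of mine, not part of CDGroup):</p>

```python
# Re-derive the "~% of original images" figures from the CDGroup log below.
# Byte counts are copied from the program output; integer division matches
# the truncated percentages the program prints.
original = 8759869440  # size of original images, in bytes
stages = {
    "merged": 5216682706,
    "merged + diffed": 3066362234,
    "merged + diffed + compressed": 2308632971,
}
for label, size in stages.items():
    print(f"{label}: ~{100 * size // original} % of original")
```

<p>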
I prefer the compression you are using; even if it&#039;s not the smallest size every single time (bad examples included), it still wins for me <img src="http://forum.redump.org/img/smilies/smile.png" width="15" height="15" alt="smile" /></p><p>---------------------------------------------------------------------</p><p>Biohazard (Japan) (Disc 1) md5 20CB8D4CB322AA503D1B8A49C43CDEBF</p><p>Resident Evil (Europe) (En,Fr,De,Es,It) (Disc 2) md5 457944F833FC2F5E8FF394CFDF2E1B7C</p><p>Resident Evil (USA) (Disc 2) md5 7DEFD099E98944BC93684D4733BFE68B</p><p>Resident Evil (USA) (Disc 1) md5 BDD0FE3848C4AB1441DC6C9EE209426B</p><p>Biohazard (Japan) (Disc 2) md5 BFBF8E0F249CF8DD8FCB913793301A8C</p><p>Resident Evil (Europe) (En,Fr,De,Es,It) (Disc 1) md5 C581FAB5FD10F55B76188E86194199C1</p><p>---------------------------------------------------------------------</p><div class="codebox"><pre><code>CDGroup v0.4.2

CDLibrary v0.4.2


Processing &#039;2048&#039;

 Grouping &#039;2048&#039; (6 files)
  Hashing &#039;Biohazard (Japan) (Disc 1).iso&#039;
  Hashing &#039;Biohazard (Japan) (Disc 2).iso&#039;
  Hashing &#039;Resident Evil (Europe) (En,Fr,De,Es,It) (Disc 1).iso&#039;
  Hashing &#039;Resident Evil (Europe) (En,Fr,De,Es,It) (Disc 2).iso&#039;
  Hashing &#039;Resident Evil (USA) (Disc 1).iso&#039;
  Hashing &#039;Resident Evil (USA) (Disc 2).iso&#039;
  Sorting sectors within images
  Merging image sector hashes
  Counting repeated sectors
  Create map from images to merged files
  Writing 2048.hsn
  Writing &#039;2048.hsm&#039;
  Writing hs1.2048
  Writing hs2.2048
  Writing hs3.2048
  Writing hs4.2048
  Writing hs5.2048
  Writing hs6.2048
 Group of 2048 byte sectors successful

&#039;2048&#039; successfully processed

Doing external diff on 2048 byte/sector files
  Diffing to &#039;hs2.2048&#039;
  Diffing to &#039;hs3.2048&#039;
  Diffing to &#039;hs4.2048&#039;
  Diffing to &#039;hs5.2048&#039;
  Diffing to &#039;hs6.2048&#039;
Diff successful

Compressing files


torrent7z_0.9.1beta/Thu Jul 23 03:08:33 2009
using 7-Zip (A) 4.65  Copyright (c) 1999-2009 Igor Pavlov  2009-02-03

Scanning

Creating archive ngc\ngc.7z.tmp

Compressing  ngc\hs1.2048
Compressing  ngc\hs2.diff.2048
Compressing  ngc\hs3.diff.2048
Compressing  ngc\hs4.diff.2048
Compressing  ngc\hs5.diff.2048
Compressing  ngc\hs6.diff.2048
Compressing  ngc\2048.hsm
Compressing  ngc\2048.hsn

Everything is Ok

External compressor seems to have completed successfully

Size of original images: 8759869440 bytes
Size of merged uncompressed files: 5216682706 bytes (~59 % of original images)
Size of merged + diffed files: 3066362234 bytes (~35 % of original images)
Size of merged + diffed + compressed files: 2308632971 bytes (~26 % of original
images)

Time taken to group: 0 hours 10 minutes 0 seconds
Time taken to diff: 0 hours 13 minutes 52 seconds
Time taken to compress: 0 hours 22 minutes 58 seconds

CDGroup completed in 0 hours 46 minutes 51 seconds

Press any key to continue . . .</code></pre></div><p>NOTE:</p><p>CDGroupv0.4.2&#039;s size of files before diffing and before compression is the same as CDGroupv0.3.1&#039;s (4.85 GB).</p><p>CDGroupv0.4.2&#039;s size of files after diffing and before compression is 2.85 GB, about 42% smaller. <img src="http://forum.redump.org/img/smilies/smile.png" width="15" height="15" alt="smile" /></p><p>I&#039;m going to have fun with custom compression on those sizes. <img src="http://forum.redump.org/img/smilies/smile.png" width="15" height="15" alt="smile" /> or not <img src="http://forum.redump.org/img/smilies/sad.png" width="15" height="15" alt="sad" /></p>]]></description>
			<author><![CDATA[null@example.com (tossEAC)]]></author>
			<pubDate>Wed, 21 Mar 2012 18:21:25 +0000</pubDate>
			<guid>http://forum.redump.org/post/37661/#p37661</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37659/#p37659</link>
			<description><![CDATA[<p>I don&#039;t understand all that.</p><p>I tested TimeSplitters (2 discs).</p><p>When I say I got a smaller size with the old version, I meant after packing the old version&#039;s output myself.</p><p>With the new version it made all the files OK, but I was referring to the size after your program packed it into one single 7z.</p><p>I guess the reason the old version was smaller was because I used better (more suitable) compression: torrent7z is known to produce larger files than pakkiso, and I used custom pakkiso compression. I also used torrentzip, as this seemed to get the smallest size on every file except gs0.</p><p>I also tested on Metal Gear, the one I chose to leave uncompressed; there was a small saving using the new version compared to the old version left uncompressed.</p><p>I will test further, for sure. If the same thing happens on another test (old better than new), I will unpack your single 7z, pack it using the compression I found worked best with the old version, and post those findings.</p>]]></description>
			<author><![CDATA[null@example.com (tossEAC)]]></author>
			<pubDate>Wed, 21 Mar 2012 15:59:20 +0000</pubDate>
			<guid>http://forum.redump.org/post/37659/#p37659</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37645/#p37645</link>
			<description><![CDATA[<p>Are you sure you used the same compressor + settings as the one you used before?</p><p>If so, how much bigger is the result?</p><p>Are both an h&lt;a&gt;.diff.&lt;b&gt; and an h&lt;a&gt;.&lt;b&gt; present after program completion, where &lt;a&gt; is some number, and &lt;b&gt; is some number?</p><p>If true, then there was a problem with cleanup midway through program execution. &#039;h2.2048&#039; (an example of h&lt;a&gt;.&lt;b&gt;) is the sector store for the second image. &#039;h2.diff.2048&#039; would be the diff from &#039;h1.2048&#039; to &#039;h2.2048&#039;. Only one of these needs to be stored to be able to rebuild the second image.</p><p>Otherwise, is an h&lt;a&gt;.diff.&lt;b&gt; present after program execution?</p><p>If false, then the diff stage was found not to be beneficial. In this case, at worst the merged files should only be marginally bigger.</p><p>If true, then for at least one image diffing was deemed beneficial before compression (the &#039;diff&#039; file was at most 7/8 the size of the &#039;to&#039; file). It is possible that the compressed &#039;diff&#039; file is bigger than the compressed &#039;to&#039; file, but given the initial size difference this seems unlikely (it may be true if using freearc on a smallish set of data like gaijin did before, but again this is unlikely to scale to bigger sets/images).</p><p>In any event, please be more descriptive in your testing. Let me know the OS, the files used in the test, the files present after execution completed, the text output to screen during execution, etc. In my tests the resulting size is at worst about the same as the old result. At best the file size is much smaller.</p>]]></description>
			<author><![CDATA[null@example.com (jamjam)]]></author>
			<pubDate>Mon, 19 Mar 2012 00:44:03 +0000</pubDate>
			<guid>http://forum.redump.org/post/37645/#p37645</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37643/#p37643</link>
			<description><![CDATA[<p>I did some quick tests.</p><p>It seemed I was able to get a smaller size with my own compression and the old version than this new version was able to achieve.</p>]]></description>
			<author><![CDATA[null@example.com (tossEAC)]]></author>
			<pubDate>Sun, 18 Mar 2012 21:49:27 +0000</pubDate>
			<guid>http://forum.redump.org/post/37643/#p37643</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37523/#p37523</link>
			<description><![CDATA[<p>Will test soon on something from above to see how it compares to the old format.</p>]]></description>
			<author><![CDATA[null@example.com (tossEAC)]]></author>
			<pubDate>Fri, 09 Mar 2012 11:49:05 +0000</pubDate>
			<guid>http://forum.redump.org/post/37523/#p37523</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37520/#p37520</link>
			<description><![CDATA[<p>New in v0.4.2:<br /></p><ul><li><p>New merged format</p></li><li><p>New stage (external diff)</p></li><li><p>Eliminated RAM usage dependency on maximum image size</p></li><li><p>Multiple passes removed when grouping</p></li><li><p>RAM limit removed when ungrouping (obsolete)</p></li></ul><p>The new format aims to be more compressible. The differences from the old format are:<br /></p><ul><li><p>All extensions are now .h* instead of .g* (for clarity)</p></li><li><p>There is no longer a repeated sector store (gs0). Instead, a repeated sector is stored in the first location it is present within the image stores (hs1, hs2 ...) (keeps related data together)</p></li><li><p>hsm files are stored differently (more compressible)</p></li><li><p>Sector storage extension and name are swapped (may help some external compressors order sensibly by extension)</p></li></ul><p>The new external diff stage removes repeated data that is not aligned to sector boundaries (shifted data, for example). For some inputs this stage plays a big role in crunching down the file size; in others the previous grouping stage did most of the legwork.</p><p><a href="http://www.mediafire.com/?p5n8i6pj0gz1nd0">CDGroup v0.4.2</a></p>]]></description>
			<author><![CDATA[null@example.com (jamjam)]]></author>
			<pubDate>Fri, 09 Mar 2012 09:17:24 +0000</pubDate>
			<guid>http://forum.redump.org/post/37520/#p37520</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37437/#p37437</link>
			<description><![CDATA[<p>It&#039;s possible to implement extraction of a single image. However, the current direction I&#039;m heading (making the data more compressible) means more of the files are needed to extract a single image (the first bullet point on the todo list above). This makes single image extraction fine for uncompressed files, but if they&#039;re compressed it&#039;s either messy (don&#039;t delete extracted working files) or wasteful (extracting multiple single images and deleting the working files each time means multiple decompressions of the same file).</p><p>Something I&#039;m toying with is having an external diff stage after grouping and before external compression. This could really reduce the benefit of single image extraction (if it takes 90% of the effort to extract a single image, you might as well go the last 10% to get the rest).</p>]]></description>
			<author><![CDATA[null@example.com (jamjam)]]></author>
			<pubDate>Sat, 03 Mar 2012 18:58:40 +0000</pubDate>
			<guid>http://forum.redump.org/post/37437/#p37437</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37436/#p37436</link>
			<description><![CDATA[<p>I use the -Xms flag; -Xmx doesn&#039;t work for me.</p><p>And a few more tests:</p><p><strong>Crisis Core - Final Fantasy VII (EN+US+US+DE+IT+ES+FR+JP+JP) (PSP)</strong> (14.6 GB)</p><p>FreeArc raw 9 isos = around 7-8 GB (maybe solid compression doesn&#039;t work at big sizes)</p><p>CDGroup + packed FreeArc = 2.19 GB</p><p>maniac version - EU -&gt; convert patches + packed FreeArc -&gt; US+US+DE+IT+ES+FR+JP+JP = 2.16 GB (I made them earlier)</p><p>FreeArc totally lost <img src="http://forum.redump.org/img/smilies/smile.png" width="15" height="15" alt="smile" /></p><p><strong>Resident Evil (USA+Europe+Japan) (GameCube)</strong> (8.15 GB)</p><p>FreeArc raw 6 isos = 3 GB</p><p>CDGroup + packed FreeArc = 2.06 GB</p><br /><p>A request, if possible: add the ability to extract one needed file, rather than unmerging everything.</p>]]></description>
			<author><![CDATA[null@example.com (gaijin)]]></author>
			<pubDate>Sat, 03 Mar 2012 10:40:03 +0000</pubDate>
			<guid>http://forum.redump.org/post/37436/#p37436</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37429/#p37429</link>
			<description><![CDATA[<p>Looks like you ran out of RAM because the piece size was too small for the size of the largest file being grouped. If you didn&#039;t actually run out of RAM (just the RAM Java had access to), try executing with the -Xmx flag like so:<br />java -Xmx1000m -jar CDGroup.jar ... (to give the program 1000 MB of RAM to work with, for example).</p><p>Note this is partly why multiple passes are not recommended, nor is an initial pass of anything other than 1. They eat RAM when given a large enough file and / or small enough piece size. The standard settings should allow most people to merge most common file sizes (given 512 MiB of RAM it should easily handle full dual layer DVD sized images). Multiple passes will probably be removed. Initial pass may be kept in for experimentation.</p><p><strong>How multiple passes work (and why you shouldn&#039;t use them)</strong></p><p>The binary result from one pass is the input for the next. The only benefit (compared to changing the initial pass and keeping passes at 1) is decreased overhead (the size of the combined gsm files will be smaller). But multiple passes have disadvantages that far outweigh it:<br /></p><ul><li><p>After each pass, sectors are more &#039;mixed&#039; than they were before</p></li><li><p>Maximum concurrent RAM usage doubles with respect to the biggest file after each pass</p></li><li><p>The same data is being grouped again and again in a nested fashion</p></li></ul><p>The concurrent RAM usage is where there&#039;s a real problem. Maximum concurrent RAM usage in bytes is estimated as the number of pieces in the biggest file being merged multiplied by 48 (plus some overhead). With normal settings (sector matching), a 1 GiB file would have 524288 pieces, taking up roughly 25 MiB of RAM (non-contiguous). This is fine for any sensible image size.</p><p>Taking 2048 as an example, for each successive pass the piece size is halved, so the number of pieces the file is split into has doubled.
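</p><p>A minimal sketch of that estimate (the function name and the flat 48 bytes per piece are illustrative, taken from this post rather than from CDGroup&#039;s actual code):</p>

```python
# Rough model of CDGroup's peak grouping RAM, per the estimate above:
# pieces in the biggest file * ~48 bytes, so halving the piece size
# doubles the usage.
BYTES_PER_PIECE = 48  # approximate figure quoted in the post

def est_ram_bytes(file_size: int, piece_size: int) -> int:
    """Estimated peak concurrent RAM for grouping one file."""
    pieces = file_size // piece_size
    return pieces * BYTES_PER_PIECE

GIB = 1024 ** 3
MIB = 1024 ** 2
# Standard sector matching (2048-byte pieces) on a 1 GiB image:
print(est_ram_bytes(GIB, 2048) // MIB, "MiB")  # ~24 MiB
# Three halvings later (piece size 256), usage has grown 8x:
print(est_ram_bytes(GIB, 256) // MIB, "MiB")   # ~192 MiB
```

<p>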
The program ran out of memory at pass &#039;256&#039;, where maximum concurrent RAM usage would be roughly 8*25 = 200 MiB per GiB of the maximum file size. Using passes instead of initialpass makes this worse, as a large file could have been made earlier, increasing concurrent RAM usage further.</p><p><strong>Freearc testing</strong></p><p>As for your freearc testing, the only reason I can see for the result is that the repeated sectors are pulled out of where they are and placed in gs0. This may mess with a compressor as it could put unrelated sectors close together (although the sectors are ordered as naturally as possible within the gs0). Something else to consider is that as the images are small, freearc may have been able to see matching sectors from two images next to each other. It seems unlikely that the freearc performance would scale to bigger images (like full DVD images, for example).</p><p>There are some things I can see to improve compressibility (or at least the chances of improved compressibility):<br /></p><ul><li><p>Instead of pulling repeated sectors out to gs0, remove them from everywhere except the first location they are present</p></li><li><p>Swap the name with the extension in the resulting files. I know freearc can order files any which way, but t7z orders them by extension, which is possibly the worst way to order them with the files named as they currently are (this would only affect merges with CDs with multiple sector types, which currently are compressed in the order 2048.gs0, 2324.gs0, 2048.gs1, 2324.gs1 ...)</p></li><li><p>Store the gsm file in a different manner (gsm is the overhead when merging).
There&#039;s a way to store the gsm such that it is the same size and does the same thing, but is more compressible</p></li></ul><p>If the first is implemented it seems very unlikely that freearc will produce a smaller file from raw images than the merged images.</p><p>Anything else that anyone thinks can improve compressibility let me know here or in pm.</p>]]></description>
			<author><![CDATA[null@example.com (jamjam)]]></author>
			<pubDate>Fri, 02 Mar 2012 13:31:09 +0000</pubDate>
			<guid>http://forum.redump.org/post/37429/#p37429</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37428/#p37428</link>
			<description><![CDATA[<p>I tried using a higher number of passes and got this error:</p><p><a href="http://iceimg.com/4f3b42f118991c.jpg.htm"><span class="postimg"><img src="http://iceimg.com/t/4f/3b/42f118991c.jpg" alt="http://iceimg.com/t/4f/3b/42f118991c.jpg" /></span></a></p>]]></description>
			<author><![CDATA[null@example.com (gaijin)]]></author>
			<pubDate>Fri, 02 Mar 2012 08:15:19 +0000</pubDate>
			<guid>http://forum.redump.org/post/37428/#p37428</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37422/#p37422</link>
			<description><![CDATA[<p>New version CDGroup v0.3.2</p><p>Differences from v0.3.1</p><ul><li><p>Old code cleanup. Some options (like deletefiles) removed</p></li><li><p>Improve cross-platform compatibility</p></li><li><p>Remove J7zip as a 7z decompressor (can&#039;t handle some 7z files, presumably solid archives)</p></li><li><p>External compressor / decompressor support (t7z.exe currently in there)</p></li></ul><p>Note: You can define your own external compressor / decompressor using options.ini</p><p><a href="http://www.mediafire.com/?bqw1erwr7ems29r">CDGroup v0.3.2</a></p>]]></description>
			<author><![CDATA[null@example.com (jamjam)]]></author>
			<pubDate>Thu, 01 Mar 2012 18:08:00 +0000</pubDate>
			<guid>http://forum.redump.org/post/37422/#p37422</guid>
		</item>
		<item>
			<title><![CDATA[Re: Merge program for CD and DVD images]]></title>
			<link>http://forum.redump.org/post/37420/#p37420</link>
			<description><![CDATA[<p>For me CDGroup gives a small gain after compression. I use raw images (or ecm for CD) + FreeArc (better solid compression than 7z). CDGroup can only be used for yourself, not for sharing or quick access to images.</p><p>TEST: Lost Kingdoms II (USA+Europe+Japan) (1.36 + 1.36 + 1.36 GB)</p><p>Solid 7z raw isos = Lost Kingdoms II (USA+Europe+Japan).7z - 2678068917 bytes (2.49 GB)</p><p>CDGroup + packed 7z = Lost Kingdoms II (USA+Europe+Japan).7z - 2676988096 bytes (2.49 GB)</p><p>FreeArc raw isos = Lost Kingdoms II (USA+Europe+Japan).arc - 1685900893 bytes (1.56 GB)</p><p>CDGroup + packed FreeArc = Lost Kingdoms II (USA+Europe+Japan).arc - 1706886824 bytes (1.58 GB)</p><br /><p>If you make patches USA-&gt;Europe-&gt;Japan it will be even smaller (maniac version <img src="http://forum.redump.org/img/smilies/smile.png" width="15" height="15" alt="smile" />)</p>]]></description>
			<author><![CDATA[null@example.com (gaijin)]]></author>
			<pubDate>Thu, 01 Mar 2012 09:26:19 +0000</pubDate>
			<guid>http://forum.redump.org/post/37420/#p37420</guid>
		</item>
	</channel>
</rss>
