176

(43 replies, posted in General discussion)

thank you cHrI8l3, that does sound intriguing

so in 'Arc 6' there are ECMs of all 8 images -> diff
decompression and reversal of the diff on all 8 images would complete in 7-8 minutes, leaving the ECMs, right?

can you make a filter chain then, that would produce one ECM and diffs, similar to 'ImageDiffs+ECM+7z' ?

177

(43 replies, posted in General discussion)

ok, i'll clean up the hdd, fetch those images and test for myself
asap

symmetric compressors are used in NanoZip

i thought it was the same program, like a mode or something
so there are 2?
what about records with FreeArc+NanoZip, then? are they compressed twice, like zip+arj?

edit:
i'm basing my statements on what you write
i've asked you about the only comparable value in that table - whether it's slower, and you said it is
that 'fast' part you added later
and after that you wrote that FA extracts in 7 minutes,
...which is about the same as the ~10 minute compression time you gave (7 ≈ 10) = symmetrical
had you written - FA extracts in 1 minute - i'd think it's fast (as 7z extracts in 2),
but you wrote - in seven

so what would be the decompression speed difference on exactly the same files, 7z vs FA?
have you tested?

edit:

i don't mean that 7z is the ultimate archiver btw.
if there's one faster at the same or better ratio (like TAK vs APE & FLAC) - by all means

so if FreeArc does that - it's great
if you can talk the author into integrating ecm then - it's even better

178

(43 replies, posted in General discussion)

unecm "Final Fantasy IX (E)(Disc 1 of 4)[SLES-02965].bin.ecm" ~75 sec
imadiff "Final Fantasy IX (E)(Disc 1 of 4)[SLES-02965].bin" into version "(F)(Disc 1 of 4)[SLES-02966]" ~5 min

are you sure it's 5 minutes?
i have the whole PCE set in ImageDiff form - it's always a matter of seconds on my machine
those are small ImageDiffs, however - rarely larger than 10 megabytes

md5 on a 600mb file (fsum.exe) completes in ~20 seconds

if ImagePatch is slow, it's the implementation's fault then, as in theory it should be faster than unecm
ecm keeps a map of sectors where ecc must be recreated
all a program like ImagePatch needs, imho, is a map of sectors to insert and those sectors themselves
it's practically 'copy /b file1+file2 file3'
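
roughly something like this in python (note: the patch format here is made up just to show the idea - it's not ImageDiff's actual format - and it assumes both versions have the same sector count, so 'insert' means overwrite in place):

import shutil
import struct

SECTOR = 2352  # raw CD sector size

def apply_sector_patch(base, patch, out):
    # hypothetical patch format: repeated <u32 sector_index><u32 sector_count>
    # records, each followed by count*2352 bytes of replacement sectors
    shutil.copyfile(base, out)                 # bulk copy, like 'copy /b'
    with open(patch, 'rb') as p, open(out, 'r+b') as o:
        while True:
            hdr = p.read(8)
            if len(hdr) < 8:
                break
            index, count = struct.unpack('<II', hdr)
            o.seek(index * SECTOR)
            o.write(p.read(count * SECTOR))    # drop differing sectors in place

that's one sequential read of the patch plus a file copy - it should be hdd-bound, nowhere near 5 minutes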

as you said yourself, FreeArc uses a symmetric compression algorithm, which 7z does not
i don't think it's ultrafast - it can never be faster than 7z
it can compress better or be more convenient
but the same files should extract faster from 7z

yes i'm theorizing
i'm out of space now and basically cannot check anything, so i have to trust you

so the time results you provided earlier

unpack 650mb from 350mb 7z ~80 seconds

what are they from?

i mean if it's a single file (or two) extracted from a 7z containing one .ecm and ImageDiffs,
the whole archive should decompress only a little slower, as it is most likely solid

so what we have is .ecm and .ImageDiffs in 2 minutes
vs one ecm (or several) in 7 minutes

now if ImagePatch is the bottleneck here, it should be possible to replace it with a less complex patcher, and it should fly

or do i not understand something?

- rebuilded data is stored in cache and then written onto hdd

ok, maybe you think the author will implement ecm and joining of files in FreeArc, and everything will be done in RAM (5gb)
and then written to HDD, so time on HDD access will be saved
but it's overkill imho - 5gb of RAM is not sane, and the same could probably be done with a ramdisk

i mean it's all good to have all those features in one program, but the decompression algorithm itself is way slower
so that's where it ends, imho

edit:
i don't mean that 7z is the ultimate archiver btw.
if there's one faster at the same or better ratio (like TAK vs APE & FLAC) - by all means
i just don't think FreeArc is that case, unfortunately

179

(43 replies, posted in General discussion)

current stats:

"Sony PlayStation (2484) (2009-04-05 15-16-43).dat"
Records in DB total: 2484
Records with Audio : 774
------- Size -------
Total: 1298980571256
Data : 1169375041272
Audio: 129605529984

"Sega CD - Mega-CD (95) (2009-02-23 19-41-59).dat"
Records in DB total: 95
Records with Audio : 94
------- Size -------
Total: 43296314544
Data : 25283397888
Audio: 18012916656

audio makes up only 10% of PSX data actually,
though it's off a little, since lately i submitted about 100 CDs without CDDA by selection,
and others might be doing the same since CDDA dumping is such a drag
but anyway, for older consoles it's far more significant
so i guess extraction from PackIso might on average be slightly faster than from a merged set for PSX
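
the shares fall straight out of the stats above (python):

psx_audio, psx_total = 129605529984, 1298980571256
scd_audio, scd_total = 18012916656, 43296314544
print(psx_audio / psx_total * 100)   # ~10.0% audio for PSX
print(scd_audio / scd_total * 100)   # ~41.6% audio for Sega CD - Mega-CD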

edit:
PackIso is locked to 7z 4.53, which is unfortunate

4.54 beta      2007-09-04
-------------------------
- Decompression speed was increased.

4.58 beta      2008-05-05
-------------------------
- Some speed optimizations.

and it uses default settings, hence less compression,
but as i understand it, the compression mode for 7z does not significantly influence decompression speed
so i don't know - it's difficult to tell without measuring

edit:
i mean extraction of a single title
the whole merged set should decode a lot faster
but that's an unlikely scenario, imho

180

(43 replies, posted in General discussion)

unpack 650mb from 350mb 7z ~80 seconds

is this FreeArc?

ok, so i guess it's how fast a single version would extract from the 378mb FreeArc archive
could you please test extraction of the .ecm + one .ImageDiff from the 357mb .7z and 344mb .nz then
unecm would take the same time, and ImagePatch would be slightly slower than joining, i guess

edit:
noooo, wait
ummm it says 7z

that's what i thought:
1st: FreeArc - pretty slow
2nd: 7z + ImageDiff - fast

i mean even when you have the files decoded from the ImageDiff .7z, you'd still beat FA by about 4-5 minutes
plus compression is better

181

(43 replies, posted in General discussion)

I've been trying to explain that FreeArc will be soon able to UnECM (+ Join) data automatically after decompression - so the only thing user will need to do is click on "Decompress" to restore original .bin's, without further processing files...

ok, so it will be more convenient
but it's possible to make a frontend or a script for command-line applications - many do that now
there are quite a lot of programs (for cd recording, video/audio encoding/processing and so on)
that are actually only GUIs for command-line *nix applications

unpack 5.2gb (8 files) from 380mb FA archive ~7 minutes
unpack 650mb from 350mb 7z ~80 seconds
unpack 8 x 650mb from 2gb 7z ... 8*80 = :D

but for your example it's the 2nd case, right?
there wouldn't be a 2gb archive, it would be 350mb.
so ImageDiff would have 5 minutes to complete in, and it actually shouldn't be much slower than joining files.
(it basically is joining files)

edit:
so anyway such a merged set (7z+ecm+ImageDiff+tak) seems a very good idea to me
from the example above it's an 8x compression improvement over PackIso
it would of course be rare to have so many versions merged
but generally it would still save a lot of time and space
(even single title decompression from this set shouldn't on average be slower than PackIso, i think,
because of the speed gain from TAK; and if TAK implemented support for RAW audio data
sox.exe could be eliminated - it's used now to add/remove the RIFF header,
which means every audio track is basically copied after extraction,
which costs about the same as doing an ImageDiff)
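
(for illustration, the sox job in python - a minimal sketch that only handles the canonical 44-byte header; real .wav files can carry extra chunks, so this is not a drop-in replacement:)

import struct

def raw_to_wav(raw_path, wav_path):
    # wrap raw CDDA samples (44100 Hz, 16-bit, stereo) in a minimal RIFF header
    with open(raw_path, 'rb') as f:
        data = f.read()
    hdr  = b'RIFF' + struct.pack('<I', 36 + len(data)) + b'WAVE'
    hdr += b'fmt ' + struct.pack('<IHHIIHH', 16, 1, 2, 44100, 176400, 4, 16)
    hdr += b'data' + struct.pack('<I', len(data))
    with open(wav_path, 'wb') as f:
        f.write(hdr + data)

def wav_to_raw(wav_path, raw_path):
    # strip the header again - for the canonical layout it's a fixed 44 bytes
    with open(wav_path, 'rb') as f:
        f.seek(44)
        with open(raw_path, 'wb') as out:
            out.write(f.read())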
kudos to cHrI8l3 for that

182

(43 replies, posted in General discussion)

oh, i see - if you'd extract a single file from the archive
for ImageDiff you'd need to run it through ECM and ImageDiff
but for FreeArc only ECM (and then join the files)
but decompression should still be notably longer for FreeArc, since it's 5.5gb vs 700mb
so what's lost on ImageDiff should be regained on decompression

183

(43 replies, posted in General discussion)

creating Diffs and restoring original game from those takes a lot of time and user effort, even more than compressing/decompressing

it would be strange, if so
- when decoding with ImageDiff it would only insert certain pieces of data - it should be really fast
there would be only one ECM file (Reed-Solomon calculation), and the decompression algorithm (the most complex part)
would need to process far less data
- when extracting the whole set from a single archive there's ECM for every file, and the decompression algorithm would process more data

imho it doesn't really matter what the compression speed is as long as it's sane, since it's done only once

and when it comes to audio... imho Ape is a good format, it is fast, well known and supported by many other software, yes you can get better ratio with TAK... but so what ? you can get even better (+ faster) with LA

well, from what i read, it's anything but fast and isn't that widely supported either
FLAC is faster (about 2-4 times) and more widely used, but compression is slightly worse (2-3%)

compression:
http://flac.sourceforge.net/comparison_all_ratio.html
http://synthetic-soul.co.uk/comparison/lossless/
hardware/software support:
http://wiki.hydrogenaudio.org/index.php … comparison
http://flac.sourceforge.net/comparison.html
http://en.wikipedia.org/wiki/Comparison_of_audio_codecs

TAK is experimental - true, but it offers the compression of APE at the speed of FLAC

LA is very, very slow

184

(43 replies, posted in General discussion)

Doesn't 7zip have a greater compression over RAR

rar would compress audio better, so it makes sense if all tracks go into a single archive.

i honestly don't think the ability of torrentzip to produce identical archives is something to worry about either.
how many people really use that? 20 maybe, maybe less.
PackIso isn't bad, but surely if a better compression ratio/speed combination can be found, why not?
(for instance: replacing ape with tak would improve it a lot)
that's what really matters - fast download/decompression, not the ability to join in and seed, which is pretty much useless, imho.
what really happens is: somebody uploads a file and the rest of the people fetch it and seed, that's all.
nobody cares about torrentzip.
so if somebody recompressed everything - it wouldn't be a big deal, imho.

185

(43 replies, posted in General discussion)

3) Now he shares the pars with other testers to see how many blocks are needed.

about equal to the archive size, i guess, which would be huge

186

(43 replies, posted in General discussion)

- no need for storing by ImageDiff when merging with repetition filter is much more conveniant
- with splitting you can achive amazing results with cost of convenience...

dunno, to me the ImageDiff results look most appealing
the decompression speed difference would determine which of the two i'd consider best

splitting into parts adds another layer, the same as ImageDiff does, and both speed and compression are worse
otherwise it's just too slow, imho.

when i said years i meant that a single game from such a merged 7z+ecm+flac (or tak)+ImageDiff set
should decompress in about 2-3 minutes or so
it's the most frequent scenario, nobody really needs the whole set decompressed at once,
and ImageDiff would still be faster than 10 minutes, i guess.
so, very roughly:
let it be 20 minutes of overhead per game
let it be 100 such games (or decompressions a user commits)
and they're shared by 1000 people
100 * 20 min = 33 hours ≈ 1.4 days
1.4 days * 1000 = 1400 days ≈ 4 years
sure they would download faster, but downloading is done passively in the background
whereas decompression takes full PC load and needs interaction from the user

187

(43 replies, posted in General discussion)

- 375mb with ECM+FreeArc/LZMA in ~30 minutes
- 354mb with ECM+FreeArc/NanoZip in ~40 minutes

so it's 30-40 minutes to decompress those?

it's still too much imho
maybe for archiving, when kept to oneself - accessed infrequently and when space is really an issue
but otherwise (if such archives were distributed)
those minutes multiplied by thousands of CDs and thousands of copies would turn into years

188

(43 replies, posted in General discussion)

- 358mb with 7-Zip (one ecm'ed version & ImageDiffs)
- 344mb with NanoZip (one ecm'ed version & ImageDiffs)

it (NanoZip) would decompress slower than 7z, though
right?

189

(50 replies, posted in General discussion)

well, i went through the Saturn BIOS CD boot sequence with a disassembler
not all the way up to the AIP launch, but i'm almost there
and it doesn't look like it cares at all about those keywords
neither 'Maker ID' nor 'Product Number'
it does check, however, 'Hardware Identifier', 'IP Size', 'Security Code',
'Compatible area symbols' and 'Area Code Group'

so i'd guess it's probably in the CD unit itself
somewhere near the protection - warm and cozy

but don't take my word on this
if somebody is interested in the subject - it's better to check for yourself

190

(50 replies, posted in General discussion)

i wrote a summary in the 1st post
http://forum.redump.org/post/10732/#p10732

maybe somebody feels like experimenting with 90 or 99 min CDs?
in a way, depending on which characteristics are evaluated,
they would replicate the protection ring with less distortion than 74 or 80 min ones
e.g. a 74min CD would gain ~52645 channel bits per 1460 revolutions, while a 99min one only ~39884 (25% less)
though i don't think a Saturn would read a 99 at all, maybe a 90

i haven't tried any of those myself, since such RWs don't exist

hi deksar

deksar wrote:

The question is why it saves in .ISO extension. (Track 01.iso)

it's only an extension, with no further differences associated
it's configurable under Options->Image File->ISO / BIN / TAO

deksar wrote:

The Redump database lists the games as "Track 01.bin". Should i rename it from Track 01.iso to Track 01.bin?

it isn't necessary - those are generic names generated by the server after parsing submitted logs

deksar wrote:

An another question: How can i create a .CUE file for that image file and for the other images i will dump?

for single track PSX games you could use a generic cue sheet, as it's always the same:

FILE "Track 01.bin" BINARY
  TRACK 01 MODE2/2352
    INDEX 01 00:00:00

for multitrack - use the .cue from EAC or PerfectRip

192

(50 replies, posted in General discussion)

there appears to be another value: '*EGAVIEWER' for Photo CDs
though maybe just the syntax is similar

edit:
the identifier '*EGASYSTEM' is also present on the DC System Disc, replacing CRC16 & Device Information

edit:
the boot process, roughly:
- check for the 1st instance of the '*EGA *EGASATURN ' identifier at user data offset 0 in the first 15 sectors
- if present, check the ring
- if that passes, carry on with Security Code and Area Code verification
...
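
(a quick python sketch of that 1st check - assuming a MODE1/2352 image, where user data starts 16 bytes into each raw sector; for 2048-byte sector images the offset would be 0:)

SECTOR, USER = 2352, 16              # raw sector size, sync+header before user data
IDENT = b'SEGA SEGASATURN '          # the '*EGA *EGASATURN ' identifier, * = S

def find_identifier(path, sectors=15):
    with open(path, 'rb') as f:
        for i in range(sectors):
            f.seek(i * SECTOR)
            sec = f.read(SECTOR)
            if sec[USER:USER + len(IDENT)] == IDENT:
                return i             # 1st sector carrying the identifier
    return None                      # none found - not treated as a Saturn disc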

the minimum to disable further ring checks (like the SD does) appears to be passing the 1st two checks,
plus properly filled Maker ID & Product Number fields
all the rest can be blank data - 0x00

also, i've tried replacing the identifiers with various strings that would yield the same CRCs for the most common models,
and a few other things, and they wouldn't pass, so i'd guess they're actually byte-compared for an exact match

193

(50 replies, posted in General discussion)

only images from UG.
they work, but since the ring can't be reproduced on CD-R as of yet, you'd need to do the swap once - for the 1st time.
i have a Japanese console though - i don't know about the others, maybe there is a difference.

194

(50 replies, posted in General discussion)

on system discs KD01 & KD02

so what matters is a valid System ID record in the System Area, and particularly -
the special identifier in its Product Number field: *EGASYSTEM
everything else - does not.
it can be any track layout, any content - as long as it's valid for Saturn it will function.

the Product Number field is where the serial usually goes,
so it's possible to change it with a hex editor to the value mentioned above for any Saturn CD
and it will disable repeated ring checks after the 1st validation - just like the system disc does - even after reboot.
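
(in python the edit would look about like this - the 0x20 offset of the 10-byte Product Number field inside the System ID is my reading of Sega's disc format docs, and the extra 16 bytes skip the sync+header of a raw MODE1/2352 sector, so verify against your own image before writing:)

USER = 16                    # sync+header of a raw Mode 1 sector
PRODUCT_NO = 0x20            # assumed Product Number offset inside the System ID
with open('Track 01.bin', 'r+b') as f:
    f.seek(USER + PRODUCT_NO)
    f.write(b'SEGASYSTEM')   # '*EGASYSTEM' (* = S), exactly 10 bytes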

the difference between KD01 and KD02 lies in the Maker ID field
it's '*EGA ENTERPRISES' in the 1st case - 1st party - and '*EGA TP' in the 2nd - 3rd party
this is no different from all normal CDs, and this value is kept somewhere in memory after the 1st validation,
so if you made a system disc out of a CD with serial T-????? (3rd party) it would function as KD02
a system disc made from GS-????? CDs (1st party) would work like KD01
(except that the game will still boot - the actual system disc does nothing after execution of the AIP - it ceases to function)
there appears to be no further distinction between manufacturers based on company code for 3rd party CDs,
so one should work for all.

after the System ID record there's the Security Code. it's as always, and the Area Code group is also unmodified.

and then there's the Application Initial Program, about 2 sectors; i'm not sure what it does
well, it displays a message on screen and appears to modify fonts, i think, but that's all i could figure out
maybe it sets up some additional functions for the system disc,
but it appears to play no role whatsoever in the boot process modification.

so this special value for the Product Number field is interesting.
maybe there are others? maybe there's one that would bypass the ring completely?
i'd guess it isn't stored directly for comparison, but rather something like a checksum is used.
(as an example: Dreamcast has a CRC16 calculated on Product Number+Version stored as a field of the System ID)
so if one could figure out where this validation takes place,
it should then be possible to determine whether there are any more such magic numbers.
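
(that Dreamcast check is commonly documented as CRC16-CCITT - poly 0x1021, init 0xFFFF - over the 16-byte Product Number+Version text, stored as 4 ASCII hex digits; a python sketch, with a made-up example serial:)

def crc16_ccitt(data, crc=0xFFFF):
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

print('%04X' % crc16_ccitt(b'HDR-0000  V1.000'))  # hypothetical product no. + version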

* = S

195

(5 replies, posted in General discussion)

it could indeed be used for such purposes, but then the redump.org database isn't quite fit for it -
it's still very incomplete, and no relations between releases from different regions are kept.
so if having CRCs isn't the aim, it would be far better to parse SonyIndex (also no regional relations, afaik),
VGRebirth (the best option, imho) or GameFAQs (quite messed up)

about regions and languages: i think those values are precompiled into the executable,
but the search options are filled with those actually used in the .dat
so they only appear to differ depending on the system

the ability to redefine languages in the .dat would be interesting and universal in a sense,
but i don't think that's the case, unfortunately.

196

(5 replies, posted in General discussion)

it's a great program, but unfortunately it doesn't look too compatible with redump.org
it looks like it only supports a single file per game, and languages & regions lack some values.
i haven't checked thoroughly though, maybe i'm wrong

197

(17 replies, posted in News)

i'd wait until TAK implements RAW audio
and redump.org moves to the no-intro naming convention
before uploading anything

198

(2 replies, posted in General discussion)

there's more to it:
when i ran into this, either the Plextor or the Lite-On was connected through the onboard IDE controller, along with the primary HDD.
i've changed IDE drives a few times and it would affect this issue to a certain degree,
so i was pretty certain it was something with the controller.
now i have the Plextor hooked up via a USB converter and the issue persists.
what's more interesting: the READ CD command returns modified sectors, but READ BUFFER right afterwards does not.
so the data in the Plextor's buffer is correct, but somewhere on the way out it gets mangled.

199

(22 replies, posted in General discussion)

how do they relate to tracks?
maybe they mark where audio cuts off and silence starts?
i.e. when you move gaps to the next track, is the last sector of the previous track the last one with samples?

then i second that.
indeed, ring code would be needed, at least to keep duplicates out.
it wouldn't help for my audio CDs at first though - they have completely different IFPIs or haven't got them at all,
but as it gets larger it would become more and more useful.

edit:
btw here's a neat excel db with Mould codes -> Countries -> Manufacturers
http://www.redbrick.dcu.ie/~griffin/ifpi.html
we could use a part of that information