Batou wrote:

When my finances get better, I could go after the two Cottons, Tron ni Kobun and Silver Jiken. Likewise, if the games on this list are relevant to redump.org, I could give it a shot  smile

If you care, the most expensive imported Japanese games (which are cheap enough in Japan) are shmups; the most expensive PSX ones are Kyuin [SLPS-00214], Harmful Park [SLPS-00498] and Gaia Seed - Project Seed Trap [SLPS-00624].

1,427

(4 replies, posted in General discussion)

jfromeo wrote:

I guess it varies largely from one dump to another, but, on average, how much can Pakkiso + 7z compress a full and clean dump, in %?

Incorrect question, IMO. You should ask about the difference in % between zip/rar compression and pakkiso+7z.

Anyone with Japanese 'flashes'? There are only 33 of them (at least) wink

1,429

(2 replies, posted in General discussion)

Any class, but with mods' approval.

1,430

(11 replies, posted in General discussion)

And does EAC recognize the overread on this drive properly? Does it _really_ work? I have a TEAC drive that overreads into both lead-in and lead-out; PerfectRip works fine, but EAC acts like there's no overread (many zero samples instead of the actual data at the end of the very last audio track, 2-3-4 rows of errors and a "seek error").

1,431

(11 replies, posted in General discussion)

Maybe your drive doesn't support overreading into the lead-out and the combined offset for this disc is positive? If so, that's normal - this is why overreading is a must for a good dump; you won't be able to dump this disc with this drive.

You provide too few details about the performed commands. What was the subchannel reading mode? 001? 010? 100?
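Those three values are the sub-channel selection bits of the MMC READ CD (0xBE) command. A minimal sketch of the 12-byte CDB, written from my reading of the MMC spec (the field layout is an assumption, not taken from any particular tool):

```python
# Sub-channel selection values for READ CD byte 10 (the 001/010/100 above).
SUB_NONE = 0b000   # no sub-channel data
SUB_RAW  = 0b001   # raw interleaved P-W (96 bytes per sector)
SUB_Q    = 0b010   # Q sub-channel only (16 bytes per sector)
SUB_PW   = 0b100   # de-interleaved, error-corrected P-W (96 bytes)

def read_cd_cdb(lba: int, sectors: int, sub_mode: int) -> bytes:
    """Build a READ CD CDB requesting full 2352-byte raw sectors."""
    return bytes([
        0xBE,                      # READ CD opcode
        0x00,                      # expected sector type: any
        (lba >> 24) & 0xFF, (lba >> 16) & 0xFF,
        (lba >> 8) & 0xFF, lba & 0xFF,
        (sectors >> 16) & 0xFF, (sectors >> 8) & 0xFF, sectors & 0xFF,
        0xF8,                      # sync + headers + user data + EDC/ECC
        sub_mode,                  # the sub-channel selection in question
        0x00,                      # control
    ])

cdb = read_cd_cdb(lba=0, sectors=1, sub_mode=SUB_RAW)
```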

Also, you won't ever be able to get a proper subchannels dump this way - the subchannels dumping tool should be standalone, because taking a complete dump takes hours (and you should get at least 2 matching dumps on at least 2 different drives for a somewhat reliable result).

1,433

(5 replies, posted in General discussion)

Yes, and I've told him that I won't answer in our native language anymore.

1,434

(5 replies, posted in General discussion)

Video discs are only added as part of some set. If you have the complete Dino Crisis 5th Anniversary, dump all three discs (two game discs and one video disc), add all three with Edition: Dino Crisis 5th Anniversary, and for the video disc put (Bonus DVD) after the title. There's nowhere to add promos yet. It's best to dump with IsoBuster with AnyDVD installed (AnyDVD disables the protection and IsoBuster takes a full image). Anyway, people don't speak Russian here - at least use a translator.

Companies aren't that stupid: if there's a checksum, there are releases somewhere - that's why MAME follows the "3 years" rule and excludes individual games by request. Hosting .torrent files is also legal (theoretically), but many trackers are constantly being sued (even MiniNova, which isn't even a tracker). I don't see a reason to be so brave; we should exercise some caution to prevent possible unpleasant consequences.

SecuROM ones apparently should be dumped in 2064 mode; there's no tool yet. StarForce ones can be dumped with anything, but they won't work smile Topology can't really be called "data", so we don't plan to dump it. Also, I don't like the idea of dumping very recent titles (3 years old and newer) - this is piracy, not preservation (recent titles aren't going extinct or anything).

Of course there are random errors in subchannels - some are already pressed onto the disc, and some occur while reading them. That's why fixing the subchannels isn't a perfect idea in terms of preservation.

About error correction - I'm still waiting for you on the IRC channel (or ICQ) to discuss the details and algorithms. IMO, discs should be dumped in scrambled form via D8, then descrambled in software (to bypass any firmware error correction). No C2 errors - good dump.
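For reference, the software descrambling step is just an XOR: ECMA-130 Annex B defines the scrambler as an LFSR with polynomial x^15 + x + 1 seeded with 0x0001, applied to bytes 12..2351 of a sector, and XORing a second time undoes it. A minimal sketch (my own implementation of the standard, not code from any existing tool):

```python
def scramble_stream(length: int) -> bytes:
    """Generate the ECMA-130 Annex B scrambling byte stream."""
    reg = 0x0001
    out = bytearray()
    for _ in range(length):
        b = 0
        for i in range(8):
            b |= (reg & 1) << i
            fb = (reg ^ (reg >> 1)) & 1      # feedback taps: bit 0 XOR bit 1
            reg = (reg >> 1) | (fb << 14)
        out.append(b)
    return bytes(out)

def descramble_sector(raw: bytes) -> bytes:
    """XOR bytes 12..2351 with the stream; the 12-byte sync is untouched."""
    table = scramble_stream(2340)
    return raw[:12] + bytes(a ^ b for a, b in zip(raw[12:], table))
```

Since XOR is its own inverse, the same function rescrambles a descrambled sector - which is also why rescrambling can't recover abnormally mastered sectors: whatever the firmware mangled stays mangled.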

Probably this game was burned and dumped (with garbage) at the mastering stage, then burned back. The last 90 bytes of garbage should match the second-to-last 90 bytes of garbage. If so, the combined offset is 1206 and the first audio track really does contain 90 bytes of garbage as part of the gap.

ssjkakaroto wrote:

Feltzkrone, may I suggest that you go for a Java command-line version, because in the future it would be easier to port the program to other OSes.

Unless he's interested in modifying PerfectRip (which is written in Delphi) instead of writing everything from scratch.

Yes, ECM won't touch any unusual/corrupted/whatever sectors, audio sectors, etc. I haven't checked its source, but I'm sure it reads 2048 bytes from each sector, then generates the ECC/EDC fields, then compares them with the ones in the sector - if they match, it cuts them out and leaves a mark that they should be regenerated during the UNECM routine. Anyway, it's a safe tool.
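The EDC comparison assumed above can be sketched like this - this is the standard CD-ROM EDC (a 32-bit CRC with reflected polynomial 0xD8018001), not ECM's actual code, and the Mode 1 layout here is from the spec:

```python
def edc(data: bytes) -> int:
    """CD-ROM EDC: bitwise reflected CRC-32, polynomial 0xD8018001."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (0xD8018001 if crc & 1 else 0)
    return crc

def edc_matches(sector: bytes) -> bool:
    """Mode 1: the EDC at offset 2064 covers bytes 0..2063 (sync+header+data)."""
    stored = int.from_bytes(sector[2064:2068], "little")
    return edc(sector[:2064]) == stored
```

If the recomputed value matches the stored one, the field is redundant and can be stripped and regenerated losslessly; if not, the sector is left untouched - which is exactly why the tool is safe on corrupted sectors.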

Feltzkrone, try to join our IRC channel and contact me there.

Feltzkrone wrote:

If this really was the case, why don't we already have one? wink

There's no one to code it. I don't have enough time, and there are no volunteers at all.

Feltzkrone wrote:

I only wanted to provide a tool which would detect special cases of pregap layouts and simplify the task of determining the combined offset. What's bad about it? Just that we still don't have an all-in-one tool then?

You offer to write a tool which should be able to read some sectors, read the subs, correct the subs, analyze the subs, calculate the data offset, calculate the subs offset - man, you just need to read all the sectors instead of some gap and split the tracks - voila, a good dump. "I only wanted to provide" - you're welcome to provide any tool smile

Feltzkrone wrote:

Don't get me wrong, but if such a tool is really wanted, you are one of those few people who will have to contribute their knowledge. Coding (half-)blindly might produce a usable all-in-one dumping tool, but it won't be perfect then and will cause bad dumps; that's why the knowledge needs to be spread.

Of course; I've already offered this (my ideas/algorithms, me as a tester, but someone else as the coder) - no one is interested.

Feltzkrone wrote:

For example by posting background info on how certain things found on a CD should be detected and interpreted, how errors in audio extraction could be detected and compensated (i.e. figuring out what is so 'magical' about EAC).

Feltzkrone wrote:

If you don't mind I'd like to alter the topic subject for something like "Technical discussion for a future dumping tool" and - as the new subject denotes - we could talk about technical backgrounds, tricks and abnormalities here - that the tool should be able to handle properly.

No one is interested.

Feltzkrone wrote:

Also we would have to discuss if the tool should still rely on CUE/BIN, as with this format mixed-mode pregaps cannot be preserved properly. Subchannel data would have to be preserved as well where it is non-standard. Apart from that I wouldn't mind discarding Sync and ECC/EDC info in data sectors where they are built the standard way - finally giving an image of the CD which contains user data and abnormalities, from which (if needed) a full and clean CCD/IMG/SUB as well as a clean CUE/BIN can be reconstructed. What do you think about that?

Yes, with the .sub dumps it's possible to add a checksum for a single .img file; a ccd can be generated from the .sub file (in 99% of cases, at least - to cover 100%, we should also dump the TOC into a standalone file and generate the ccd and cue based on both the TOC and the .sub dump, not only the .sub). This would make many stupid people happy who think that our split dumps are bad and CloneCD ones are perfect.
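One building block of generating a ccd from a .sub dump is validating the Q sub-channel of each sector. A sketch under these assumptions: de-interleaved 96-byte sub blocks with Q at bytes 12..23 (10 data bytes plus a CRC-16, poly 0x1021, init 0, stored inverted on disc - though whether a given drive/tool delivers it inverted in the .sub file can vary):

```python
def crc16_q(data: bytes) -> int:
    """CRC-16/CCITT as used by the Q sub-channel (poly 0x1021, init 0)."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021 if crc & 0x8000 else crc << 1) & 0xFFFF
    return crc

def q_is_valid(sub96: bytes) -> bool:
    """Check the Q CRC of one de-interleaved 96-byte sub block."""
    q = sub96[12:24]
    stored = int.from_bytes(q[10:12], "big")
    return crc16_q(q[:10]) == (stored ^ 0xFFFF)
```

Only sub blocks whose Q passes this check should feed the generated ccd; anything failing it is a read error or one of the pressed errors mentioned above.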

The offset only applies to audio - I repeat, any drive fixes it automatically for data tracks. Theoretically, you can find the offset value, but the only thing you can do with it is add it to the entry; you can't use it in the dumping process.

D8 or swapping with an audio CD. It's not critical, though - only if you don't like this field being empty in your entry (drives fix the offset for data tracks automatically).

Feltzkrone wrote:

What's far more annoying is that the current dumping guide for CD-based games simply is not sufficient for about a quarter of the Mixed-Mode CDs I tried it on.

There are many more "special" cases, actually - corrupted scrambled data sectors (different drives give different descrambled results), audio CDs, multiple postgap indexes, special flags (like DCP), multisession CDs (Jaguar CD games, Dreamcast MIL-CDs, some PC games); also many ongoing problems - Neo Geo CD protection (subs), CD-i Ready titles, etc, etc, etc...

Writing a basic proper dumping tool wouldn't take much more time, so I don't see much point in this.

First of all, no need to use bold - I've marked some words in my post to notify all the newbies that your guide isn't perfect and it's not that easy.

Feltzkrone wrote:

And what if those data sectors are marked as audio? (All following questions refer to audio-marked data sectors...)
1) Are they still in scrambled form on CD or might that depend on the mastering?

Sure. Unscrambled sectors are very rare, but it happens sometimes (see the [SS] Sakura Tsuushin entry, for example).

Feltzkrone wrote:

2) Does any drive unscramble them automatically if they are scrambled (= drive ignores that they are marked as audio)?

When you dump the first data track, the data sectors from the pregap at its end will be descrambled (same for CloneCD dumps, etc.). Not sure what happens when you extract them as audio, though. EAC tweaks the first gap, excluding those sectors. PR extracts them totally wrong (not scrambled, not unscrambled, just screwed).

Feltzkrone wrote:

3) If they are not automatically unscrambled, does the factory write offset apply when reading them with READ CD commands, i.e. data is shifted when read, i.e. sync marks are not at the beginning of the returned sector data?

A usual READ CD command should return them unscrambled, because they "belong" to the previous (data) track according to the drive's firmware's logic - I've already explained this. The next track "officially" starts from the 01 index according to the TOC; the 00 index belongs to the same track according to the subs, but following the TOC ignores this.

Feltzkrone wrote:

4) How should the data be kept in the image? Scrambled or unscrambled, or depending on certain circumstances?

I repeat: in my opinion, all the sectors should be kept scrambled (even the data tracks), because that's how they are stored on the CD. But in the current situation, data sectors marked as audio should be scrambled and data sectors marked as data should be unscrambled. In any case, there should be a proper comment in the dump's entry describing all the abnormalities.

Feltzkrone wrote:

When you are saying that subchannel data analyzing is necessary in both cases, isn't it that subchannel analysis is first necessary to distinguish both cases from each other and after that (again) necessary to figure out the number of sectors that actually are marked as data?

Yes, you should find the proper gaps in the subs first, then check the mode (audio/data) of the gap sectors, then count the number of data sectors in the gap. Btw, some of the gap's sectors may be marked as data in the subs and some as audio (I don't have any examples yet, but I can't exclude the possibility).
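The steps above can be sketched roughly like this, assuming a list of de-interleaved 96-byte sub blocks covering the pregap region (the function name and structure are mine, purely for illustration; the control field is the upper nibble of Q byte 0, with bit 0x4 marking a data sector):

```python
def analyze_gap(sub_blocks: list[bytes]) -> dict:
    """Count data vs audio sectors among index-00 (gap) sectors."""
    data_sectors = 0
    audio_sectors = 0
    for sub in sub_blocks:
        q = sub[12:24]                   # Q channel of a de-interleaved block
        control = (q[0] >> 4) & 0x0F     # upper nibble: control bits
        index = q[2]                     # Q INDEX byte
        if index != 0x00:                # only index 00 sectors form the gap
            continue
        if control & 0x04:               # data-sector flag
            data_sectors += 1
        else:
            audio_sectors += 1
    return {"data": data_sectors, "audio": audio_sectors}
```

A mixed result (both counts non-zero) would be exactly the hypothetical case mentioned above - part of the gap marked data, part marked audio.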

Feltzkrone wrote:

How about a tool that just automates pregap (including its subchannel data) analyzing and prints the results - similar to Px_D8 which just prints the combined offset?

What results, exactly?

Feltzkrone wrote:

EDIT: If I may ask in all innocence - are F1ReB4LL and Rocknroms the only ones willing to discuss and clarify more complex cases like these or are other members, moderators and admins just very busy at the moment? (Don't get me wrong - no offense!)

Jackal is also able to do some research, Dremora in some rare cases... Themabus is more on the hardware side (Saturn ring tests, etc.). I don't remember anyone else.

I repeat, there are CDs where those sectors are marked as data in the subs (the track is audio, the sectors are data AND marked as data in the subs) - I insist on leaving them descrambled, and this case requires different handling. Here you should extract the data part of the gap from the CloneCD dump, skipping the 1st track - the first 176400, 352800 or 529200 bytes, etc. (because data sectors rarely fill the _whole_ gap); then you should extract the audio part of the gap and the track itself, skipping [1st_track_size + combined_offset_size_in_bytes + data_part_of_the_gap_size]; then glue both parts together. Of course, subchannel analysis is necessary in both cases.

1,448

(27 replies, posted in General discussion)

Rocknroms wrote:

And I repeat, those sectors are not garbage; garbage is something else: wrong offset detection, or bytes added by the firmware or a program - bytes not present on the CD.

I've never said those sectors are garbage. I've said that a data track with an audio gap glued onto its end has garbage between the descrambled data sectors and the audio sectors, and even if it were possible to disable descrambling for an audio gap containing data sectors, that gap would be incomplete because of that garbage.

1,449

(27 replies, posted in General discussion)

Rocknroms wrote:

This http://redump.org/disc/8047/ is the only exception; the other one is the same as my examples - that is not garbage unless it has a wrong offset.

Wrong, you can't dump WWF by reading the descrambled sectors and rescrambling them - this won't give you the proper image.

Rocknroms wrote:

About the other point: you don't fix anything, you simply speed up the process by importing the same sectors from an empty image instead of redumping them with cdtoimg or a trap disc. There's nothing to fix or modify - the disc always has the same structure; if it's mode1, for example, all empty sectors will have the same header at the same position for any mode1 disc, unless the TOC is fake.

Please, don't ever generate any data - every byte should be read from the CD, over. Your method is only good for converting random dumps to match our dats, but if you're gonna add such semi-generated dumps into the db - I'll kill you. You can't say that a sector on a particular CD is good and doesn't have any mastering errors without reading it - you can only assume that, and assuming is always bad, especially when you claim that the dump comes from the actual CD.

1,450

(27 replies, posted in General discussion)

Rocknroms wrote:

a) Plextor, cdtoimg and chopfile or b) Swap trick, CD tool and chopfile

Don't you have to rescramble something in both situations? Or not? Don't you have to use descramble_CDDA or something similar?

Nope, no need to. cdtoimg or the swap trick + CD tool give you the scrambled sectors; you just dump them and that's all. And I mean cases like http://redump.org/disc/7077/ or http://redump.org/disc/8047/, which contain abnormal scrambled data sectors that sometimes descramble correctly (by the drive's firmware, on some drives). Rescrambling them back won't give you the proper sectors. That's why I say that in any case of incorrect mastering you should dump the scrambled sectors properly, because they can be abnormal.

Rocknroms wrote:

And I repeat again: descrambled data can be wrong, OK; if so, you can use sectors from an empty mode1/2 track to fix it (or simply use that track to create the scrambled sectors).

We don't fix anything, we preserve the data "as is". Or am I misunderstanding you?