26

(14 replies, posted in General discussion)

F1ReB4LL wrote:

That's an incorrect term. There's only 1 offset per disc, while you're talking about leftovers from earlier burning/dumping mastering stages. Those leftovers are physically present on the disc and need to be kept, since they belong to the disc data.

It does look to me like a mastering issue. I don't know the nature of it, but the reality is that the data "shifts" throughout the disc. I think "shift" is the correct term to use, as the data shifts in relation to its previous position. Also, there are some data leftovers, but those are incomplete sectors. If you analyze these transitional sectors, you will see that pairs of bytes within a sample gradually get screwed up, aligned to the sample size.
Propose a better term smile, we have to call it something.

F1ReB4LL wrote:

All 3 are wrong, since those leftovers often contain duped data and a plain descrambling will lead to duped sectors. Also, there are quite a lot of cases when those leftovers appear as a part of a first post-data audio track, like http://redump.org/disc/7986/

I don't think we are on the same page here. There are no duped sectors, and it's not random garbage. They are real damaged sectors with reasonable LBAs. Do you want to see some bytes? I believe I can demonstrate.

F1ReB4LL wrote:

The main dump should be always left as is, with anything unusual left scrambled.

This is (1) you're talking about; this is exactly what you want. But I am saying that this is not what DIC does today, and sectors from that damaged portion end up half scrambled, half descrambled.

27

(14 replies, posted in General discussion)

user7 wrote:

Which option produces the most functional dump?

Probably any option; transitional sectors are usually at the end of the track and most likely originally zeroed and unused.

EDIT: to clarify, the most functional one would be any of these 3 proposed options, but with the "--correct-offset-shift" option on - that's probably what you wanted to hear! smile

28

(14 replies, posted in General discussion)

As some of you are already aware, some CDs have a mastering issue where the write offset changes across the disc. For standardization purposes, I will be calling that an "offset shift".
Historically, we knew of a couple of examples, such as:
Philips Media Presents: CD-i Games: Highlights-Previews-Gameclips: http://redump.org/disc/74810/
The Cranberries: Doors and Windows: http://redump.org/disc/99290/
CD-I CD Interactive: http://redump.org/disc/97023/

While working on redumper's lead-in/lead-out dumping functionality for data discs, I noticed that the offset actually shifts multiple times, starting from the lead-in, and even propagates into the lead-out. Analyzing this for discs with a data track is possible because if the first disc track is a data track, the lead-in is usually scrambled empty sectors, and, respectively, if the last disc track is a data track, the following lead-out is also scrambled empty sectors.

Example for http://redump.org/disc/74810:
TOC:

track 1 {  data }
  index 00 { LBA: [  -150 ..     -1], length:    150, MSF: 00:00:00-00:31:01 }
  index 01 { LBA: [     0 ..   2176], length:   2177, MSF: 00:02:00-00:31:01 }
track 2 {  data }
  index 00 { LBA: [  2177 ..   2324], length:    148, MSF: 00:31:02-21:55:39 }
  index 01 { LBA: [  2325 ..  98514], length:  96190, MSF: 00:33:00-21:55:39 }
track A {  data }
  index 01 { LBA: [ 98515 ..  98612], length:     98, MSF: 21:55:40-21:56:62 }

offsets:

LBA: [ -2259 ..   -150], offset: -294, count: 2110
LBA: [  -149 ..   2175], offset: -613, count: 2325
LBA: [  2176 ..  98514], offset: -609, count: 96339
LBA: [ 98515 ..  98614], offset: -882, count: 100

As you can see from this example, the first offset shift happens between the lead-in and the pre-gap, and the others happen "between" tracks, although a little imprecisely. As the lead-out is internally just another track, the shift propagates there too.
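To illustrate how this analysis works, here's a minimal sketch (simplified, and not redumper's actual code) of per-sector offset detection. The 12-byte data sector sync is written to disc as-is (only bytes 12..2351 of a sector are scrambled), so you can scan raw "audio" data for it, descramble the BCD MSF header that follows to learn which sector you found, and compare where it is to where it should be:

#include <cstddef>
#include <cstdint>
#include <cstring>
#include <optional>
#include <vector>

// 12-byte data sector sync; written as-is, never scrambled
static const uint8_t SYNC[12] = {0x00,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0x00};

// ECMA-130 Annex B scrambler: LFSR x^15 + x + 1 seeded with 0x0001,
// applied to bytes 12..2351 of a sector
std::vector<uint8_t> make_scramble_table()
{
    std::vector<uint8_t> table(2340);
    uint16_t lfsr = 0x0001;
    for(auto &b : table)
    {
        b = 0;
        for(int i = 0; i < 8; ++i)
        {
            b |= (lfsr & 1) << i;
            uint16_t bit = (lfsr ^ (lfsr >> 1)) & 1;
            lfsr = uint16_t((lfsr >> 1) | (bit << 14));
        }
    }
    return table;
}

int bcd(uint8_t b) { return (b >> 4) * 10 + (b & 0x0F); }

// scan a raw byte buffer that is expected to start exactly at expected_lba and
// return the write offset in bytes (divide by 4 for samples), or nothing if no
// sync was found; sign convention here is simply "found minus expected"
std::optional<int32_t> detect_offset(const uint8_t *data, size_t size, int32_t expected_lba)
{
    auto table = make_scramble_table();
    for(size_t i = 0; i + 16 <= size; ++i)
    {
        if(memcmp(data + i, SYNC, sizeof(SYNC)))
            continue;

        // descramble the BCD MSF header that follows the sync to get the LBA
        int32_t lba = (bcd(data[i + 12] ^ table[0]) * 60 + bcd(data[i + 13] ^ table[1])) * 75
                    + bcd(data[i + 14] ^ table[2]) - 150;

        // difference between where the sector is and where it should be
        return int32_t(int64_t(i) - int64_t(lba - expected_lba) * 2352);
    }
    return std::nullopt;
}

Run this over successive regions of the dump and you get the offset table above; wherever the returned value changes, you've found a shift transition.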

Digging deeper, I discovered that there are many more such offset-shifting discs, and that most, if not all, PC data discs where a couple of the last data track sectors are "corrupted" (descramble failed) are actually offset-shifting discs. As redumper outputs detailed descramble statistics, I was contacted numerous times by different people, including our mods, to check a particular data dump log and make sure it was correct, and analyzing these cases I realized it's the same offset shifting issue.

Why is this important?
Every offset shift transition spans multiple sectors gradually, and due to some peculiar mastering detail that we don't understand yet, these sectors are randomly corrupted. Such corruption makes it difficult for dumping software to decide what to do with these sectors and whether to attempt to descramble them.
As my recent findings hint that there are a lot of such discs, the purpose of this topic is to standardize how we preserve such transitions, so that it follows redump.org preservation standards and is uniform across dumping software (which is, basically, DIC and redumper lol).

As of today, redumper dumps such discs with one global disc write offset, which is detected based on the first sector of the first data track (simplified). This is the default behaviour.
In addition to that, redumper provides an option "--correct-offset-shift", which follows offset shift changes; such a dump can be added to redump.org as a (Fixed Dump) edition. Regardless of whether this option is used, we need to standardize our handling of such transitions.

Here's how that can be handled:
1. Leave transitional sectors intact.
2. Force descramble of all transitional sectors.
3. Intelligently detect whether a sector is scrambled based on a combination of content criteria and, if it is, try to descramble it.

Right now, both DIC and redumper do a variation of (3). More often than not, this descrambles some sectors and leaves others intact, e.g. you get a mix of god knows what, and there is no way to recover scrambled content that is 1:1 with the original. In addition, redumper does it differently in a way that allows it to descramble "better", but that is not the point here. The point is that (3) doesn't yield consistent results, and those results aren't 1:1 aligned with the source (scrambled) material.
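To make the problem with (3) concrete, here's a toy version of such a content heuristic (my own illustration, assuming nothing about DIC's or redumper's actual logic; it reuses make_scramble_table() from the sketch above). Scrambling turns runs of zeroes into pseudo-random noise, so comparing zero-byte counts before and after descrambling is one plausible criterion, and it's easy to see why transition sectors, being partially shifted and corrupted, score ambiguously:

#include <cstddef>
#include <cstdint>
#include <vector>

size_t zeroes(const uint8_t *data, size_t size)
{
    size_t n = 0;
    for(size_t i = 0; i < size; ++i)
        if(!data[i])
            ++n;
    return n;
}

// sector: 2352 raw bytes; table: from make_scramble_table() above
bool looks_scrambled(const uint8_t *sector, const std::vector<uint8_t> &table)
{
    std::vector<uint8_t> d(sector + 12, sector + 2352);
    for(size_t i = 0; i < d.size(); ++i)
        d[i] ^= table[i];

    // if descrambling reveals many more zero bytes, the sector was probably
    // scrambled; on transition sectors both counts come out similar, so the
    // decision flips unpredictably from sector to sector
    return zeroes(d.data(), d.size()) > zeroes(sector + 12, 2340);
}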

On the other hand, (2) is the sweet spot, as it is consistent and the primary scrambled image can be reconstructed 1:1.

Finally, (1) is a compromise where we lose 1:1 but keep some sort of consistency.

I would like to hear opinions on this. Just please, let's keep on topic; I don't want the conversation to go elsewhere.

29

(3 replies, posted in News)

Lugamo wrote:

Do you know if MPF will add Redumper support in the future?

Yes, it's currently being built.

30

(3,534 replies, posted in General discussion)

sarami wrote:
superg wrote:

it belongs to the current track, not to the next one.

No. See other tracks.
It's not always but P channel of 1st sector of the track and 2nd sector of the track is different typically.

I understand your point that a P change is not always an indicator of a track change, no objections here. However, that is a supplemental thing here.

The main problem is that the Mode 2 Q (MCN) @ LBA 47437 separates track 15 and track 16. Here are the reasons why an MCN should always belong to the previous track:

1. When playing music, an ordinary audio CD player plays it from left to right; MCN/ISRC frames repeat at least once every 100 frames and are purely informational. The current track number is stored as the current state of the audio player, and it switches only when it changes in a Mode 1 Q, which will happen after the MCN/ISRC, as current track information is not stored in Mode 2 or Mode 3 Q.

2. A new track always starts with a Mode 1 Q. I think I saw that somewhere in the rainbow books; unfortunately, I cannot find it right now. This statement is implied throughout the documentation though; for instance, open the MMC-3 working draft, page 26:
"4.2.3.4.3 ADR=3 (0011b) – Mode-3 Q"
"The ISRC may only change immediately after the Track Number (TNO) has been changed."

I understand you're trying to do some guesswork based on the standard 150-sector gap size, or even to establish that P is shifted 1 sector for each track so the start is always shifted in relation to that, or something similar, but that is unsafe.
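To illustrate point 1, here's a minimal model of a player's Q state (my own sketch, not pseudocode from any standard): only an ADR=1 (Mode 1) frame can change the current track, so a Mode 2 (MCN) or Mode 3 (ISRC) frame is attributed to whatever track is already playing:

#include <cstdint>

struct QFrame
{
    uint8_t control_adr; // high nibble: control, low nibble: ADR (Q mode)
    uint8_t data[9];     // ADR=1: TNO, INDEX, RMSF, zero, AMSF; ADR=2/3: MCN/ISRC payload
    uint8_t crc[2];
};

struct Player
{
    int track = 0; // state: last TNO seen in a Mode 1 Q

    // returns the track number this frame is attributed to
    int feed(const QFrame &q)
    {
        if((q.control_adr & 0x0F) == 1) // Mode 1 Q carries positional info, BCD TNO
            track = (q.data[0] >> 4) * 10 + (q.data[0] & 0x0F);

        // Mode 2 / Mode 3 frames carry no TNO, so the state doesn't change:
        // an MCN frame like the one at LBA 47437 stays with the current track
        return track;
    }
};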

31

(3,534 replies, posted in General discussion)

sarami wrote:
bikerspade wrote:

A relatively recent dump of a particular PCECD disc (Metamor Jupiter) produced a bad track split, where the hashes for track 15 and track 16 are incorrect.

DB is incorrect. The pregap of the track 16 is 00:03:00, not 00:02:74.

LBA[047436, 0x0b94c]: P[00], Q[01150100386600103436d96b]{Audio, 2ch, Copy NG, Pre-emphasis No, Track[15], Idx[01], RMSF[00:38:66], AMSF[10:34:36]}, RtoW[0, 0, 0, 0]
LBA[047437, 0x0b94d]: P[00], Q[0200000000000000003767c1]{Audio, 2ch, Copy NG, Pre-emphasis No, MediaCatalogNumber [0000000000000], AMSF[     :37]}, RtoW[0, 0, 0, 0]
LBA[047438, 0x0b94e]: P[ff], Q[01160000027300103438dcb1]{Audio, 2ch, Copy NG, Pre-emphasis No, Track[16], Idx[00], RMSF[00:02:73], AMSF[10:34:38]}, RtoW[0, 0, 0, 0]

LBA 47437 belongs to the track 16.

sarami, LBA 047437 belongs to track 15. First of all, the standard says that if a track or index ends with a Q without positional information, it belongs to the current track, not to the next one. I'm too lazy to search through it, but trust me.
Next, check the P values from your subchannel decode: P[00] for LBA 047437 shows that it belongs to the previous track.
The dump in the DB is correct.

32

(20 replies, posted in General discussion)

Deterous wrote:

I've attached a zip of a list of serials for each BIOS. It would be useful for knowing which consoles supports which game in terms of backwards compatibility, etc.

Is the order of the serials untouched in these files? I wonder if we can deduce some information from the serials' initial ordering, e.g. whether they were grouped based on some criteria in the firmware.

33

(20 replies, posted in General discussion)

SLES-01227
SLES-11227

It can be Command & Conquer: Red Alert or
Command & Conquer: Red Alert: Retaliation

The only 2CD game whose regional releases were scattered around the same time frame.
Reference: http://redump.org/discs/system/psx/sort/serial/?page=7

34

(20 replies, posted in General discussion)

Jackal wrote:

00977 number is claimed by the bonus disc, so it seems very unlikely that an unknown RE2 release exists with the same number.

It's marked as such in emulator databases, but I doubt that's accurate.
Also, the second disc serial follows the exact same pattern as the other regions.

35

(20 replies, posted in General discussion)

More research:

Resident Evil 2 (English)
SLES-00972
SLES-10972

Resident Evil 2 (French)
SLES-00973
SLES-10973

Resident Evil 2 (German)
SLES-00974
SLES-10974

Resident Evil 2 (Italian)
SLES-00975
SLES-10975

Resident Evil 2 (Spanish)
SLES-00976
SLES-10976

Resident Evil 2 (XXX)
SLES-00977
SLES-10977

Looks like we are missing a Resident Evil 2 dump from an unknown European country.

36

(20 replies, posted in General discussion)

All right, I finally spent some time on this and identified some:

PSX Europe:
SLES-00977 - Resident Evil - Director's Cut - Bonus Disc (https://github.com/libretro/libretro-da … /ps1.idlst)
SLES-10977 - 2CD???

SLES-01227 - 2CD???
SLES-11227 - 2CD???

SLES-01894 - ???

PSX Japan:
SLPM-80639 - 2大ヒーロー スペシャルDisc 体験版 [2 Big Hero Special Disc Trial Version] (https://w.atwiki.jp/psemu/pages/124.html)
SLPS-00653

37

(20 replies, posted in General discussion)

Deterous wrote:

I've trawled through all known PS2 BIOS's and found a total of 784 mentioned serials, approximately 567 of which are PS1 and the remainder PS2. A few of the serials are not dumped and have unknown titles.

This is nice, thanks for doing that!
I'll try to check what's there a little bit later.

38

(20 replies, posted in General discussion)

Deterous wrote:

Way fewer (23 vs 198) serials listed in the PAL SCPH-9000x BIOS, at least that I could find.

I think that's because they eventually fixed some games' playability and removed them from the compatibility list.
It's good to know that earlier BIOSes contain more titles, though. If you have a better automatic way of extracting this, I'd suggest iterating over the entire PS2 BIOS catalog and merging the lists; it could be that some titles were added and then deleted, not 100% sure though.

39

(17 replies, posted in General discussion)

I think I already have your spreadsheet, Into. Unless you added a lot since then. Post the link here anyway.

40

(17 replies, posted in General discussion)

I updated the first post.

41

(17 replies, posted in General discussion)

Jackal wrote:

0. If there is no non zero data in pregap/lead-out, use 0 offset. Unless it's possible to manually detect the write offset with a reasonable degree of certainty, in which case combined offset correction can be used.

1. if there is non zero data in lead-out and that data can be fully shifted out of there (left) without spanning non zero data into pre-gap, correct offset with a minimum shift required

2. if there is non zero data in pre-gap and that data can be fully shifted out of there (right) without spanning non zero data into lead-out, correct offset with a minimum shift required

This is clear.


Jackal wrote:

Whenever a disc is dumped with offset correction, this should be documented in comments.

The non-zero offset will be specified in the ringcode entry; wouldn't that be enough?


Jackal wrote:

And then for the rare headache cases discussed in your last post where it's impossible to shift out data from lead-out/pre-gap (data is wider than allocated TOC space for it):

3. Use 0 offset and preserve relevant non-zero data in separate pregap.bin or leadout.bin. I don't see any advantage in trying to include this data with the main dump through a custom cuesheet format or whatever, but if it's decided otherwise, that's fine by me.

Yes, now I think this would be the best course of action. Separate files, sizes sector-aligned.


Jackal wrote:

And for the DC / PSX or other discs that have missing relevant TOC / pre-gap / lead-out data, we should also preserve this data in separate files (offset corrected if possible).

I already have this implemented in redumper, just have to walk over it and do some checks.


Jackal wrote:

As for offset matching and "universal" checksums: Audio checksum databases like AccurateRip and CUETools are already ignoring leading and trailing zero bytes, so they are essentially already storing "universal" checksums? I think this is beyond the scope of the Redump project and would require too much work and too many changes.

AccurateRip and CUETools are track-based, and as far as I know they do it mainly to match tracks - that is overkill for us.
What I was saying is not exactly that. In redumper, for audio CDs only, I can generate, let's say, a SHA-1 hash of the non-zero data span, one hash per disc. That would be in the log file. For a new submission of an audio disc, we add that to the comments, example:
Universal Hash: 958b5a41456c90cb5c6df8d676c3d2d90a940609 (-647)
For subsequent verifications of the same disc with a different write offset, these hashes will match, and that will be an indicator for us not to add a new disc but to add another ringcode line to the existing entry. Just don't tell me we have too many things in comments (we do), but out of all the stored and unneeded crap like volume labels, this particular thing would be the most useful.
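For illustration, here's a simplified sketch of the idea (with CRC32 standing in for the SHA-1 redumper would output, and assuming the whole image fits in memory): hash only the span between the first and last non-zero samples, so two dumps of the same disc that differ only by write offset produce the same hash:

#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// bitwise CRC32, standing in for SHA-1 to keep the sketch short
uint32_t crc32(const uint8_t *data, size_t size)
{
    uint32_t crc = 0xFFFFFFFF;
    for(size_t i = 0; i < size; ++i)
    {
        crc ^= data[i];
        for(int b = 0; b < 8; ++b)
            crc = (crc >> 1) ^ (0xEDB88320 & (0 - (crc & 1)));
    }
    return ~crc;
}

// returns {hash of the non-zero span, index of its first sample}
std::pair<uint32_t, size_t> universal_hash(const std::vector<uint8_t> &image)
{
    size_t count = image.size() / 4; // 4 bytes per stereo 16-bit sample
    auto is_zero = [&](size_t s)
    {
        return !image[s * 4] && !image[s * 4 + 1] && !image[s * 4 + 2] && !image[s * 4 + 3];
    };

    size_t first = 0, last = count;
    while(first < count && is_zero(first)) ++first;
    while(last > first && is_zero(last - 1)) --last;

    return {crc32(image.data() + first * 4, (last - first) * 4), first};
}

Two dumps match if the hashes are equal; the difference between their first-sample indices is then the relative write offset in samples.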


Jackal wrote:

Guess we still need to figure out how add the separate files in the database, with iR0b0t not around. Maybe resort to storing .dat or checksums in comments for now, similar to Xbox PFI/DMI/SS.

By the way, can we add extra files to the XML list but exclude them from the CUE-sheet? Would that work?

42

(17 replies, posted in General discussion)

Ok, here's another update based on the discs I purchased and dumped myself. Initially this was requested by Fireball, as he has experience with them, but these are really good shitty-audio use cases and some general info on what we can encounter.

Dracula: Music Collection
http://redump.org/disc/14890/

Nothing special about this disc other than the two masterings differing by offset; the same happens here: http://redump.org/disc/77301/
The specified possible write offsets +390 and +684 cut into tracks; I don't think they are relevant. The true offset should be in the perfect range [-4926 .. -3731], based on my redumper algo.

We can offset match such discs using two possible approaches:
1. Always shift each dump left-most or right-most, so that regardless of the offset, the dumps will match each other. The pro is that it's totally automatic. But a big con is that we would need to redump all audio entries, which is unrealistic.
2. Introduce something I'd call a "universal checksum". Basically, upon a successful dump, redumper calculates right-most (or left-most) data shift checksums and outputs crc/md5/sha-1 checksums the usual way: <rom offset="+123" size="68701920" crc="060bb712" md5="47393f188ff00fafbdf77b5eb771dbd3" sha1="ef991d90b284b0c92ab2b4eb0eb77942e32bb98c" /> and notes the offset value needed for the right-most/left-most shift. We store this information somewhere for future reference. Every time a potentially different-offset verification title is dumped, we compare universal checksums, and if they match, we add another ringcode line to the matched entry, with the offset deduced in relation to the previous entry.
The pro of this approach is that we don't dramatically change the way we dump compared to method (1), so already-added dumps stay the way they are. The con is that it's not 100% automatic.
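To make the deduction step concrete (hypothetical numbers, not from a real entry): if an existing entry's universal checksum was recorded with a right-most shift of -647 samples, and a new dump of the same disc matches it with a right-most shift of -353 samples, the two pressings' write offsets differ by 294 samples, and that deduced offset is what goes on the new ringcode line.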

Personally, I'm for (2): it is easy to implement, and we can set a precedent that will be used in the audio dumping world.


Tenbu Mega CD Special Mini Audio CD
http://redump.org/disc/6695/

This one is very clean: redumper shifts 13 samples left out of the lead-out, and everything still fits in the pre-gap nicely.


Micronet Music Collection Vol. 1
http://redump.org/disc/30335/

This one has a huge non-zero chunk (22006 samples, or 88024 bytes) in the lead-out. According to the proposed rules, we shift the data out of there, left by 22006 samples. This gets rid of the lead-out data but spills 16 non-zero samples over into the pre-gap. Not ideal, but it's close to the truth, and the perfect range for this disc is [+8423 .. +21155]. IMO the best solution, given that we preserve the whole data in one file.


Oyaji Hunter Mahjong
http://redump.org/disc/39873/

This is exactly as the comments say. I stand corrected: it is more horrible. There are 68 sectors of data in the lead-out, 150 sectors of data in the pre-gap, and ~670 sectors (1574524 bytes) of non-zero data in the TOC area before the pre-gap. I capture everything in redumper, and it seems to be consistent in the scram file. Offset 0 is used by default, and extracting leadout.bin as is gives the same checksums calculated by Fireball; everything matches. The 150 sectors of pre-gap data are fully preserved in Track 1, but what to do with the data in the TOC? I don't know. Well, in fact, I will propose a solution later, but that requires everybody to be open-minded smile


Other Considerations
Now, with all these examples in mind, I have a modified idea which will let us capture every byte and remain mostly redump compatible (including the site and the current DB).
What if we never shift audio, e.g. always use offset 0, but store spillover lead-in and lead-out data in separate tracks? Something like the pregap.bin / leadout.bin that we don't currently "preserve", but in a more generalized way.
This fits the lead-out in a very elegant way: internally, the lead-out is just another disc track with an AA track number, and it has all the track properties such as mode, data, positional subchannel etc. As in reality the lead-out track spans the rest of the disc, we trim all the zeroed data and make it sector-aligned. If there is no data in the lead-out, we don't create a file, and that satisfies 99% of all use cases; at the same time, we accommodate the case where there is something there. As two other big benefits, I see that we can preserve the Dreamcast logo data, which is the session 1 lead-out, and I sometimes see lead-out audio spillover on PSX discs, where it's not currently being preserved in any way. We can have the track defined in the CUE-sheet with all the appropriate properties, so this data will be preserved by "data hungry" preservationists, whoever they are.
A similar approach goes for a non-zeroed lead-in track. If it's empty, as it usually is in 99% of cases, it won't exist. If it does exist, it's zero-trimmed at the front and sector-aligned. No data is ever lost, redump track compatibility is at an all-time high as everything is tied to the CUE, and we add it to the website like a usual track list with hashes.


Oyaji Hunter Mahjong example:

FILE "Oyaji Hunter Mahjong (Japan) (3DO Game Bundle) (Track 1#00).bin" BINARY
  REM REDUMP LEADIN
  TRACK 00 AUDIO
    INDEX 00 00:00:00
FILE "Oyaji Hunter Mahjong (Japan) (3DO Game Bundle) (Track 1).bin" BINARY
  TRACK 01 AUDIO
    INDEX 01 00:00:00
FILE "Oyaji Hunter Mahjong (Japan) (3DO Game Bundle) (Track 2).bin" BINARY
  TRACK 02 AUDIO
    INDEX 00 00:00:00
    INDEX 01 00:12:45
FILE "Oyaji Hunter Mahjong (Japan) (3DO Game Bundle) (Track 3).bin" BINARY
  TRACK 03 AUDIO
    INDEX 00 00:00:00
    INDEX 01 00:09:60
FILE "Oyaji Hunter Mahjong (Japan) (3DO Game Bundle) (Track 4).bin" BINARY
  TRACK 04 AUDIO
    INDEX 00 00:00:00
    INDEX 01 00:11:63
FILE "Oyaji Hunter Mahjong (Japan) (3DO Game Bundle) (Track 4@AA).bin" BINARY
  REM REDUMP LEADOUT
  TRACK 05 AUDIO
    INDEX 01 00:00:00

Or variations on the naming/numbering scheme. I specifically chose # and @ for the filenames, as these symbols sort before and after the number entry, so you get a nice look, and this scheme supports multisession pre-gaps/lead-outs as we don't have to renumber anything.
We could simply use "Track 00" for the lead-in and "Track 05" for the lead-out, but there has to be a good way of supporting this for multisession discs, where there can be a session lead-out/lead-in between two tracks with adjacent numbers.
Or we don't have to add it to the CUE-sheet at all, but in my opinion, having it there ties all the files together for preservation. We could even have special redump CUE tags for that; there are plenty of ways.

43

(20 replies, posted in General discussion)

https://i.ibb.co/MsQGdRz/Screenshot-from-2022-11-11-18-52-31.png
From the PAL 70004 BIOS, all that's available.
This is most likely excluding the list of problematic games which are either disabled or patched. I checked a couple of serials; there are multi-disc games, for instance. The list obviously doesn't help much.
As most are unaware, PS1 games are half-emulated, because the PS2 doesn't have the PS1 GPU; hence many more games that are not included in the list have problems. As a good read, I recommend this PS1 emulation engineer interview: https://freelansations.medium.com/the-s … 39cf5a0353

Myria wrote:

* If a disk’s first TOC has no B0 entry, the TOC has a sufficiently small A2 entry, and the TOC’s own timestamps in sub-Q use positive encoding (00:00:00 start rather than 99:59:74 end) fake that there is a B0 entry around 08:00:00 or so.

There are no TOC B0 entries on 99% of CDs. I've seen them only on some multisession pressed CDs. From what I know, it's used for CD-Rs.


Myria wrote:

This allows dumping the “PRODUCED BY SEGA ENTERPRISES” area—yes, that area is actually readable in CD audio mode.  (It uses SafeDisc 2-like weak sectors.)

Can you elaborate? I've never heard of "weak sectors" before.

45

(17 replies, posted in General discussion)

Just an update on this: I've ordered a couple more Japanese audio CDs which have audio in the pre-gap/lead-out, which Fireball suggested checking.
They are on their way here. After I receive and redump them, I will report my findings here, and we will reiterate on the final audio CD rules and preservation format, as I really want this finalized.

46

(17 replies, posted in General discussion)

I want to try some CDs suggested by Fireball to see if we're covered there.

47

(17 replies, posted in General discussion)

Jackal wrote:

And where does this discussion leave us with discs like those PSX with audio in lead-out? I'm against appending lead-out data to the last track, because it's just not part of the main tracks. Also, I dont think we should shift audio data out of the lead-out for mixed mode discs, because the combined offset correction overrules it. So the only solutions for such discs imho is to put the data in leadout.bin or do nothing with it.

I totally agree with that. I purposely haven't mentioned it yet, to get Fireball's opinion on pregap.bin/leadout.bin and to focus on one issue at a time wink.
In all situations where we have an offset determined by a data track, we shouldn't extend the last track, so saving the lead-out separately for cases like this would possibly be the best solution?
TL;DR: shifting data out of the lead-out / pre-gap should happen only for discs where we can't figure out an offset in a deterministic way (based on data track sync/MSF or anything similar).

48

(17 replies, posted in General discussion)

Ok, so a cool-down period has passed; let's regroup.

Let's say we remove the perfect track split from consideration. I'll have redumper output the perfect audio offset range anyway, just for reference, but it will not be applied by default. The concept would be super useful for perfectionist audio folks, so I'm happy I have that implementation for them.

What's basically left for us @redump is to define a clear approach to handling audio discs with non-zero pre-gap and lead-out data.

I guess we have a consensus on the next two rules (a sketch of the minimum-shift computation follows the list):
1. If there is non-zero data in the lead-out and that data can be fully shifted out of there (left) without spilling non-zero data into the pre-gap, correct the offset with the minimum shift required.
2. If there is non-zero data in the pre-gap and that data can be fully shifted out of there (right) without spilling non-zero data into the lead-out, correct the offset with the minimum shift required.
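Here's a sketch of how that minimum shift could be computed (my own formulation of the two rules above, in sample-indexed image coordinates; all the names are mine, not redumper's):

#include <cstdint>
#include <optional>

// pre-gap occupies samples [0, pregap_end), the program area
// [pregap_end, leadout_start), the lead-out [leadout_start, ...);
// negative result = shift left (rule 1), positive = shift right (rule 2)
std::optional<int32_t> minimum_shift(int64_t first_nonzero, int64_t last_nonzero,
                                     int64_t pregap_end, int64_t leadout_start)
{
    // data wider than the allocated space: no shift can work (see below)
    if(last_nonzero - first_nonzero >= leadout_start - pregap_end)
        return std::nullopt;

    if(last_nonzero >= leadout_start) // rule 1: pull lead-out data left
        return int32_t(leadout_start - 1 - last_nonzero);
    if(first_nonzero < pregap_end)    // rule 2: push pre-gap data right
        return int32_t(pregap_end - first_nonzero);

    return 0; // nothing spills anywhere
}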

How do we handle situations where it's impossible to shift the data out of the lead-out/pre-gap (the data is wider than the TOC space allocated for it)? Several options:
1. Use offset 0 as a base, dump non-zero pre-gap data to pregap.bin, dump non-zero lead-out data to leadout.bin.
2. Fully shift data out of the lead-out if needed, dump non-zero pre-gap data to pregap.bin.
3. Use offset 0 as a base, prepend non-zero pre-gap data to the first track, append non-zero lead-out data to the last track.
4. Fully shift data out of the lead-out if needed, prepend non-zero pre-gap data to the first track.

My insight:
I don't like (1) or (2) because data preserved in external pregap.bin and leadout.bin files will usually be "lost", as it's unreferenced from the CUE-sheet, and I don't see a good way of linking those files to the rest of the cue/bin set.
I also don't like (1) or (2) because, in all the cases I've seen, non-zero pre-gap data genuinely belongs to the first track: it's either an HTOA index 0 entry or non-zeroed mastering "silence" which is still part of the first track.
Lastly, I couldn't find proof anywhere in the Red Book standard that the pre-gap data of an audio disc should be zeroed; I found such a requirement only for the pre-gap of a data track, in ECMA-130.

That said, I personally would lean towards (4).

Let me know what you think.

49

(3,534 replies, posted in General discussion)

ehw wrote:

Hey sarami I have a question about your .c2 format.

According to your readme, 1 bit inside the .c2 file represents 1 byte in the disc image. But there isn't any additional information besides that. I can assume that 0 means no c2 error and 1 means c2 error, but I'm having a difficult time trying to determine how the bits correlate to the exact locations of the bytes in the disc image.

Please see my reply here: http://forum.redump.org/post/99760/#p99760

ehw wrote:

I assume that the bits correlate to the bytes in LSB order. So since $01 is 00000001 in binary, there should be an error that affects just 1 byte around 0x618-0x61F, in this case the c2 error should be on 0x61F. But when comparing both image dumps, there are TWO bytes that differ rather than one.

C2 is usually big endian (MSB first), with some drive exceptions.
Also, you see two damaged bytes per 1 C2 bit set only for sectors that are scrambled; I guess this is how scrambling is implemented at the hardware level.
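For illustration, here's a small sketch of that MSB-first mapping (assuming 1 C2 bit per image byte and that the .c2 file is simply concatenated per-sector C2 blocks; check the tool's readme for the actual file layout):

#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

// bit 7 of c2[0] flags image byte 0, bit 6 flags byte 1, ...,
// bit 0 of c2[0] flags byte 7, then c2[1] covers bytes 8..15 and so on
void print_c2_errors(const std::vector<uint8_t> &c2)
{
    for(size_t i = 0; i < c2.size(); ++i)
        for(int bit = 7; bit >= 0; --bit)
            if(c2[i] & (1 << bit))
                printf("C2 error, image byte @ 0x%zX\n", i * 8 + (7 - bit));
}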

Talk to me on Discord (superg#9200)

50

(17 replies, posted in General discussion)

Jackal wrote:

So my question for this discussion is:
Do we really need a "perfect offset" correction if there is no data loss with 0 offset (and no common write offset can be detected)? After all, we wont know how the disc was mastered and if the gaps are supposed to be silent.

Yeah, this is a legit question. If no data is lost, we just end up with an imperfect split sometimes. But that can technically be corrected later, just from the BINs, if needed.