I will check that later.

This is a little more complicated for Dreamcast because of the SD/HD areas. Gluing the SD and HD areas together makes some sense for Dreamcast, but we follow the same rules we set for other systems. Ideally we would want to preserve everything that lies between the SD and HD areas, but there is currently a technical limitation preventing that. For instance, the SD area lead-out contains unscrambled logo data (the Sega CD art you can see on the data side), and I would want to preserve that as well. We evolve our methods slowly, and everything takes time.

The padding you refer to is 150 sectors at the start of Track 2.
At redump, when we perform the track split (not only for Dreamcast but for all systems), we preserve gaps (index 0 entries) for each track. There are multiple reasons for that:
1. Sometimes gaps aren't empty and contain meaningful data (audio CDs, for example).
2. It allows all tracks to be merged back into one file if needed (a CloneCD .img) without losing any data, as in the sketch below.
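
To illustrate point 2, here is a minimal C++ sketch (the names are mine, this is not code from any dumping software): because every gap stays with its track, merging is plain concatenation of the track files in cue-sheet order.

#include <fstream>
#include <string>
#include <vector>

// Concatenate split track files into a single CloneCD-style .img.
// Lossless because each track file already contains its index 0 gap.
void merge_tracks(const std::vector<std::string>& track_paths, const std::string& out_path) {
    std::ofstream out(out_path, std::ios::binary);
    for (const auto& path : track_paths) {
        std::ifstream in(path, std::ios::binary);
        out << in.rdbuf(); // append the whole track, gap included
    }
}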

In general, redump.org strives to be the most precise 1:1 optical media preservation project, and I don't see why we would favor lossy formats.

4

(3,497 replies, posted in General discussion)

withered.silence wrote:

Hello,
I'm trying to create a dump for a Playstation 1 game with a PX-712A and the process seems to hit these kinds of errors:

[INFO] This drive has 295 offset in the c2. Changed to /s 2.
This drive supports [OpCode: 0xd8, SubCode: 0]
This drive supports [OpCode: 0xd8, SubCode: 1]
This drive supports [OpCode: 0xd8, SubCode: 2]
This drive supports [OpCode: 0xd8, SubCode: 8]
Checking reading lead-in -> OK
Checking SubQ adr (Track)  1/ 1
Checking SubRtoW (Track)  1/ 1
Checking Pregap sync, msf, mode (LBA)  -2311
Scanning sector for anti-mod string (LBA)  72425/233476[F:ReadCDForScanningPsxAntiMod][L:2620] GetLastError: 121, The semaphore timeout period has expired.

Please wait for 25000 milliseconds until the device is returned
lpCmd: a8, 00, 00, 01, 1a, ea, 00, 00, 00, 02, 00, 00
dwBufSize: 4096
[F:ReadVolumeDescriptor][L:669] GetLastError: 55, The specified network resource or device is no longer available.

Please wait for 25000 milliseconds until the device is returned
lpCmd: a8, 00, 00, 00, 00, 10, 00, 00, 00, 01, 00, 00
dwBufSize: 2048
[F:ReadCDForFileSystem][L:833] GetLastError: 55, The specified network resource or device is no longer available.

Please wait for 25000 milliseconds until the device is returned

Am I doing something obviously wrong here?

Edit: Tried it with another PS1 game and it fails in a similar fashion.

Use redumper smile

5

(3,497 replies, posted in General discussion)

sarami wrote:

"Write offset" is set to +670? I think the db needs to be fixed by your hash. Anyway, please contact reentrant.

http://forum.redump.org/topic/45004/audio-offset/

Hey olaf, can you try redumper on this disc using the same drive where you get "This error can't be fixed by plextor drive. Needs to dump it by non-plextor drive and replace it"?

7

(14 replies, posted in General discussion)

sarami wrote:

I totally agree. The fixed dump page (e.g. http://redump.org/disc/99290/) isn't needed and I want to delete it.

I just want to make sure we are on the same page here. The "Fixed Dump" concept is different: it's not "fixed descrambling", it's a split performed according to the offset shifts. Check the offset table I posted for http://redump.org/disc/74810. If you split according to those offsets, you get a fully functional image. If instead you split using only the offset from the first track, the image will not work in emulators, nor when burned in any mode.

sarami wrote:

(3) - no sync, I'd say it's audio. But I'd add one exception to that rule.
Then, a sector with a damaged sync is "audio", OK?

Yes, I would agree to that.

sarami wrote:

Tell me the url of the database of this site.

Sure, I'll check my test dumps and share it.

sarami wrote:

I think so. Some Sega Saturn and CD-i Ready discs (etc.) have it. (e.g. http://redump.org/disc/58172/, http://redump.org/disc/35804/ )

Yeah, a couple of PSX discs also have this data spillover into audio.

sarami wrote:

You say, "subchannel based split we should use only data/flags from subchannel". Redump.org adopts "subchannel based split" except for TOC vs. Subs desync disc. Then (7),(8) should be descrambled in accordance with subchannel and (4),(5) should not be descrambled in accordance with subchannel.

No, I don't think we do what you describe. Maybe it simply was never discussed in detail and we just follow however it was implemented in DIC. The TOC/subchannel mismatch is very confusing for everybody, and we absolutely have to clarify and formalize it.

Here's what data we collect for the cue sheet / track split, and where it's available:
1. Count of sessions (available in the TOC, but can also be derived from the subchannel if the other session's lead-in is included)
2. Count of tracks (available in both TOC/subchannel)
3. Track flags: data/audio, 4ch, dcp, pre (available in both TOC/subchannel)
4. Track index 01 (available in both TOC/subchannel)
5. Other indices: index 00, index 02+ (available only in the subchannel)
6. MCN/ISRC (available in both TOC/subchannel)
7. Other CD-TEXT (available only in the TOC)

As you can see from this list, TOC and subchannel share almost everything:
TOC: (1)(2)(3)(4)(6)(7)
subchannel: (1)(2)(3)(4)(5)(6)

For a TOC-based split, the primary source of truth is the data from the TOC.
For a subchannel-based split, on the other hand, the source of truth is the data from the subchannel.
It's only logical to follow this rule for every data type we extract, as it removes the confusion and keeps the concepts separate.

At redump.org, saying that we prefer the TOC basically means that all data from the TOC has the highest priority.
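
A minimal C++ sketch of that priority rule, using the field numbering from the list above (the enum and function names are mine, not from any dumping software):

#include <stdexcept>

enum class Source { TOC, Subchannel };

// Each cue-sheet ingredient comes from the split mode's own source of truth,
// falling back to the other source only when the data does not exist there.
Source field_source(int field, Source split_mode) {
    switch (field) {
        case 5: return Source::Subchannel; // index 00 / 02+ exist only in subchannel
        case 7: return Source::TOC;        // CD-TEXT exists only in TOC
        case 1: case 2: case 3: case 4: case 6:
                return split_mode;         // available in both: follow the split mode
        default: throw std::invalid_argument("unknown field");
    }
}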

8

(14 replies, posted in General discussion)

sarami wrote:

I agree if admin and other mods agree. I think TOC, SubQ, and sync should all be checked.
1. Data track in TOC, data sector in SubQ, and sync is valid --- it's clearly "data" and there is no room for discussion.
2. Audio track in TOC, audio sector in SubQ, and no sync --- it's clearly "audio" and there is no room for discussion.
3. Data track in TOC, data sector in SubQ, but there is no sync (or sync is damaged) --- is it "data" or "audio"?
4. Data track in TOC, audio sector in SubQ, there is no sync (or sync is damaged) --- is it "data" or "audio"?
5. Data track in TOC, audio sector in SubQ, there is a sync --- is it "data" or "audio"?
6. Audio track in TOC, audio sector in SubQ, but there is a sync --- is it "data" or "audio"?
7. Audio track in TOC, data sector in SubQ, there is no sync (or sync is damaged) --- is it "data" or "audio"?
8. Audio track in TOC, data sector in SubQ, there is a sync --- is it "data" or "audio"?

Some general things first. I think we should completely separate TOC and subchannel concerns. On Photo CD, some data tracks are marked audio in the TOC and data in the subchannel. Some CD-i discs have a hidden track in the subchannel that is not listed in the TOC. Jaguar discs often have different track flags in the TOC vs. the subchannel.
So we have two concepts here: the primary TOC-based split and the secondary subchannel-based split (Subs Indices).
For a TOC-based split we should use only data/flags from the TOC; for a subchannel-based split, only data/flags from the subchannel. The one thing we do take from the subchannel for a TOC-based split is the track split points, as this data is not present in the TOC.

(1),(2) - strongly agree

(3) - no sync, I'd say it's audio. But I'd add one exception to that rule. There are some CD-i discs (more than one) where the whole sync is zeroed but the data is scrambled, so ideally:
if (standard_sync || (zeroed_sync && expected_scrambled_msf)) it's data. In general I find the MSF a very handy and strong check for scrambled data; see the sketch after this list.

(6) - definitely audio

(4),(5),(7),(8) - these are the source of discrepancies, because they mix TOC and subchannel properties.
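
To make the case (3) exception concrete, here is a minimal C++ sketch of the scrambled-MSF check. The table generator follows ECMA-130 Annex B; the function names and the exact check granularity are my own illustration, not code from DIC or redumper.

#include <array>
#include <cstdint>

// Build the ECMA-130 Annex B scrambling table (applied to bytes 12..2351 of a raw
// sector): a 15-bit LFSR, polynomial x^15 + x + 1, seed 0x0001, output LSB-first.
std::array<uint8_t, 2340> make_scramble_table() {
    std::array<uint8_t, 2340> table{};
    uint16_t lfsr = 0x0001;
    for (auto& b : table) {
        uint8_t byte = 0;
        for (int bit = 0; bit < 8; ++bit) {
            byte |= (lfsr & 1) << bit;
            uint16_t fb = (lfsr ^ (lfsr >> 1)) & 1; // feedback taps for x^15 + x + 1
            lfsr = (lfsr >> 1) | (fb << 14);
        }
        b = byte;
    }
    return table;
}

uint8_t to_bcd(uint8_t v) { return ((v / 10) << 4) | (v % 10); }

// True if the scrambled header (bytes 12..14) decodes to the MSF expected for lba.
// Even with a zeroed sync this identifies a scrambled data sector. Assumes lba >= -150.
bool expected_scrambled_msf(const uint8_t sector[2352], int32_t lba,
                            const std::array<uint8_t, 2340>& table) {
    int32_t aa = lba + 150; // absolute address includes the 150-sector pregap
    uint8_t msf[3] = { to_bcd(aa / (75 * 60)), to_bcd(aa / 75 % 60), to_bcd(aa % 75) };
    for (int i = 0; i < 3; ++i)
        if ((sector[12 + i] ^ table[i]) != msf[i])
            return false;
    return true;
}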

9

(14 replies, posted in General discussion)

Jackal wrote:

- http://redump.org/disc/74810/ - Does the original disc play on a CD-i? What about a backup of the unfixed dump?

The original definitely works on a CD-i. As for an unfixed backup, it boils down to whether a scrambled image can be burned as-is; I'm not too familiar with writing limitations. If you can't write scrambled, a burned unfixed dump will not work, as the burning software will rescramble the data track and it will be garbage.

- http://redump.org/disc/99290/ - This has 8.848 errors despite being fixed? What's going on?
All of these 8848 sectors are zeroed even after the shift correction; this is normal.

Jackal wrote:

There are some clear cases mentioned in the topic where bad mastering is causing dumps to descramble incorrectly and creating tons of erroneous data sectors, because there's some samples missing or added at random positions in a data track. That's the main focus of this topic, right?

Yes, this is correct.

Jackal wrote:

I don't remember if this also makes the original discs non-functional or if the drive performs some sort of on-the-fly correction to output a correct sector?

Yes, I think the same. The drive seeks the sync frame, and if it doesn't find it, I think it tries to reposition, which is why these discs work in players and PCs. Actually, a good experiment would be to dump such a disc using the BE opcode as data and see what the drive returns - I will do that.

Jackal wrote:

And as F1ReB4LL was pointing out also on Discord, there seem to be many cases of discs with scrambled data in the Track02 pregap after offset correction, for example: http://redump.org/disc/1770/ + http://redump.org/disc/1716/ + http://redump.org/disc/7986/
And if I remember correctly, this disc http://redump.org/disc/5479/ also has garbage at the start of the audio. If you remove the bytes, the track matches the PS1 track, so it seems to have been ripped from the PS1 version. And IIRC the same was true for Fighting Force PC vs. PS1. But I'm not sure anymore, as it's 13-15 years since those were first dumped, time sure flies yikes
It's unclear whether this is caused by, for example, the gold master disc being a CD-R that was burned with track-at-once or something, but the most logical explanation is that an audio track was copied with offset garbage and then burned again. But this is a different issue, then, that we don't have to discuss here?

Exactly, this is a known issue to me, and it even happens on some official PSX discs. We shouldn't touch these anyway, as the garbage is part of the audio.

Jackal wrote:

IIRC Truong / Ripper theorized that erroneous sectors with garbage bytes at the end of a data track were the result of a "split sector" or "half sector" or whatever they called it, that is part data / part audio tongue If you check the scrambled output, is it data and zeroes interleaved or does the data stop at some position and is it only zeroes after that?

So yes, it does look like a transition from data to audio, i.e. from scrambled to unscrambled, but there are some byte artefacts; I will make some hex screen captures later.

Jackal wrote:

But errors at the end of the data track also seem to be a different issue and since the remainder of the disc is audio tracks, performing offset shift corrections for such discs does not improve the dump in any meaningful way?

This effect also happens between data tracks (no audio tracks involved): http://redump.org/disc/74810/ has only data tracks, and the second track is CD-XA video or something similar.

Jackal wrote:

There were some examples recently where DIC was leaving sectors scrambled inside a data track with correct sync/header and mostly correct data, resulting in different dumps than before. So the default descrambling behavior must have been changed by sarami at some point or it's a bug. If a sector is inside a data track and the vast majority of it is data, IMO there's no sense in leaving it scrambled and the descrambled data is indeed more meaningful.

Yes, I saw that; at some point sarami changed something in DIC, I think. This is the most important thing I'm trying to "fix" here. Regardless of whether shift correction is applied, I think we should not rely on sector content (or rely on it less) when deciding whether to unscramble a sector. It's impossible to come up with a good decision algorithm when a sector is partially damaged, as pretty much any byte can be damaged, and this shift issue clearly demonstrates that.

10

(14 replies, posted in General discussion)

F1ReB4LL wrote:

That's an incorrect term. There's only 1 offset per disc, while you're talking about leftovers from earlier burning/dumping mastering stages. Those leftovers are physically present on the disc and need to be kept, since they belong to the disc data.

It does look like a mastering issue to me; I don't know the nature of it, but the reality is that the data "shifts" throughout the disc. I think the term "shift" is correct to use, as the data shifts in relation to its previous position. Also, there are some data leftovers, but those are incomplete sectors. If you analyze the transitional sectors, you will see that pairs of bytes get gradually corrupted, aligned to the sample size.
Propose a better term smile, we have to call it something.

F1ReB4LL wrote:

All 3 are wrong, since those leftovers often contain duped data and a plain descrambling will lead to duped sectors. Also, there are quite a lot of cases when those leftovers appear as a part of a first post-data audio track, like http://redump.org/disc/7986/

I don't think we are on the same page here. There are no duped sectors, and it's not random garbage. They are real damaged sectors with reasonable LBAs. Do you want to see some bytes? I believe I can demonstrate.

F1ReB4LL wrote:

The main dump should be always left as is, with anything unusual left scrambled.

This is option (1) you're talking about; it is exactly what you want. But I am saying that this is not what DIC does today: sectors from the damaged portion come out half scrambled, half descrambled.

11

(14 replies, posted in General discussion)

user7 wrote:

Which option produces the most functional dump?

Probably any option: transitional sectors are usually at the end of the track and were most likely originally zeroed and unused.

EDIT: to clarify, the most functional dump would be any of these 3 proposed options but with the "--correct-offset-shift" option on - that's probably what you wanted to hear! smile

12

(14 replies, posted in General discussion)

As some of you are already aware, some CDs have a mastering issue where the write offset changes across the disc. For standardization purposes, I will be calling this "offset shift".
Historically we knew of a couple of examples, such as:
Philips Media Presents: CD-i Games: Highlights-Previews-Gameclips: http://redump.org/disc/74810/
The Cranberries: Doors and Windows: http://redump.org/disc/99290/
CD-I CD Interactive: http://redump.org/disc/97023/

While working on redumper's lead-in/lead-out dumping functionality for data discs, I noticed that the offset actually shifts multiple times, starting from the lead-in, and even propagates into the lead-out. Analyzing this is possible for discs with data tracks because if the first track is a data track, the lead-in usually consists of scrambled empty sectors, and likewise, if the last track is a data track, the following lead-out also consists of scrambled empty sectors.

Example for http://redump.org/disc/74810:
TOC:

track 1 {  data }
  index 00 { LBA: [  -150 ..     -1], length:    150, MSF: 00:00:00-00:31:01 }
  index 01 { LBA: [     0 ..   2176], length:   2177, MSF: 00:02:00-00:31:01 }
track 2 {  data }
  index 00 { LBA: [  2177 ..   2324], length:    148, MSF: 00:31:02-21:55:39 }
  index 01 { LBA: [  2325 ..  98514], length:  96190, MSF: 00:33:00-21:55:39 }
track A {  data }
  index 01 { LBA: [ 98515 ..  98612], length:     98, MSF: 21:55:40-21:56:62 }

offsets:

LBA: [ -2259 ..   -150], offset: -294, count: 2110
LBA: [  -149 ..   2175], offset: -613, count: 2325
LBA: [  2176 ..  98514], offset: -609, count: 96339
LBA: [ 98515 ..  98614], offset: -882, count: 100

As you can see from this example, the first offset shift happens between the lead-in and the pre-gap, and the others happen roughly "between" tracks, although a little imprecisely. As the lead-out is internally just another track, the shift propagates there too.
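
To illustrate what an offset table like the one above means in practice, here is a minimal C++ sketch (names are mine) of resolving a sector's byte position in the raw stream from per-range offsets, with 588 samples per 2352-byte sector and 4 bytes per stereo sample:

#include <cstdint>
#include <vector>

struct OffsetRange { int32_t lba_start, lba_end; int32_t offset; }; // offset in samples

// Byte position of a sector: each LBA range carries its own write offset,
// so the position is computed from the offset of the range containing it.
int64_t sector_byte_position(int32_t lba, const std::vector<OffsetRange>& ranges) {
    for (const auto& r : ranges)
        if (lba >= r.lba_start && lba <= r.lba_end)
            return (static_cast<int64_t>(lba) * 588 + r.offset) * 4;
    return static_cast<int64_t>(lba) * 588 * 4; // outside known ranges: no correction
}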

Digging deeper, I discovered that there are many more such offset-shifting discs; most, if not all, PC data discs where a couple of the last data track sectors are "corrupted" (descramble failed) are actually offset-shifting discs. As redumper outputs detailed descramble statistics, I have been contacted numerous times by different people, including our mods, to check a particular data dump log to make sure it is correct, and analyzing these cases I realized it is the same offset-shifting issue.

Why is this important?
Every offset shift transition spans multiple sectors gradually, and due to some peculiar mastering detail that we don't know yet, these sectors are randomly corrupted. Such corruption makes it difficult for the dumping software to decide what to do with these sectors and whether to attempt to descramble them.
As my recent findings hint that there are a lot of such discs, the purpose of this topic is to standardize how we preserve such transitions so that it follows redump.org preservation standards and is uniform across dumping software (which is, basically, DIC and redumper lol).

As of today, redumper dumps such discs with one global disc write offset, detected from the first sector of the first data track (simplified). This is the default behaviour.
In addition, redumper provides an option, "--correct-offset-shift", which follows the offset shift changes; such a dump can be added to redump.org as a (Fixed Dump) edition. Whether this option is used or not, we need to standardize our handling of the transitions.

Here's how that can be handled:
1. Leave transitional sectors intact.
2. Force-descramble all transitional sectors.
3. Intelligently detect whether a sector is scrambled based on a combination of content criteria and, if it is, try to descramble it.

Right now, both DIC and redumper do a variation of (3). More often than not, this descrambles some sectors and leaves others intact, i.e. you get a mix of god knows what, and there is no way to recover scrambled content that is 1:1 with the original. On top of that, redumper does it differently in a way that descrambles "better", but that is not the point here. The point is that (3) doesn't yield consistent results, and those results aren't 1:1 aligned with the source (scrambled) material.

On the other hand, (2) is the sweet spot, as it is consistent and the primary scrambled image can be reconstructed 1:1.
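
For clarity, a minimal C++ sketch of option (2), reusing the make_scramble_table helper from the earlier sketch; since scrambling is a plain XOR, forcing it is fully reversible, which is exactly what makes 1:1 reconstruction possible:

#include <array>
#include <cstdint>

// Unconditionally descramble a transitional sector by XORing bytes 12..2351
// with the ECMA-130 scramble table. Applying it twice restores the original.
void force_descramble(uint8_t (&sector)[2352], const std::array<uint8_t, 2340>& table) {
    for (int i = 12; i < 2352; ++i)
        sector[i] ^= table[i - 12];
}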

Finally, (1) is a compromise where we lose 1:1 but keep some sort of consistency.

I would like to hear opinions on this. Just please, let's keep it on topic; I don't want the conversation to go elsewhere.

13

(3 replies, posted in News)

Lugamo wrote:

Do you know if MPF will add Redumper support in the future?

Yes, it's currently being built.

14

(3,497 replies, posted in General discussion)

sarami wrote:
superg wrote:

it belongs to the current track, not to the next one.

No. See other tracks.
It's not always the case, but the P channel of the 1st sector of a track typically differs from that of the 2nd sector.

I understand your point that a P change is not always an indicator of a track change; no objections here. However, the P channel is only supplemental here.

The main problem is that the Mode 2 Q (MCN) at LBA 47437 separates track 15 and track 16. Here are the reasons why the MCN should always belong to the previous track:

1. When playing music, an ordinary audio CD player reads from left to right; MCN/ISRC frames repeat at least once per 100 frames and are purely informational. The current track number is stored as part of the player's state, and it switches only when it changes in a Mode 1 Q, which happens after the MCN/ISRC, because the current track information is not stored in Mode 2 or Mode 3 Q (see the sketch after these two points).

2. A new track always starts with a Mode 1 Q. I think I saw that somewhere in the rainbow books; unfortunately I cannot find it right now. This statement is implied throughout the documentation though; for instance, open the MMC-3 working draft, page 26:
"4.2.3.4.3 ADR=3 (0011b) – Mode-3 Q"
"The ISRC may only change immediately after the Track Number (TNO) has been changed."

I understand you're trying to do some guesswork based on the standard 150-sector gap size, or even to establish that P is shifted by 1 sector for each track so that the start is always shifted in relation to it, but that is unsafe.

15

(3,497 replies, posted in General discussion)

sarami wrote:
bikerspade wrote:

A relatively recent dump of a particular PCECD disc (Metamor Jupiter) produced a bad track split, where the hashes for track 15 and track 16 are incorrect.

DB is incorrect. The pregap of the track 16 is 00:03:00, not 00:02:74.

LBA[047436, 0x0b94c]: P[00], Q[01150100386600103436d96b]{Audio, 2ch, Copy NG, Pre-emphasis No, Track[15], Idx[01], RMSF[00:38:66], AMSF[10:34:36]}, RtoW[0, 0, 0, 0]
LBA[047437, 0x0b94d]: P[00], Q[0200000000000000003767c1]{Audio, 2ch, Copy NG, Pre-emphasis No, MediaCatalogNumber [0000000000000], AMSF[     :37]}, RtoW[0, 0, 0, 0]
LBA[047438, 0x0b94e]: P[ff], Q[01160000027300103438dcb1]{Audio, 2ch, Copy NG, Pre-emphasis No, Track[16], Idx[00], RMSF[00:02:73], AMSF[10:34:38]}, RtoW[0, 0, 0, 0]

LBA 47437 belongs to the track 16.

sarami, LBA 047437 belongs to track 15. First of all, the standard says that if a track or index ends with a Q without positional information, that Q belongs to the current track, not to the next one. I'm too lazy to search through it, but trust me.
Next, check the P values from your subchannel decode: P[00] for LBA 047437 shows that it belongs to the previous track.
The dump in the DB is correct.
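
For reference, here is a minimal C++ sketch of decoding 12-byte Q frames like the ones in the log above; the BCD fields and the inverted CRC-16/CCITT convention follow ECMA-130, and the function names are mine:

#include <cstdint>
#include <cstdio>

// CRC-16/CCITT (poly 0x1021, init 0) over the 10 Q data bytes; per ECMA-130
// the 16-bit CRC stored in the last two Q bytes is recorded inverted.
uint16_t crc16_ccitt(const uint8_t* data, int len) {
    uint16_t crc = 0;
    for (int i = 0; i < len; ++i) {
        crc ^= static_cast<uint16_t>(data[i]) << 8;
        for (int b = 0; b < 8; ++b)
            crc = (crc & 0x8000) ? (crc << 1) ^ 0x1021 : (crc << 1);
    }
    return crc;
}

int from_bcd(uint8_t v) { return (v >> 4) * 10 + (v & 0x0F); }

// Decode an ADR=1 (positional) Q frame, e.g. Q[01150100386600103436d96b]
// -> Track[15] Idx[01] RMSF[00:38:66] AMSF[10:34:36].
void print_q(const uint8_t q[12]) {
    uint16_t stored = static_cast<uint16_t>(~((q[10] << 8) | q[11]));
    bool crc_ok = crc16_ccitt(q, 10) == stored;
    int adr = q[0] & 0x0F;
    if (adr != 1) {
        std::printf("ADR=%d (MCN/ISRC, not positional), CRC %s\n", adr, crc_ok ? "ok" : "bad");
        return;
    }
    std::printf("Track[%02d] Idx[%02d] RMSF[%02d:%02d:%02d] AMSF[%02d:%02d:%02d] CRC %s\n",
                from_bcd(q[1]), from_bcd(q[2]),
                from_bcd(q[3]), from_bcd(q[4]), from_bcd(q[5]),
                from_bcd(q[7]), from_bcd(q[8]), from_bcd(q[9]),
                crc_ok ? "ok" : "bad");
}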

16

(20 replies, posted in General discussion)

Deterous wrote:

I've attached a zip of a list of serials for each BIOS. It would be useful for knowing which consoles supports which game in terms of backwards compatibility, etc.

Is the order of the serials untouched in these files? I wonder if we can deduce some information from the serials' initial ordering, e.g. whether they were grouped by some criterion in the firmware.

17

(20 replies, posted in General discussion)

SLES-01227
SLES-11227

It could be Command & Conquer: Red Alert or
Command & Conquer: Red Alert: Retaliation,

the only 2-CD games whose regional releases were scattered around the same time frame.
Reference: http://redump.org/discs/system/psx/sort/serial/?page=7

18

(20 replies, posted in General discussion)

Jackal wrote:

00977 number is claimed by the bonus disc, so it seems very unlikely that an unknown RE2 release exists with the same number.

It's marked as such in emulator databases, but I doubt that's accurate.
Also, the second disc serial follows the exact pattern of the other regions.

19

(20 replies, posted in General discussion)

More research:

Resident Evil 2 (English)
SLES-00972
SLES-10972

Resident Evil 2 (French)
SLES-00973
SLES-10973

Resident Evil 2 (German)
SLES-00974
SLES-10974

Resident Evil 2 (Italian)
SLES-00975
SLES-10975

Resident Evil 2 (Spanish)
SLES-00976
SLES-10976

Resident Evil 2 (XXX)
SLES-00977
SLES-10977

It looks like we are missing a Resident Evil 2 dump for an unknown European language.

20

(20 replies, posted in General discussion)

All right, I finally spent some time on this and identified some:

PSX Europe:
SLES-00977 - Resident Evil - Director's Cut - Bonus Disc (https://github.com/libretro/libretro-da … /ps1.idlst)
SLES-10977 - 2CD???

SLES-01227 - 2CD???
SLES-11227 - 2CD???

SLES-01894 - ???

PSX Japan:
SLPM-80639 - 2大ヒーロー スペシャルDisc 体験版 [2 Big Hero Special Disc Trial Version] (https://w.atwiki.jp/psemu/pages/124.html)
SLPS-00653

21

(20 replies, posted in General discussion)

Deterous wrote:

I've trawled through all known PS2 BIOS's and found a total of 784 mentioned serials, approximately 567 of which are PS1 and the remainder PS2. A few of the serials are not dumped and have unknown titles.

This is nice, thanks for doing that!
I'll try to check what's there a little bit later.

22

(20 replies, posted in General discussion)

Deterous wrote:

Way fewer (23 vs 198) serials listed in the PAL SCPH-9000x BIOS, at least that I could find.

I think that's because they eventually fixed the playability of some games and removed them from the compatibility list.
It's good to know that earlier BIOSes contain more titles, though. If you have a better automated way of extracting these, I'd suggest iterating over the entire PS2 BIOS catalog and merging the lists; it could be that some titles were added and then deleted, though I'm not 100% sure.

23

(17 replies, posted in General discussion)

I think I already have your spreadsheet, Into, unless you've added a lot since. Post the link here anyway.

24

(17 replies, posted in General discussion)

I updated the first post.

25

(17 replies, posted in General discussion)

Jackal wrote:

0. If there is no non-zero data in the pregap/lead-out, use 0 offset. Unless it's possible to manually detect the write offset with a reasonable degree of certainty, in which case combined offset correction can be used.

1. If there is non-zero data in the lead-out and that data can be fully shifted out of there (left) without spanning non-zero data into the pre-gap, correct the offset with the minimum shift required.

2. If there is non-zero data in the pre-gap and that data can be fully shifted out of there (right) without spanning non-zero data into the lead-out, correct the offset with the minimum shift required.

This is clear.
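
As I read those rules, they amount to something like this minimal C++ sketch (the sign convention and parameter semantics are my assumptions, not redumper code):

#include <cstddef>
#include <optional>

// pregap_spill / leadout_spill: non-zero bytes spilling outside the tracks;
// *_room: zero bytes available on the opposite side to absorb the shift.
// Returns the correction in samples (4 bytes each), negative meaning a left shift,
// or nullopt for the "data wider than the room" case handled separately (rule 3).
std::optional<int> correction_samples(std::size_t pregap_spill, std::size_t leadout_spill,
                                      std::size_t pregap_room, std::size_t leadout_room) {
    if (!pregap_spill && !leadout_spill)
        return 0; // rule 0: nothing outside the tracks, use 0 offset
    if (leadout_spill && !pregap_spill && leadout_spill <= pregap_room)
        return -static_cast<int>((leadout_spill + 3) / 4); // rule 1: minimal left shift
    if (pregap_spill && !leadout_spill && pregap_spill <= leadout_room)
        return static_cast<int>((pregap_spill + 3) / 4);   // rule 2: minimal right shift
    return std::nullopt;
}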


Jackal wrote:

Whenever a disc is dumped with offset correction, this should be documented in comments.

The non-zero offset will be specified in the ringcode entry; wouldn't that be enough?


Jackal wrote:

And then for the rare headache cases discussed in your last post where it's impossible to shift out data from lead-out/pre-gap (data is wider than allocated TOC space for it):

3. Use 0 offset and preserve relevant non-zero data in separate pregap.bin or leadout.bin. I don't see any advantage in trying to include this data with the main dump through a custom cuesheet format or whatever, but if it's decided otherwise, that's fine by me.

Yes, I now think this would be the best course of action: separate files, with sector-aligned sizes.


Jackal wrote:

And for the DC / PSX or other discs that have missing relevant TOC / pre-gap / lead-out data, we should also preserve this data in separate files (offset corrected if possible).

I already have this implemented in redumper; I just have to walk over it and do some checks.


Jackal wrote:

As for offset matching and "universal" checksums: Audio checksum databases like AccurateRip and CUETools are already ignoring leading and trailing zero bytes, so they are essentially already storing "universal" checksums? I think this is beyond the scope of the Redump project and would require too much work and too many changes.

AccurateRip and CUETools are track-based, and as far as I know they do it mainly to match tracks - that's overkill for us.
What I was suggesting is not exactly that. In redumper, for audio CDs only, I can generate, say, a SHA-1 hash of the non-zero data span - one hash per disc. It would be in the log file. For a new submission of an audio disc, we add it to the comments, for example:
Universal Hash: 958b5a41456c90cb5c6df8d676c3d2d90a940609 (-647)
For subsequent verifications of the same disc with a different write offset these hashes will match, and that will be an indicator to us not to add a new disc but to add another ringcode line to the existing entry. Just don't tell me we have too many things in comments (we do); out of all the stored and unneeded crap like volume labels, this particular thing would be the most useful.
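
A minimal C++ sketch of how such a hash could be computed (using OpenSSL's SHA1 for brevity; the sample alignment of the span is my assumption, and redumper's exact rules may differ):

#include <openssl/sha.h>
#include <cstdint>
#include <vector>

// Hash only the non-zero span of the audio image, so the result is
// independent of the write offset applied by a particular drive/dump.
// Assumes the image size is a multiple of 4 (16-bit stereo samples).
std::vector<uint8_t> universal_hash(const std::vector<uint8_t>& image) {
    std::size_t begin = 0, end = image.size();
    while (begin < end && image[begin] == 0) ++begin;
    while (end > begin && image[end - 1] == 0) --end;
    begin -= begin % 4;           // align the span start down to a sample boundary
    end += (4 - end % 4) % 4;     // align the span end up to a sample boundary
    std::vector<uint8_t> digest(SHA_DIGEST_LENGTH);
    SHA1(image.data() + begin, end - begin, digest.data());
    return digest;
}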


Jackal wrote:

Guess we still need to figure out how add the separate files in the database, with iR0b0t not around. Maybe resort to storing .dat or checksums in comments for now, similar to Xbox PFI/DMI/SS.

By the way, can we add the extra files to the XML list but exclude them from the cue sheet? Would that work?