PAL PlayStation boots the disc just fine (and it's playable).

JP PlayStation doesn't boot the disc.

Ok, a couple more findings:
Indeed, Gekido has a backup libcrypt key in the audio subchannel; updated.
The Net Yaroze disc doesn't have mirror or backup keys, just 6 bits in the first 6 libcrypt sectors; updated, so the .SBI or .LSD can be downloaded from there.

I've also updated my redumper libcrypt detection code so it should be a little bit better although theoretically there might be false positives. nocash - thanks for all the hints!

Lastly I have yet to find my PAL PSX to check if the disc boots, chances are low but I'll try to finalize it this week.

Just a heads up, a retail US console (not modded) doesn't boot it, it asks to insert a PlayStation format disc.

nocash wrote:

Yes! Something like a CloneCD .SUB file should do it. Or better, if you can filter out the intact sectors: Something like an .SBI or .LSD file, or some .TXT file like the list with the "Sectors with LibCrypt protection" on the redump pages (but with ALL scratched or modified sectors, for the whole disc from minute 0 and up).

PM with link sent. Sorry, I didn't notice you asked to filter out good sectors; if that's a show stopper let me know.

I don't know who invented that .LSD format, but it is already on redump. For example, at the top of the page it shows "Download: SHA1 • MD5 • SFV • Cuesheet • SBI" and at the bottom it shows "Sectors with LibCrypt protection" like so: "03:08:05    41 01 01 07 06 05 00 23 08 05 ff b8" (which includes the CRC16 bytes "FF B8", which are only found in LSD files, not in SBI files).

Ha, don't know how I could miss that download link on redump for all these years wink. Yeah, I think it's generated from the libcrypt text data that we paste into the database (same as .SBI).

Well, get rid of the _LIBCRYPT_SECTORS_MEDIEVIL table, and always use _LIBCRYPT_SECTORS_BASE for everything including MediEvil.
Then there's that "exactly 8 of 16 sectors are changed" convention, and the detection could rely on that (or maybe the LIBCRYPT_SECTORS_COUNT code already does something similar, as I understand that part of the code).
If the backups exist, check that each group of 2 or 4 sectors has the same state (all changed or all not changed), for example, look at the table here: … bcrypt.htm the 4 sectors for "bit15" should be all having the same state.
If you want to support scratched discs, relax the backup rules a bit.
A "1" bit must have errors in all 4 sectors.
A "0" bit shouldn't have errors on any of the 4 sectors.
A "0" bit could have some errors on scratched discs, you could filter them out, show warnings, and raise the warning level depending on the nunber of scratches.

Yeah, I could do something like that, hopefully this weekend.

I am wondering if they really don't have Libcrypt backups on minute 9, or if the dumping hardware/software is just unable to find them.

I can check if Gekido has a libcrypt backup in the audio tracks, I have the disc here. Also, just for science, we have a modified ASUS drive firmware that can read all the way into the lead-out, so if there is some super small Track 1 PAL game, the lead-out subchannel can be verified as well.

Yes! Something like a CloneCD .SUB file should do it. Or better, if you can filter out the intact sectors: Something like an .SBI or .LSD file, or some .TXT file like the list with the "Sectors with LibCrypt protection" on the redump pages (but with ALL scratched or modified sectors, for the whole disc from minute 0 and up).

Will prepare the file over the weekend.

There's that myth(?) that the Yaroze Demo disc didn't work on retail consoles. If you have a PSX retail console around, could you give it a try? And mention if it was an EU/US/JP console, with/without modchip.

I also know this, and I've seen references to the SCEW wobble in some of the PsNee ports, but a long time ago rama said that it was irrelevant. At some point I had this idea to mod a PSX with a chip that would sniff the bus where the wobble is reported and decode what the disc sends; that would greatly help identifying region locks for cases where we don't really know.
Will also check the disc over the weekend, as I have to dig out my PAL PSX from some storage boxes.

NB. I've just noticed that there are also .LSD files that do actually contain the whole 12-byte subchannel data (unlike SBI, which has only 10 bytes and lacks the CRC16). Nice. Where does the .LSD format come from, also from redump?
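Those trailing CRC16 bytes are the subchannel Q checksum. If I recall the encoding correctly, it's CRC-16/CCITT (polynomial 0x1021, initial value 0) over the 10 Q data bytes, stored inverted; treat the details as an assumption worth verifying against a known .LSD entry:

```python
def subq_crc16(data: bytes) -> int:
    """CRC-16/CCITT (poly 0x1021, init 0x0000) over the 10 Q data bytes;
    subchannel Q stores the one's complement of this value (assumption)."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc ^ 0xFFFF  # complement, as stored on disc
```

So a 12-byte .LSD record can be validated by recomputing the CRC over its first 10 Q bytes and comparing against the last two.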

Hmm, I'm not familiar with .LSD, probably not from redump. I can easily generate anything from redumper output; .SBI is lame, but that's the only way we can have it right now. Without going too much into the details, we currently have no way of improving the website because communication with our keyholder has been literally non-existent for years.

search *.bin -hex="03 08 02 05 03 08 01 04 03 09 53 56 03 09 52 55"

I could do something like this easily on my romset, this is a good idea just to identify potential missing candidates.

LibCrypt does usually read 16 key bits. The MediEvil detection does merely use a hardcoded table with only 8 sectors (plus unused 8 backup sectors). That's not very good.
It won't work with any yet undiscovered discs that use the same LibCrypt version, but with different key bits.
In fact, there are at least three known MediEvil (EUR) versions (english, german, spanish) that do use different keys - but the detection would only work for one of them.

I am well aware of that. All my MediEvil EU retail copies have only one encoding scheme, and that is hardcoded based on the only discs I have here. I'm very open to a better detection method, please suggest one.

As for newer libcrypt protections, as you've already figured out, I have a list of all possible sector numbers (64 different sectors: 32 at minute 3 and a backup 32 at minute 9) which might potentially be used by libcrypt, and a very strong check that requires two invalid Q frames 5 sectors apart (anti-scratch). The only thing I could potentially change is to account for the minute 9 backup being truncated, but I remember checking Track 1 sizes for the whole library and I haven't found any candidate for that.
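The anti-scratch pair check described above could look roughly like this (a sketch under my own naming, not redumper's implementation; a lone invalid Q frame with no partner 5 sectors away is discarded as a probable scratch):

```python
def msf_to_lba(m, s, f):
    """MSF to LBA with the standard 150-sector (2 second) offset."""
    return (m * 60 + s) * 75 + f - 150

def filter_libcrypt_candidates(suspect_lbas, distance=5):
    """Keep only suspect sectors whose partner `distance` sectors away
    is also suspect; isolated errors are treated as scratches."""
    suspects = set(suspect_lbas)
    return sorted(l for l in suspects
                  if l + distance in suspects or l - distance in suspects)
```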

Also, subcode sucks. For minty PS1 discs, I get hundreds of Q CRC errors, and that's really the best case because people dump all kinds of discs in different conditions. That said, in the Net Yaroze case I cannot really rely on the 6 other Q sectors being intact. Let me think on this a bit.

By the way, this is the list of all known libcrypt protected discs:

P.S. Let me know if you want/need RAW or processed subchannel for the Net Yaroze disc, I can share.

Hmm, something doesn't add up here. For MediEvil I have the following sectors with subchannel errors:

MSF: 03:08:05
MSF: 03:18:49
MSF: 03:20:56
MSF: 03:21:55
MSF: 03:23:17
MSF: 03:25:03
MSF: 03:32:19
MSF: 03:34:51
MSF: 09:20:45
MSF: 09:30:63
MSF: 09:33:37
MSF: 09:35:52
MSF: 09:37:14
MSF: 09:38:58
MSF: 09:46:13
MSF: 09:48:59

They don't match your first table.

However, Gekido matches your first table:

MSF: 03:08:05
MSF: 03:09:56
MSF: 03:13:10
MSF: 03:14:29
MSF: 03:15:24
MSF: 03:18:49
MSF: 03:20:56
MSF: 03:21:55
MSF: 03:23:17
MSF: 03:24:12
MSF: 03:25:03
MSF: 03:28:28
MSF: 03:32:19
MSF: 03:33:56
MSF: 03:34:51
MSF: 03:35:42

From a preservation perspective, we only want to maintain the minimal set of modified subcode Q sectors that are needed to run the game, as most of the Q data is standard and can be easily generated.
We keep this modified data in the form of .SBI files that are available for download.
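For reference, the common .SBI layout as I understand it (treat the exact record layout as an assumption): a 4-byte "SBI\0" magic, then one 14-byte record per modified sector consisting of the BCD MSF, a format byte of 0x01, and the 10 modified Q data bytes. A minimal writer sketch:

```python
SBI_MAGIC = b"SBI\x00"

def to_bcd(v):
    """Encode a decimal value 0..99 as binary-coded decimal."""
    return ((v // 10) << 4) | (v % 10)

def write_sbi(entries):
    """entries: list of ((m, s, f), q10) where q10 is the 10 modified
    Q data bytes. Layout assumed: 'SBI\\0' magic, then 14-byte records
    of BCD MSF + format byte 0x01 + 10 Q bytes."""
    out = bytearray(SBI_MAGIC)
    for (m, s, f), q in entries:
        assert len(q) == 10
        out += bytes([to_bcd(m), to_bcd(s), to_bcd(f), 0x01]) + q
    return bytes(out)
```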

The detection you describe is already implemented in redumper: … x.ixx#L241
Internally I maintain a list of all known libcrypt sectors, two schemes so far:
1. MediEvil: 16 sectors with no backup
2. Other games: 16 (Gekido) or 32 sectors with each having a backup 5 sectors apart

For this Yaroze disc I'll likely add a 3rd check.

As for detecting EXE patterns, that's out of our preservation scope.

Thanks for the technical info nocash, very interesting!
From a redump perspective, I would like to detect this scheme when dumping and preserve the subchannel of the altered sectors so we could potentially detect it on other discs (if any).
I have that disc here.

Just for reference, if that's the MCN/ISRC-between-tracks case, it's already fixed in redumper; I am using the P channel too.

I will check that later.

So this is a little bit more complicated for Dreamcast because of the SD/HD areas. Gluing the SD and HD areas together makes little sense for Dreamcast, but we follow the same rules as we set for other systems. Ideally we would want to preserve everything that comes between the SD/HD areas, but there is currently a technical limitation in doing that. For instance, the SD area lead-out contains unscrambled logo data (the Sega CD art you can see on the data side); I would want to preserve that as well. We slowly evolve our methods, but everything takes time.

The padding you refer to is 150 sectors at the start of Track 2.
At redump, when we perform the track split (not only for Dreamcast but for all systems), we preserve gaps (index 0 entries) for each track. There are multiple reasons for that:
1. sometimes gaps aren't empty and contain meaningful data (audio CDs, for example)
2. this allows for an easy merge of all tracks into one if needed (CloneCD .img) without losing any data.

In general, redump strives to be the most precise 1:1 optical media preservation effort, and I don't see why we would favor lossy formats.


(3,507 replies, posted in General discussion)

withered.silence wrote:

I'm trying to create a dump for a Playstation 1 game with a PX-712A and the process seems to hit these kinds of errors:

[INFO] This drive has 295 offset in the c2. Changed to /s 2.
This drive supports [OpCode: 0xd8, SubCode: 0]
This drive supports [OpCode: 0xd8, SubCode: 1]
This drive supports [OpCode: 0xd8, SubCode: 2]
This drive supports [OpCode: 0xd8, SubCode: 8]
Checking reading lead-in -> OK
Checking SubQ adr (Track)  1/ 1
Checking SubRtoW (Track)  1/ 1
Checking Pregap sync, msf, mode (LBA)  -2311
Scanning sector for anti-mod string (LBA)  72425/233476[F:ReadCDForScanningPsxAntiMod][L:2620] GetLastError: 121, Das Zeitlimit für die Semaphore wurde erreicht.

Please wait for 25000 milliseconds until the device is returned
lpCmd: a8, 00, 00, 01, 1a, ea, 00, 00, 00, 02, 00, 00
dwBufSize: 4096
[F:ReadVolumeDescriptor][L:669] GetLastError: 55, Die angegebene Netzwerkressource bzw. das angegebene Gerät ist nicht mehr verfügbar.

Please wait for 25000 milliseconds until the device is returned
lpCmd: a8, 00, 00, 00, 00, 10, 00, 00, 00, 01, 00, 00
dwBufSize: 2048
[F:ReadCDForFileSystem][L:833] GetLastError: 55, Die angegebene Netzwerkressource bzw. das angegebene Gerät ist nicht mehr verfügbar.

Please wait for 25000 milliseconds until the device is returned

Am I doing something obviously wrong here?

Edit: Tried it with another PS1 game and it fails in a similar fashion.

Use redumper smile


(3,507 replies, posted in General discussion)

sarami wrote:

"Write offset" is set to +670? I think the db needs to be fixed by your hash. Anyway, please contact reentrant.

Hey olaf, can you try redumper on this disc using the same drive where you get "This error can't be fixed by plextor drive. Needs to dump it by non-plextor drive and replace it"?


(14 replies, posted in General discussion)

sarami wrote:

I totally agree. Fixed dump page (e.g. don't need and I want to delete it.

I just want to make sure we are on the same page here. The "Fixed Dump" concept is different: it's not "fixed descrambling", it's a split according to the offset shift. Check the offset table I posted. If you split according to these offsets, you will have a fully functional image. Otherwise, if you split using only the offset from the first track, the image will not work on emulators or when burned in any mode.

sarami wrote:

(3) - no sync, I'd say it's audio. But I'd add one exception to that rule.
Then, sector with damaged sync is "audio" ok?

Yes, I would agree to that.

sarami wrote:

Tell me the url of the database of this site.

Sure, I'll check my test dumps and share it.

sarami wrote:

I think so. Some Sega Saturn and CD-i ready disc (and etc.) have it. (e.g., )

Yeah, also a couple of PSX discs have this data spillover into audio.

sarami wrote:

You say, "subchannel based split we should use only data/flags from subchannel". adopts "subchannel based split" except for TOC vs. Subs desync disc. Then (7),(8) should be descrambled in accordance with subchannel and (4),(5) should not be descrambled in accordance with subchannel.

No, I don't think we do what you describe. Maybe it simply wasn't discussed in detail before, and we just follow how it was implemented in DIC. The TOC/subchannel mismatch is very confusing for everybody, and we absolutely have to clarify and formalize it.

Here's what data we collect for cue sheet / track split and where it's available:
1. Count of sessions (available in TOC but can be derived from subchannel too if other session lead-in is included)
2. Count of tracks (available in both TOC/subchannel)
3. Data track flags: data/audio, 4ch, dcp, pre (available in both TOC/subchannel)
4. Track index 01 (available in both TOC/subchannel)
5. Other indices: index 00, index 02+ (available only in subchannel)
6. MCN/ISRC (available in both TOC/subchannel)
7. other CD-TEXT (available only in TOC)

As you can see from this list, TOC and subchannel share almost everything:
TOC: (1)(2)(3)(4)(6)(7)
subchannel: (1)(2)(3)(4)(5)(6)

For TOC based split, the primary source of truth is data from TOC.
On the other hand, for the subchannel based split, the source of truth will be data from subchannel.
It's only logical to follow this rule for every data type that we extract as it removes the confusion and separates the concepts.

Saying that we prefer TOC basically means that all data from the TOC should have the highest priority.
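The source-of-truth rule described above can be expressed as a small lookup (item names are mine; the numbers correspond to the list earlier in the post; falling back to the other source only when an item is unavailable, e.g. indices for a TOC based split):

```python
# Which source provides each data item (illustrative names).
AVAILABILITY = {
    "sessions":      {"toc", "subchannel"},  # (1)
    "track_count":   {"toc", "subchannel"},  # (2)
    "track_flags":   {"toc", "subchannel"},  # (3)
    "index01":       {"toc", "subchannel"},  # (4)
    "other_indices": {"subchannel"},         # (5)
    "mcn_isrc":      {"toc", "subchannel"},  # (6)
    "cd_text":       {"toc"},                # (7)
}

def source_for(item, split_mode):
    """split_mode: 'toc' or 'subchannel'. Prefer the split mode's own
    source of truth; fall back only when the item isn't there at all."""
    sources = AVAILABILITY[item]
    return split_mode if split_mode in sources else next(iter(sources))
```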


(14 replies, posted in General discussion)

sarami wrote:

I agree if admin and other mods agree. I think TOC and SubQ and sync should be checked.
1. Data track on TOC, Data sector on SubQ and sync is valid --- it's apparently "data" and there is no room for discussion.
2. Audio track on TOC, Audio sector on SubQ and no sync ---  it's apparently "audio" and there is no room for discussion.
3. Data track on TOC, Data sector on SubQ, but there is not a sync (or sync is damaged) --- It's "data" or "audio"?
4. Data track on TOC, Audio sector on SubQ, there is not a sync (or sync is damaged) --- It's "data" or "audio"?
5. Data track on TOC, Audio sector on SubQ, there is a sync --- It's "data" or "audio"?
6. Audio track on TOC, Audio sector on SubQ, but there is a sync --- It's "data" or "audio"?
7. Audio track on TOC, Data sector on SubQ, there is not a sync (or sync is damaged) --- It's "data" or "audio"?
8. Audio track on TOC, Data sector on SubQ, there is a sync --- It's "data" or "audio"?

Some general things first. I think we should totally separate TOC and subchannel things. Some data tracks are marked audio in the TOC and data in the subchannel on Photo CD. Some CD-i discs have a hidden track in the subchannel that is not listed in the TOC. Jaguar often has different track flags in TOC vs subchannel.
So we have two concepts here: a primary TOC based split and a secondary subchannel based split (Subs Indices).
For a TOC based split we should use only data/flags from the TOC; for a subchannel based split we should use only data/flags from the subchannel. The only data we want to use from the subchannel for a TOC based split is for finding track split points, as this data is not present in the TOC.

(1),(2) - strongly agree

(3) - no sync, I'd say it's audio. But I'd add one exception to that rule. There are some CD-i discs where the whole sync is zeroed but the data is scrambled (it's more than one disc), so ideally:
if(standard_sync || zeroed_sync && expected_scrambled_msf) it's data. In general I find the MSF a very handy and strong check for scrambled data.

(6) - definitely audio

(4),(5),(7),(8) - this is a source of discrepancies because of TOC subchannel properties mix.
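For context on the expected_scrambled_msf check under (3): CD-ROM sectors are scrambled, per ECMA-130 Annex B, by XOR with the output of an x^15 + x + 1 LFSR seeded with 0x0001, applied to everything after the 12-byte sync. So for a suspected sector you can XOR with the table and look for a plausible MSF in the header. A sketch (my own implementation, worth double-checking against the standard):

```python
def scrambler_table(length=2340):
    """ECMA-130 Annex B scrambler stream: LFSR x^15 + x + 1, seed 0x0001,
    bits emitted LSB-first into bytes; covers sector bytes 12..2351."""
    table = bytearray()
    lfsr = 0x0001
    for _ in range(length):
        byte = 0
        for bit in range(8):
            byte |= (lfsr & 1) << bit
            carry = (lfsr ^ (lfsr >> 1)) & 1      # bit0 XOR bit1
            lfsr = (lfsr >> 1) | (carry << 14)    # feed back into bit 14
        table.append(byte)
    return bytes(table)

def scramble(sector_tail):
    """XOR with the table; applying it twice restores the original."""
    table = scrambler_table(len(sector_tail))
    return bytes(a ^ b for a, b in zip(sector_tail, table))
```

Since scrambling is a fixed XOR, descrambling is the same operation, which is why a stable, reversible policy matters so much for preservation.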


(14 replies, posted in General discussion)

Jackal wrote:

- - Does the original disc play on a CD-i? What about a backup of the unfixed dump?

Original definitely works on CD-i. As for a backup of the unfixed dump, it boils down to whether a scrambled image can be burned as is; I'm not too familiar with writing limitations. If you can't write scrambled, a burned unfixed dump will not work, as burning will re-scramble the data track and it will be garbage.

- - This has 8.848 errors despite being fixed? What's going on?
All these 8848 sectors are zeroed even after shift correction, this is normal.

Jackal wrote:

There are some clear cases mentioned in the topic where bad mastering is causing dumps to descramble incorrectly and creating tons of erroneous data sectors, because there's some samples missing or added at random positions in a data track. That's the main focus of this topic, right?

Yes, this is correct.

Jackal wrote:

I don't remember if this also makes the original discs non-functional or if the drive performs some sort of on-the-fly correction to output a correct sector?

Yes, I think the same. The drive seeks for the sync frame and, if it doesn't find it, I think it tries to reposition, so these discs work on players and PCs. Actually, a good experiment would be to dump such a disc using the BE opcode as data and see what the drive returns - will do that.

Jackal wrote:

And as F1ReB4LL was pointing out also on Discord, there seem to be many cases of discs with scrambled data in the Track02 pregap after offset correction, for example: + +
And if I remember correctly, this disc also has garbage at the start of the audio. If you remove the bytes, the track matches the PS1 track, so it seems to have been ripped from the PS1 version. And IIRC the same was true for Fighting Force PC vs. PS1. But I'm not sure anymore, as it's 13-15 years since those were first dumped, time sure flies yikes
It's unclear whether this is caused by for example the gold master disc being a CD-R that was burned with track-at-once or something, but the most logical explanation is that an audio track was copied with offset garbage and then burned again. But this is a different issue then that we don't have to discuss here?

Exactly, this is a known issue to me, and it even happens on some official PSX discs. These we shouldn't touch anyway, as it's part of the audio.

Jackal wrote:

IIRC Truong / Ripper theorized that erroneous sectors with garbage bytes at the end of a data track were the result of a "split sector" or "half sector" or whatever they called it, that is part data / part audio tongue If you check the scrambled output, is it data and zeroes interleaved or does the data stop at some position and is it only zeroes after that?

So yes, it does look like a transition from data to audio, i.e. from scrambled to unscrambled, but there are some byte artefacts; I will make some hex screen captures later.

Jackal wrote:

But errors at the end of the data track also seem to be a different issue and since the remainder of the disc is audio tracks, performing offset shift corrections for such discs does not improve the dump in any meaningful way?

So this effect also happens between data tracks (no audio tracks) - this disc has only data tracks, the second track is CDXA video or something.

Jackal wrote:

There were some examples recently where DIC was leaving sectors scrambled inside a data track with correct sync/header and mostly correct data, resulting in different dumps than before. So the default descrambling behavior must have been changed by sarami at some point or it's a bug. If a sector is inside a data track and the vast majority of it is data, IMO there's no sense in leaving it scrambled and the descrambled data is indeed more meaningful.

Yes, I saw that; at some point I think sarami changed something in DIC. This is the most important thing I'm trying to "fix" here. Regardless of applying shift correction or not, I think we should not rely on sector content (or rely on it less) when deciding whether we should unscramble a sector. It's impossible to come up with a good decision algo if a sector is partially damaged, as pretty much any byte can be damaged, and this shift issue clearly demonstrates that.


(14 replies, posted in General discussion)

F1ReB4LL wrote:

That's an incorrect term. There's only 1 offset per disc, while you're talking about leftovers from earlier burning/dumping mastering stages. Those leftovers are physically present on the disc and need to be kept, since they belong to the disc data.

It does look to me like a mastering issue. I don't know the nature of it, but the reality is that throughout the disc the data "shifts". I think the "shift" term is correct to use, as it shifts data in relation to the previous position. Also, there are some data leftovers, but that's an incomplete sector. If you analyze these transitional sectors, you will see that it gradually screws up pairs of bytes aligned to the sample size.
Propose a better term smile, we have to call it something.

F1ReB4LL wrote:

All 3 are wrong, since those leftovers often contain duped data and a plain descrambling will lead to duped sectors. Also, there are quite a lot of cases when those leftovers appear as a part of a first post-data audio track, like

I don't think we are on the same page here. There are no duped sectors, and it's not random garbage. They are real damaged sectors with reasonable LBAs. Do you want to see some bytes? I believe I can demonstrate.

F1ReB4LL wrote:

The main dump should be always left as is, with anything unusual left scrambled.

This is the (1) you're talking about; this is exactly what you want. But I am saying that this is not what DIC does today, and sectors from that damaged portion end up half scrambled, half descrambled.


(14 replies, posted in General discussion)

user7 wrote:

Which option produces the most functional dump?

Probably any option; transitional sectors are usually at the end of the track and most likely originally zeroed and unused.

EDIT: to clarify, the most functional one would be any of these 3 proposed options but with the "--correct-offset-shift" option on - that's probably what you wanted to hear! smile


(14 replies, posted in General discussion)

As some of you are already aware, some CDs have a mastering issue where the write offset changes across the disc. For standardization purposes, I will be calling that "offset shift".
Historically we knew of a couple examples such as:
Philips Media Presents: CD-i Games: Highlights-Previews-Gameclips:
The Cranberries: Doors and Windows:
CD-I CD Interactive:

Working on redumper lead-in/lead-out dumping functionality for data discs, I noticed that the offset actually shifts multiple times, starting from the lead-in, and even propagates to the lead-out. Analyzing this is possible for discs with a data track due to the fact that if the first track is a data track, the lead-in is usually scrambled empty sectors, and, respectively, if the last track is a data track, the following lead-out is also scrambled empty sectors.

Example for

track 1 {  data }
  index 00 { LBA: [  -150 ..     -1], length:    150, MSF: 00:00:00-00:31:01 }
  index 01 { LBA: [     0 ..   2176], length:   2177, MSF: 00:02:00-00:31:01 }
track 2 {  data }
  index 00 { LBA: [  2177 ..   2324], length:    148, MSF: 00:31:02-21:55:39 }
  index 01 { LBA: [  2325 ..  98514], length:  96190, MSF: 00:33:00-21:55:39 }
track A {  data }
  index 01 { LBA: [ 98515 ..  98612], length:     98, MSF: 21:55:40-21:56:62 }


LBA: [ -2259 ..   -150], offset: -294, count: 2110
LBA: [  -149 ..   2175], offset: -613, count: 2325
LBA: [  2176 ..  98514], offset: -609, count: 96339
LBA: [ 98515 ..  98614], offset: -882, count: 100

As you can see from this example, the first offset shift happens between the lead-in and the pre-gap, and the others are "between" tracks, although a little imprecisely. As the lead-out is internally just another track, the shift propagates there too.

Digging deeper, I discovered that there are many more such offset shifting discs, and most, if not all, PC data discs where a couple of the last data track sectors are "corrupted" (descramble failed) are actually offset shifting discs. As redumper outputs detailed descramble statistics, I was contacted numerous times by different people, including our mods, to check a particular dump log to make sure it is correct, and analyzing these cases I realized it's the same offset shifting issue.

Why is this important?
Every offset shift transition spans multiple sectors gradually, and due to some peculiar mastering detail that we don't know yet, these sectors are randomly corrupted. Such corruption makes it difficult for the dumping software to decide what to do with such sectors and whether to attempt to descramble them.
As my recent findings hint that there are a lot of such discs, the purpose of this topic is to standardize how we preserve such transitions so that it follows preservation standards and is uniform across dumping software (which is, basically, DIC and redumper lol).

As of today, redumper dumps such discs with one global disc write offset, which is detected based on the first sector of the first data track (simplified). This is the default behaviour.
In addition to that, redumper provides an option "--correct-offset-shift", which follows offset shift changes, and such a dump can be added to the database as a (Fixed Dump) edition. Regardless of using this option or not, we need to standardize our handling of such transitions.

Here's how that can be handled:
1. Leave transitional sectors intact.
2. Force descramble of all transitional sectors
3. Intelligently detect if the sector is scrambled based on a combination of content criteria and if it is, try to descramble it

Right now, both DIC and redumper are doing a variation of (3). More often than not, this descrambles some sectors and leaves other sectors intact, i.e. you get a mix of god knows what, and there is no way to recover scrambled content that is 1:1 with the original. In addition, redumper does it differently, which allows it to descramble "better", but this is not the point here. The point is that (3) doesn't yield consistent results, and these results aren't 1:1 aligned with the source (scrambled) material.

On the other hand, (2) is the sweet spot, as it is consistent and the primary scrambled image can be reconstructed 1:1.

Finally, (1) is a compromise where we lose 1:1 but keep some sort of consistency.

I would like to hear opinions on this. Just please, let's keep on topic, I don't want the conversation to go elsewhere.


(3 replies, posted in News)

Lugamo wrote:

Do you know if MPF will add Redumper support in the future?

Yes, it's currently being built.


(3,507 replies, posted in General discussion)

sarami wrote:
superg wrote:

it belongs to the current track, not to the next one.

No. See other tracks.
It's not always but P channel of 1st sector of the track and 2nd sector of the track is different typically.

I understand your point that a P change is not always an indicator of a track change, no objections here. However, this is a supplemental thing here.

The main problem is that the Mode 2 Q (MCN) @ LBA 47437 separates track 15 and track 16. Here are the reasons why an MCN should always belong to the previous track:

1. When playing music, an ordinary audio CD player plays from left to right; MCN/ISRC frames repeat at least once per 100 frames and are informational. The current track number is stored as part of the player's state, and it switches only when it changes in Mode 1 Q, which will happen after the MCN/ISRC, as current track information is not stored in Mode 2 or Mode 3 Q.

2. A new track always starts with Mode 1 Q. I think I saw that somewhere in the rainbow books; unfortunately, I cannot find it right now. Although this statement is implied throughout the documentation, for instance, open the MMC-3 working draft, page 26:
" ADR=3 (0011b) – Mode-3 Q"
"The ISRC may only change immediately after the Track Number (TNO) has been changed."

I understand you're trying to do some guesswork based on the standard 150-sector gap size, or even establishing that P is shifted by 1 sector for each track so the start is always shifted in relation to that, but that is unsafe.
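Point 1 above can be modeled as a toy player state machine (illustrative only; frame representation is my own): the current track only updates on Mode 1 Q frames, so a Mode 2 (MCN) or Mode 3 (ISRC) frame is attributed to whatever track was current before it.

```python
def attribute_frames(q_frames):
    """q_frames: list of ('mode1', track_number) or ('mode2'/'mode3', payload).
    Returns list of (frame, attributed_track): the player's current track
    changes only when a Mode 1 Q frame carries a new track number."""
    current = None
    out = []
    for frame in q_frames:
        if frame[0] == "mode1":
            current = frame[1]
        out.append((frame, current))
    return out
```

With the sequence from the Metamor Jupiter log (Mode 1 for track 15, then the MCN frame, then Mode 1 for track 16), the MCN frame gets attributed to track 15.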


(3,507 replies, posted in General discussion)

sarami wrote:
bikerspade wrote:

A relatively recent dump of a particular PCECD disc (Metamor Jupiter) produced a bad track split, where the hashes for track 15 and track 16 are incorrect.

DB is incorrect. The pregap of the track 16 is 00:03:00, not 00:02:74.

LBA[047436, 0x0b94c]: P[00], Q[01150100386600103436d96b]{Audio, 2ch, Copy NG, Pre-emphasis No, Track[15], Idx[01], RMSF[00:38:66], AMSF[10:34:36]}, RtoW[0, 0, 0, 0]
LBA[047437, 0x0b94d]: P[00], Q[0200000000000000003767c1]{Audio, 2ch, Copy NG, Pre-emphasis No, MediaCatalogNumber [0000000000000], AMSF[     :37]}, RtoW[0, 0, 0, 0]
LBA[047438, 0x0b94e]: P[ff], Q[01160000027300103438dcb1]{Audio, 2ch, Copy NG, Pre-emphasis No, Track[16], Idx[00], RMSF[00:02:73], AMSF[10:34:38]}, RtoW[0, 0, 0, 0]

LBA 47437 belongs to the track 16.

sarami, LBA 047437 belongs to track 15. First of all, the standard says that if a track or index ends with a Q without positional information, it belongs to the current track, not to the next one. I'm too lazy to search through it, but trust me.
Next, check the P values from your subchannel decode: P[00] for LBA 047437 shows that it belongs to the previous track.
The dump in the DB is correct.


(20 replies, posted in General discussion)

Deterous wrote:

I've attached a zip of a list of serials for each BIOS. It would be useful for knowing which consoles supports which game in terms of backwards compatibility, etc.

Is the order of serials untouched in these files? I wonder if we can deduce some information based on the serials initial ordering e.g. if these were grouped based on some criteria in the firmware.