Using both a "Rerelease" tag and a date tag would add a lot of extra filename length to indicate mostly the same thing... Maybe an abbreviation for rerelease could be used? On Discogs they use a quick "RE" or "Re" tag to indicate this.

[Edit: Nvm, misread. "Rerelease" handling is already established... Not sure how you might, or if you would even want to, abbreviate "beta."]

2

(0 replies, posted in General discussion)

I made a DAT to keep track of all the track 00's and track AA's that have been generated for audio CDs recently. I'll update the file regularly as new CDs are added.

[EDIT: Nvm, the DAT has been added as part of the No-Intro daily pack, so if you're looking for the most up-to-date version, please check there.]

3

(17 replies, posted in General discussion)

Yeah, I did send you a link at one point, but you were pretty busy with redumper, so I didn't have much of a chance to talk with you about it in any really nuanced or in-depth way like I was hoping to. And yeah, there's probably been a lot added since then; it's been a more or less perpetual work in progress since around April.

Anyway, I wasn't going to, but I think screw it, I'm just going to go over my general thinking process on the topic, and see what your guys' thoughts are in return. I've touched on these things to a certain extent with sadikyo, bikerspade, and superg like I said, but I'm very curious what input others like Jackal and F1ReB4LL might have as well.

First, just a quick general explanation of the spreadsheet: I started it originally to keep track of all the audio CDs I was finding that had non-zero data in the lead-in and lead-out, in order to have plenty of data and test cases to work from once testing started in earnest on putting audio CD offset auto-detection into practice. But one thing that kept bothering me was how extreme these supposed "offset" values were on some discs. The highest of these values were throwing off the track alignments on some of my CDs by up to 2 seconds... I don't think it's even physically possible for glass mastering equipment to offset CD data by that much... The most likely explanation for these huge amounts of overflow data, as far as I can tell, is audio data being improperly trimmed by the audio mastering engineer at the studio, coupled with negligent Red Book compliance screening at the manufacturing facility.
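Just to put a rough number on that (standard Red Book audio, 44,100 stereo samples per second, 4 bytes per sample):

    2 seconds x 44,100 samples/s = 88,200 samples
    88,200 samples x 4 bytes/sample = 352,800 bytes (~345 KB) of audio shifted

Compare that to the drive and write offsets we normally deal with, which are typically on the order of tens to a few hundred samples.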

But at the very least, it's evident there's some difference in origin & nature happening there, and it's then just a matter of figuring out a reliable way to distinguish between these two types of overflow data: one being overflow data due to standard, run-of-the-mill manufacturing offset, and the other being overflow data due to sloppy mastering. So anyway, working on the spreadsheet with that question in mind, I started just collecting data on all the discs I could, hoping that some useful patterns might present themselves.

I also, to establish some context for the problem, started keeping track of other previously confirmed offset values (from data track-based discs), along with those discs' ringcodes (with a focus on mastering SIDs). The correlation between these two factors is obviously not consistent enough to ever make any conclusive judgments from, but my thinking was that, at the very least, maybe this data could be used as a "quality check" tool of some kind. So when we encounter one of these extreme -40,000 or whatever "offsets," we could simply ask: "Okay, is there a basis for this offset in question in the realities of the manufacturing process that led to the creation of this disc?" i.e. "Does the LBR that created the glass master for this disc have any history of creating any other glass masters with this same strange offset?" With the very limited evidence available from the data contained in the audio CDs themselves, I thought that at the very least, this could be a very useful practical grounding for when we're approaching these types of very strange edge cases.
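To give a concrete picture of the kind of "quality check" lookup I have in mind, something as simple as this would do (just my own toy sketch; the SIDs and offset values below are entirely made up for illustration):

    # Toy sketch of the quality-check idea: has the LBR, as identified by its
    # mastering SID, ever produced a confirmed disc with the offset in question?
    confirmed_offsets = {
        "IFPI L001": {-12, +6, +30},      # made-up example entries
        "IFPI L553": {+647, +694},
    }

    def offset_is_plausible(mastering_sid, candidate_offset):
        """True if this LBR has been seen producing this offset before."""
        return candidate_offset in confirmed_offsets.get(mastering_sid, set())

    print(offset_is_plausible("IFPI L001", -40000))   # False -> treat with suspicion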

I also started inspecting CDs using superg's pregap "perfect offset" method and keeping track of the results from that. That was a huge revelation, and as far as I know it's the only way to directly perceive, on an audio CD itself, what its original, true manufacturing offset was. The only downside is that it is unfortunately not readily visible on most audio CDs. It takes a very particular arrangement of the data to be visible, and even when it is, oftentimes the evidence is not entirely clear-cut. But ultimately, I was able to use the pregap method to determine with reasonable confidence the true offset of about 10-15% of the CDs that I inspected. This data is all documented in the spreadsheet as well, including a track-by-track breakdown of each disc that I inspected that way.

There are also some other things recorded, such as PVDs for some data discs, notes on offset-related "alt" pressings (i.e. CDs that are identical to each other in all ways but offset) including the offset values that separate them, and various other bits of info.

Before I post the spreadsheet though, I first want to explain some of the primary concerns that have occurred to me as I've been putting all this data together.

The biggest thing that concerns me, and I'll just say it plainly (I don't mean it as a criticism of anyone or anything, more as an observation of the ambiguity and difficulty of the problem), is that we have such a clean, tight, and conclusive method for determining the original manufacturing offset of data track CDs, but now that we come to audio CDs we may very well be left resorting to a "good enough" type of approach. There is so little evidence available to us to determine the true value for each disc that it is almost justifiable to simply say, "well, let's just shift what we can, capture all the data, and call it good."

The thing about that is, though, that with data track CDs we can of course determine the true offset directly and unambiguously, but even if we couldn't, no matter what offset value we applied (as far as I know, correct me if I'm wrong), it would still have no real tangible effect on the playback of the disc image itself; the file system is still accessed the same way, and ultimately nothing in the user experience changes.

With audio CDs, on the other hand, when you adjust the offset between the audio data and the subcode data, it has a very direct and tangible effect on the playback of the CD. Namely, it changes the point at which the audio on the album starts and ends, as well as all the start and end points of the tracks in between; essentially it shifts the entire framework of the album. This is the type of thing that music collectors and enthusiasts are going to notice and care about when they're perusing our database, or listening to their favorite albums that they've dumped and preserved using our methods. Particularly those massive 20,000+ sample offsets, but even the smaller random values (e.g. -11, -17, etc.), being arbitrary like that, will likely irk many music purists and preservationists if they can't be justified in any foundational way. Audio CD offsets are almost totally ambiguous, like I said and as we well know, but I think that, if anything, for all those reasons we should be even more careful, even more restrained and discretionary than we are with data track CDs, when it comes to the types of offset values that we allow to be applied to them.
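Just to make concrete what I mean by shifting the whole framework, here's a toy sketch of the general idea (my own illustration, not how DIC or redumper actually handles it): the TOC/subcode track boundaries stay put on the sector grid, while the audio underneath them slides by the applied number of samples.

    SAMPLES_PER_SECTOR = 588   # 2352 bytes per audio sector / 4 bytes per stereo sample
    BYTES_PER_SAMPLE = 4       # 16-bit left + 16-bit right

    def apply_offset(image: bytes, offset_samples: int) -> bytes:
        """Slide the audio by offset_samples, padding the exposed end with silence.
        (The sign convention here is arbitrary; the point is the relative shift.)"""
        shift = offset_samples * BYTES_PER_SAMPLE
        if shift >= 0:
            return image[shift:] + b"\x00" * shift
        return b"\x00" * -shift + image[:shift]

So after, say, a +88 correction, track 2 still begins at the same sector according to the TOC, but the 88 samples sitting at that boundary are different audio than before, and every track transition on the album inherits that same shift.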

Couple that with the fact I mentioned earlier that some of those bigger overflow data values are likely not even a result of offset at all... Anyway, to sum up my basic point: in my opinion, the number of samples that happen to be protruding from the program area is not justifiable evidence on which to determine and correct for the manufacturer offset value, and because the applied value can and does have a tangible effect on the accuracy of the playback experience being preserved, we should be exercising all due restraint in making these types of changes to audio CD dumps.

I have a few more things to say regarding this (and even a few ideas that might be workable for enhancing our accuracy), but I don't want to bombard you with everything all at once, and I'd first like to hear your thoughts on these specific concerns. I'll share the spreadsheet itself in my next post, but for now, thank you very much for reading. Cheers.

[EDIT: Got ignored. Well, for posterity's sake, and so it doesn't go to waste, here's the spreadsheet: audio CD test data, as well as a rough draft of some other things I was working on. Maybe I'll come back and finish it at some point just for fun.]

https://docs.google.com/spreadsheets/d/1Gknkby9nF3hW5CpVeVsPFJCn4gyADhLR8HF0LNRpgMU/edit?usp=sharing

4

(17 replies, posted in General discussion)

I don't really have much to add to this I suppose, although thanks for moving it to general. Very interesting to follow your discussion on the topic. You guys have already done most of the hashing out, it seems, so I'm kind of in a "too little, too late" sort of position. But I do still have a fairly sizable unshared spreadsheet where I've documented the majority of my testing and overall work on this problem. It seems a waste of a lot of energy and hours not to at least make it available somewhere it could potentially be useful, rather than just letting it rot away in my Google Drive.

But I don't want it to get lost in the shuffle of the site updates with iRobot, because I think it deserves a cursory perusal at the very least. So I'll wait to post the actual document, but I wanted to give this notice of my interest in the thread now, so the topic doesn't drift completely out of mind. Thank you. Cheers.

5

(3,516 replies, posted in General discussion)

Thank you both for the replies and clarification.

6

(2 replies, posted in General discussion)

In the case of this Kirby disc, when you popped it into an actual CD player, would it likely start playing already +12 samples into the audio?

7

(3,516 replies, posted in General discussion)

This is good to hear. That saves me a lot of concern. What had me worried, though, wasn't so much the potential lost data in the lead-in and lead-out (which I know does happen, but is uncommon, and when it does it's fairly easy to detect), but the alignment of the transitions between tracks. Those types of discrepancies seem like they would be much harder to detect, but slightly more significant to the integrity of the album itself. But I'm glad to hear that his claims were disputed. Sticking to the established AR convention seems like a sound route to me. : )

I do have a couple of further questions, and a related request, if it's not too much trouble. The first question is regarding pressing offsets. I know Nova already spoke to you a little bit about some of my questions there, Jackal, and I was pretty relieved by your response, tbh. It cleared up a misunderstanding I had and put a big question to rest, thankfully. But I'm still wondering about one aspect of it. From what I understand, the table of contents at the beginning of an audio CD lays out the LBAs (or timestamps?) for all the tracks throughout the disc. So when a disc has an offset pressing, say shifted +88 samples from another similar release, are the LBAs/timestamps in the TOC shifted by +88 as well?
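For reference, this is how I understand the addressing the TOC works in (the standard MSF scheme, 75 frames per second, with the usual 2-second offset), just so we're talking about the same units:

    # LBA <-> MSF as I understand it: 75 frames per second, MSF 00:02:00 = LBA 0,
    # so one frame is 44100 / 75 = 588 stereo samples.
    def lba_to_msf(lba: int) -> str:
        frames = lba + 150                      # 150-frame (2-second) pregap offset
        minutes, rest = divmod(frames, 75 * 60)
        seconds, frame = divmod(rest, 75)
        return f"{minutes:02d}:{seconds:02d}:{frame:02d}"

    print(lba_to_msf(0))        # 00:02:00
    print(lba_to_msf(16000))    # 03:35:25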

And the second question is regarding data potentially being shifted into the lead-in or lead-out... Would it be possible to add a feature to DIC where it automatically searches the lead-in/lead-out for non-zero bytes, and then, if necessary, adjusts accordingly (or at least tells you how to adjust manually)? Is there any technical limitation to something like that? If it is indeed possible, I'll go ahead and submit a request on GitHub, but if not, I don't want to waste anyone's time. Thanks again.
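Something along these lines is the kind of check I'm imagining (just a rough sketch of the idea, not actual DIC code; the filename is made up and assumes a raw dump of the sectors past the program area):

    BYTES_PER_SAMPLE = 4                           # 16-bit stereo sample

    with open("leadout.bin", "rb") as f:           # hypothetical raw lead-out dump
        data = f.read()

    # Find any stereo samples that aren't digital silence.
    nonzero = [i // BYTES_PER_SAMPLE
               for i in range(0, len(data), BYTES_PER_SAMPLE)
               if data[i:i + BYTES_PER_SAMPLE] != b"\x00" * BYTES_PER_SAMPLE]

    if nonzero:
        print(f"non-zero audio from sample {nonzero[0]} to {nonzero[-1]} past the program area")
    else:
        print("lead-out is clean (all zero samples)")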

8

(3,516 replies, posted in General discussion)

Hey there, I'm somewhat new to the community, but I have a question about audio CD drive offsets. I just discovered last night the old forum post where a guy called out EAC and AccurateRip for their drive offset references all being 30 samples off. I know this post made a pretty big splash in the field when it appeared back in 2006, so I'm assuming the DIC creators were aware of it... I'm just curious what the general opinion is of this info amongst the MPF/DIC tech crowd? I'm still very much a learner when it comes to all of this stuff, so beginner-friendly language would be very appreciated :P Thank you for your time.

9

I have a question regarding the discussion in this topic: http://forum.redump.org/topic/27775/add … oundtrack/

In this post, Enker mentions that it's not possible for a program like DIC to determine the write offset for most audio CDs. But I got to thinking about it... If that's the case, how is it that AccurateRip was able to implement its "cross-pressing verification" feature? When I use CueTools or dBpoweramp to rip/verify discs, those programs immediately know how the disc corresponds to other discs in the database, even ones with different offsets. CueTools will even tell you specifically the offset number of that particular pressing. Is Enker mistaken here? Or is there something else at play with AR's method that doesn't work with our approach here at Redump with DIC?
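The way I imagine the underlying idea (purely my guess at the general approach, not AccurateRip's or CueTools' actual algorithm) is basically sliding one rip against another, sample by sample, until the audio lines up:

    def find_pressing_offset(rip_a: bytes, rip_b: bytes, max_shift: int = 2000):
        """Brute-force the sample shift that aligns two rips of the same album.
        Very naive (real tools presumably use checksums over windows), but it shows the idea."""
        for shift in range(-max_shift, max_shift + 1):
            b = shift * 4                       # 4 bytes per 16-bit stereo sample
            a_part = rip_a[b:] if b >= 0 else rip_a
            b_part = rip_b if b >= 0 else rip_b[-b:]
            n = min(len(a_part), len(b_part))
            if n and a_part[:n] == b_part[:n]:
                return shift                    # offset between the two pressings
        return None

If the overlapping audio really is identical, the shift that makes it line up would, as far as I can tell, be exactly the kind of pressing offset number CueTools reports.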

10

(0 replies, posted in General discussion)

Would it be too much trouble to add a page where users can see (and potentially edit) all of their pending "New Disc" dumps? Just in case you make a mistake, decide to include an overlooked detail, or forget to add a link to the logs or something. It would also just be handy to have an easy place to refer to when determining what you still need to submit. Not a huge issue, but it would definitely be an appreciated convenience. Thank you for your time.