pnkiller78 wrote:

The sector in CDReader with the same sync/header as in px_d8 is sector 2.. look at this screenshot, what does that mean, did I do my calculations wrong?

edit: I just remembered something.. older plextor drives have a bug where the normal read mode outputs a different sector offset than d8 mode.. I'm pretty sure that +222 is the correct write offset after all.. but if possible, try confirming the offset using the old method..

I hope this doesn't affect any other users and any current dumps in the database..

Wrong.. you have to get cdreader: http://www.cdtool.pwp.blueyonder.co.uk/ … 1_2b20.zip and compare the complete sync/header:

Use 'View Sectors' to go to the first sector of the data track, then enable the 'Apply YB scrambling' box. This will scramble the header (the sync/header is now the same as the px_d8 output). Then you can determine the offset in sectors by looking for the sector with the same sync/header in cdreader.
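If you'd rather do that search programmatically (the old method, more or less), the trick is that the 12-byte sector sync is never scrambled, so you can look for it in a raw audio-mode read of the data track and derive the offset from its position. A rough Python sketch - the filename, the capture window and the sign convention are my assumptions, so sanity-check it against a disc with a known offset:

[code]
# offset_scan.py - rough sketch, not a drop-in tool.
# 'audio_read.bin' is a hypothetical raw capture (2352 bytes/sector,
# audio mode) taken around the start of the data track.
SYNC = bytes([0x00] + [0xFF] * 10 + [0x00])  # sector sync, never scrambled

data = open('audio_read.bin', 'rb').read()
pos = data.find(SYNC)
if pos < 0:
    raise SystemExit('no sync found - the capture probably missed the data track')

shift = pos % 2352          # byte displacement of the sector boundary
if shift > 2352 // 2:       # fold large shifts into small negative ones
    shift -= 2352
combined = shift // 4       # 4 bytes per 16-bit stereo sample

# factory write offset = combined offset - drive read offset,
# e.g. a (hypothetical) combined +252 on a +30 Plextor gives +222
print('combined offset: %+d samples' % combined)
print('write offset:    %+d samples' % (combined - 30))
[/code]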

I think the write offset is just +222 (and it's a common IBM PC offset)

themabus wrote:

i wish it would be true smile ...since there was no lc on jpn psx and all, it would be something. i actually dump the subchannel twice, but so far, besides Q-ADDRESS=2 and a messed up layout, i haven't noticed anything unusual. i guess that would mean non-zero RSTUVW channels. in that case it should be easy to scan through. i'll write a program and give it a try.
about the 1st and last track, it's true they are very alike, but since we read in raw and the last track hasn't got those empty sectors at the end, it would not show in the db, only the sizes are close.
gaps - well, what can we do? EAC does not pick them up. and since those last 4 or 5 cds i added yesterday, i'm actually now 99% sure this is the right thing to do. along with everything i posted before, there's one more case where EAC would fail - when the absolute time frame at the end of a gap repeats twice, EAC would decrease the gap by a frame. and about those 02:74 gaps where Audio changes into Data - there's always 75 empty audio sectors and 150 empty data sectors (75 empty audio and something, some bytes to fill the last sector with audio data). the only difference from 3:00 second cds is that the 1st empty audio sector is not set as a gap in the subchannel. so it's like those cds were prepared for the standard 3 second transition area, but somewhere late in the process it just didn't happen smile . only cds with ring imprint * R1? V have this, so i think it was a bug. this pretty much reduces every oddity i have seen so far to 3s audio-data; 2s data-audio; 0s audio-audio. with the sole exception where audio-data gaps were 2s, but that's ok, i guess. so if i wasn't sure about this before, because 2:74 gaps seemed perfectly valid, now i think it's better to just go with 3:00.

I'm sure you will do what's right.. anyway, maybe existing rip tools like turborip can also be used to rip the subchannels, but they will have to be cleaned so there will be no random errors. I'm looking forward to seeing what kind of scan tool you will come up with.

pepsidrinker wrote:

Hey Vigi, does this work with just audio cds? When I get my plextors I want to retry dumping the Jaguar games, and it would also be nice to properly dump video game soundtracks and such.

- Works on all discs with data tracks (no audio tracks needed)
- All data track sectors can be used for detection (in the old method it was only possible to use the first track02 pregap sector for this)

So you can't use it for soundtracks if they don't have a data track. There has to be a data track for it to work.

[00:32] <Tolvatar> too bad, pce games have data in subchannels
[00:32] <Tolvatar> these dumps are not correct
[00:32] <Tolvatar> i think
[00:33] <Tolvatar> on super cd-rom games, the second track and the last one are similar, only a few bytes differ
[00:33] <Tolvatar> these bytes can be found in the subchannels
[00:33] <Tolvatar> and manually changing the gaps seems awful
[00:34] <Tolvatar> thanks anyway for the link

any thoughts on this?

Hi.. I'm not sure.. I don't think there's any difference between them except the speed.. just get whichever one is cheaper.. have you tried looking on the plextor site to see if there are any firmware updates for these drives?

If you check this topic: http://forum.redump.org/viewtopic.php?pid=4207#p4207 you will see that gigadeath has a PX-W4824A that does support everything. I'm not sure how it is different from the Plextor PX-W4824TA and what the T stands for.

732

(3 replies, posted in General discussion)

weren't they red or grey dumps?

Eidolon wrote:

Yes, I see I have phrased that badly. What I mean is the following: If the Inn database contains intelligent checksums for a particular game, you can test your rip against that. If the checksums are the same, you KNOW that your rip - even if it has a different audio offset - has come from an original CD (i.e. it is a "good" rip which you can keep); if not, it likely came e.g. from a ripped, burnt and re-ripped ISO/MP3 fileset (which is a "bad" rip you should delete immediately ;-).

You'd be surprised to see how many cd rips out there have 1 or 2 audio tracks with errors on them. You'd have to be extremely lucky to find an image somewhere that has a matching intelligent checksum. Still, you're right that the 'intelligent' checksum makes it easier because the offsets are ignored.

Anyway, I'm looking forward to your tool, and it's much appreciated that you chose to add Redump.org support to the tool instead of just going your own direction. I'm sure that if all the issues are ironed out it can become an easy alternative to the current method.

Good job on your post, but it would be nice if you could explain the following part, because I don't understand what you mean:

However, for any existing rip where the drive read offset is not known, it can be determined if it originally came from an original CD or not! This is a significant advantage over the redump.org database.

Also, I hope CDRWin is capable of detecting the correct gaps on all drives, because I don't see any mention of it in your post. Gaps can be a real pain (at least with EAC).

Also, have you considered writing your tool in C for cross-platform compatibility?

735

(55 replies, posted in General discussion)

gigadeath wrote:

Now the thread can be closed.

You read my thoughts. This discussion is getting out of hand.

I respect Eidolon's/Snake's input on this matter. We should all wait until they come up with a tool to postprocess the cdrwin output data before we can do any real comparisons between the two methods. I hope they can also make this tool check whether the last audio sectors indicate that data has been cut off.

Because there is no such tool yet, my current conclusion is that our method is still the most reliable one, because EAC will warn you when there is a problem and data is cut off. CDRWin doesn't do this, and there is no supplementary tool yet that does. And this has to be automated, because you can't expect people to manually check each disc to see if any data is cut off. Also, keep in mind that if a drive doesn't support overreading, the dump should at least be verified using 2 drives. I'm not sure the CDRWin method will be any faster in the end. I'm looking forward to the results.

Topic closed.

736

(55 replies, posted in General discussion)

I do like CDRWin, I first used it maybe 6 or 7 years ago (if not longer) and I also used it to dump the gdrom discs that are in the database. However, I don't see how CDRWin is any better than other cue/bin tools like IsoBuster, Fireburner, etc., because they all seem to create the same output.. they're all pretty basic and don't have the features that EAC has when it comes to dealing with errors. (Your Burai disc, for instance: it doesn't have 100% track quality everywhere, so does it still give the same 'intelligent' audio checksum as EAC?)

737

(55 replies, posted in General discussion)

Eidolon wrote:

If I find out that there is an easier way to achieve the same goals, and it can be proven that it works just as reliably, and it even provides the same results and checksums as the REDUMP method (and thus could be used to submit data to both redump, tosec and whatever else) why not pursue it?

TOSEC won't change their dumping method. Why would you care about submitting data to them if almost all of their dumps are flawed? (isn't that what we agreed on?)

Of course it would be nice to have a faster and more reliable dumping method, but I don't see how an obsolete tool like cdrwin (can it even detect proper gaps?) can improve things. Even though I like the results of your test, I would still like to see a time comparison that includes splitting the tracks and creating a proper cuesheet etc., until the output is identical to Redump's. Besides, you will always need a drive that supports certain features (overreading, etc.) if you want to be sure that the output is correct. I'm still waiting for the PerfectRip successor myself. In the meantime, I'll keep my eyes open for an improved method from you and Snake.

738

(55 replies, posted in General discussion)

In the end it all seems to come down to keeping the first and last audio samples, which is what CDRWin appears to fail at (because there's no offset correction).

@Snake.. I understand from your posts on Eidolon's Inn how offsets aren't really important for your projects, but you have to understand why they are important to us.

First of all, it's for documentation. We like the idea of having the audio tracks put back into the position they were in before mastering. Even though in the end it only comes down to a few shifted bytes, if you split dumps into separate tracks and store them this way, there can be great benefits. For instance, for a 600mb game with 550mb of audio tracks identical to another dump's, you can just get the data track and import the audio from the other dump, without even having to touch a patch tool.

If we were using TOSEC's method, we wouldn't be able to know if a 600 mb game has 550 mb of audio identical between the PAL and NTSC versions (example: http://redump.org/disc/469/ + http://redump.org/disc/475/). Of course you could try to make a patch, but you lose some insight into how these dumps actually differ.

You could just assume that all dumps are different if the disc isn't exactly the same, but in the end that won't help you (at least not if you want to store the cd-images of the dumps somewhere), and here's why: there are plenty of games that have only one difference: the offset. If you dumped these games using TOSEC's method, you would have 2 separate dumps and need 2 dump entries. Using our method there's just 1 dump and 1 entry: http://redump.org/disc/447/ http://redump.org/disc/1777/

If you're planning to include 'intelligent checksums' for both data and audio in your database, then I guess you will have the same advantages as our method for games where all audio tracks are identical, but there are also a lot of games where 1 or 2 audio tracks don't match and all the others do. This is why we prefer offset correction and separate tracks. If you don't care about seeing the similarities between two dumps, you can just go ahead with your proposed method, but if you're planning on keeping the cd-images or spreading them around, OR if you decide to care about showing people the similarities between dumps, you now know the advantages of our method.

It's not about good or bad. If you actually pay attention to the offsets instead of ignoring them, you are able to remove all the irrelevant differences between dumps and get an ideal dump. Let the offsets help you, by separating them from the dump and storing them as values, not by ignoring that they are there! tongue (the factory write offset is a relevant piece of information about the CD that should be stored in the db).

All tracks have a factory write offset. The only difference is that you don't see it on data tracks, because data isn't read in audio mode (you can still do that and detect the write offset for information purposes using this method: http://forum.redump.org/viewtopic.php?id=2057). The point I'm trying to make is that what we are really doing is treating audio the same way drives treat data (by removing the offsets that don't affect the data track).

Yeah, I don't see any talk about gaps. CDRWIN doesn't split tracks, but can it detect all gaps correctly?

Anyway, nice job on comparing the methods. If you can come up with an easier/faster way of getting the same results as our current method, that would be really nice I guess.

740

(55 replies, posted in General discussion)

chungy wrote:

Does IsoBuster not already re-read problem sectors? I haven't used it on discs with problem sectors yet, but I know that cdrdao in raw-reading mode (btw it can read the subchannel data, if it's of any interest) *will* re-read problem sectors multiple times until it gets a good copy (or correctable, either way...).

It appears that some drives do this and some don't. I know that my Plextor drive does it.

741

(55 replies, posted in General discussion)

Heya Snake.. Welcome to the forums.. I have to say you defended yourself quite well there. Here's my feedback tongue

Snake wrote:

Again, this doesn't apply to SegaCD. The first audio track pregap will contain silence.

Theoretically yes, but there's a chance that data was moved into the pregap during mastering, so if you don't correct the write offset, data may be cut off if the pregap isn't included. (We've seen data in the pregap before write offset correction several times by now, just so you know)

Snake wrote:

Out of interest - is there any reason why you guys are NOT doing this?

Separate tracks have proven to be more convenient - at least for our project. There are a lot of games that have the same audio tracks across different regions and versions, and this wouldn't be visible if we only listed combined images. We've considered adding the combined checksums a couple of times, but we never got around to doing that.

Anyway, you got our reasons for dumping 2352 - full main channel, same size for all sector types, easier to combine.. You're right that CDMage should be used to check RAW tracks for errors (except maybe when the checksum matches after dumping twice).

742

(55 replies, posted in General discussion)

I think the most important reasons for dumping data tracks in RAW 2352 are:

- no Frankenstein dumps like gigadeath said.. the full main channel is dumped.
- you can easily combine the tracks of a dump made using the Redump method into a single image using copy /b. Then you have a proper image of the complete cd with correct gaps and offset. This wouldn't be possible with 2048 tracks without converting them to 2352 first.
- the extra error correction data can be used to verify that the track data is correct, without having to dump the track twice (a sketch of such a check follows below this list).
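To illustrate that last point, here's a minimal sketch of the kind of check CDMage does on RAW tracks, verifying each Mode 1 sector against its EDC field. The table construction follows Neill Corlett's well-known ecm.c; the filename is a placeholder, and Mode 2 forms are left out to keep it short:

[code]
# edc_check.py - minimal Mode 1 EDC verification sketch.
import struct

edc_lut = []
for i in range(256):
    edc = i
    for _ in range(8):
        edc = (edc >> 1) ^ (0xD8018001 if edc & 1 else 0)
    edc_lut.append(edc)

def edc(block):
    crc = 0
    for b in block:
        crc = (crc >> 8) ^ edc_lut[(crc ^ b) & 0xFF]
    return crc

checked = bad = 0
with open('track01.bin', 'rb') as f:   # hypothetical raw 2352 bytes/sector track
    while True:
        sector = f.read(2352)
        if len(sector) < 2352:
            break
        if sector[15] == 1:            # mode byte; Mode 1 EDC covers bytes 0..2063
            stored = struct.unpack('<I', sector[2064:2068])[0]
            if edc(sector[:2064]) != stored:
                bad += 1
            checked += 1
print('%d Mode 1 sectors checked, %d EDC errors' % (checked, bad))
[/code]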

I really don't know why 2048 would be better than 2352 (it's smaller yes, but what else?). Maybe you could include both 2352 and 2048 checksums in your database, but this would only take you more time.

743

(55 replies, posted in General discussion)

chungy wrote:

(Technically speaking, redump.org DVD images are all wrong since they don't include raw DVD sectors, which is far more difficult to access and not all DVD drives do it in the same manner (essentially every vendor has their own proprietary commands); who's to say that non-chipped PS2s actually check data that's not in the 2048-byte user data area of a DVD sector?)

You're right, RAW reading DVDs IS possible, but it's very difficult to accomplish: http://x226.org/?p=17 . I think the PS2 reads the DNAS ID in a special way (it should be in the user data area when extracting, but it's not). Anyway, there's no point in including the DNAS ID either, because it can be injected afterwards, and the images can't be verified when it's included (I think).

If anyone has some info on extracting DVDs at 2064 bytes/sector using custom firmware, and what the advantages are, plz let us know smile
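For reference, as far as I know the 2064 bytes of a raw DVD data frame break down like this (going by the ECMA-267 spec from memory, so treat it as a sketch):

[code]
# dvd_frame.py - sketch of splitting one raw 2064-byte DVD data frame.
def split_frame(frame: bytes) -> dict:
    assert len(frame) == 2064
    return {
        'id':      frame[0:4],       # sector number + information bits
        'ied':     frame[4:6],       # error detection code over the ID field
        'cpr_mai': frame[6:12],      # copyright management information
        'data':    frame[12:2060],   # the 2048 bytes normal drives return
        'edc':     frame[2060:2064], # error detection code over the whole frame
    }
[/code]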


Offtopic:

ps. after some google'ing I came across this thread: http://assemblergames.com/forums/showth … p?p=253548
I have a Datel PS2 cd here that has unreadable sectors in the same region, maybe it's possible to extract them in d8 and create a bootable copy big_smile

edit: I managed to extract the sectors and get the same patterns as the other guy, but according to this thread http://club.cdfreaks.com/f52/how-datel- … on-147005/ this data has no real purpose after all

744

(55 replies, posted in General discussion)

daishadar wrote:

Dear lord, as someone who's spent a decent amount of time archiving TOSEC dumps, this information is very disturbing.

If you diff'd a TOSEC ISO dump and a Redump dump, any idea how different the two files would be?  If the difference is very small then it should be possible for someone with a full set of both dumps to create patch files to convert individual TOSEC dumps to Redump dumps.  This way, TOSEC dumps could be merged to more accurate Redump dumps over time.

It's amazing that it is this complex to create 1:1 backups of CDs.  It's just astounding.  Did the original creators of the ISO 9660 standard never see this coming?  From a high level perspective, CDs just store bits- just read all the bits off of it!  smile

In short, all data-only dumps like 3DO are easy to convert if you, for instance, mount the cuesheet in Daemon Tools and then raw extract.

The problem with adjusting the audio is that you'll have to know the write offset of the original disc. With systems like SNK Neo-Geo CD this is often pretty easy to detect from the image itself, because the audio data always tends to start at a certain position in the track (for instance, if the audio data started 2 samples after the track start, it was safe to conclude that the write offset was +2, which is also a common PSX write offset).
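A quick sketch of that trick - it assumes the track was extracted raw without write offset correction and really does begin with digital silence (the filename is hypothetical):

[code]
# guess_offset.py - infer the write offset from leading silence in an audio track.
data = open('Track02.bin', 'rb').read()   # hypothetical raw audio track

silent = 0
while silent < len(data) and data[silent] == 0:
    silent += 1

# 4 bytes per 16-bit stereo sample; if the audio was meant to start at
# sample 0, the count of fully silent samples is the write offset
# (e.g. audio starting 2 samples in -> write offset +2, as described above).
print('audio data starts %d samples in' % (silent // 4))
[/code]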

Other discs with larger write offsets have a pretty big chance of having missing samples at the start or the end of the audio. When I once helped a TOSEC guy convert the SNK set to the Redump format, we came across several discs that needed their audio redumped because samples were cut off at the start or the end. I think other systems like Saturn will have similar issues. Of course it should be possible to create a patch, but it would only make sense to convert TOSEC dumps to Redump ones for collecting purposes (and for saving bandwidth).

If a TOSEC dumper cares about the accuracy of his dumps, he can always come here and redump them using the better method. We will never convert/steal any dumps from other projects. I know that some of the TOSEC dumpers are aware of the differences, but they don't seem to care enough about them to start redumping. Maddog from TOSEC once gave me the same explanation as Eidolon did a couple of posts before: who really cares about 1/75th of a second?.. I think this thread makes clear that we are the only project atm that DOES care.

themabus wrote:

oh that would be great, even if for a short time while i get my pce cds done

short time? heh.. you're one of the biggest contributors.. I think moderator status is the least we could give you

slightly offtopic: do you also have SNK Neogeo CD discs?

edit: Dremora gave you moderator and added delete function, but I already deleted the dupe PCE for you smile

746

(55 replies, posted in General discussion)

I agree that there's no real point in switching to a poor man's way of dumping cd's just because the other one presumably takes a bit longer.. but I do like the 'smart checksumming' idea that you came up with (checking the data integrity of a cd or image on-the-fly by comparing blocks). However, I agree with themabus that without read+write offset correction you will soon run into problems where the start and end blocks of the audio have missing data, making crc comparison impossible. Also, if I recall correctly, the usual cue/bin tools out there (not sure if this includes cdrwin) don't append the gaps correctly and skip the track02 pregap just like old EAC versions did, so the chance of missing samples is even bigger.

As for scratches and not being able to dump audio correctly: EAC's error detection and rereading mechanism is pretty decent and helped me through a lot of scratched cd's that would be impossible to dump correctly with conventional tools like cdrwin. Also, you forgot that C2 can also be used to detect errors (PerfectRip uses C2 to check for corruption).

themabus wrote:

ok, i resubmitted everything + 2 new cds, but only the crcs for data tracks should change
i'll keep checking every cd for that exact data sector number in gaps, not assuming 150 <- scrap that, let's do it perfect smile

woops.. now the db is a mess sad I wish you could have just edited the old entries.. Dremora should definitely give you moderator rights and perhaps remove -v- and cHr from this status because they're never around (anymore)

748

(55 replies, posted in General discussion)

There is no such thing as "intelligent checksums". There has to be a standard for how reference checksums are calculated before you can disregard the data offset. We prefer to take both the read and the write offset into account when determining the reference, allowing audio tracks (when saved using the standard) to have identical checksums across different regions/games, instead of just looking at the data integrity the way you are planning to do. It makes me wonder if the speed benefit will really be that great, because even a minor scratch on any of your cd's will give you problems dumping and verifying them the 'fast' way. Sooner or later you will probably end up using EAC after all. Anyway, good luck with your projects.

As for GoodGen, I like No-Intro's dat better, because it's more accurate. And then I'm not even talking about MAME and how they want to preserve the Sega Genesis roms (splitting the data into separate files exactly like they are stored on the actual rom chips). Most people consider this 'pointless' too, while others don't (see the resemblance?).

Have you set it to Action > 'Append Gaps To Next Track'? And make sure you always do Action > 'Detect Gaps' after inserting a new disc

It shows a gap of 0 seconds instead of 2 for the audio track on the screenshot, so maybe it failed to detect.. Other discs shouldn't have this problem

Are you using the latest version of EAC?

750

(55 replies, posted in General discussion)

Also, the replies I got from the guys at redump.org lead me to the conclusion that it is simply too much hassle to go after "perfect" dumps. There is no such thing as a perfect dump.

Taking Eidolon's quote from his forums, I think he misinterpreted my post. We DO believe that our method creates perfect dumps. Perfect in the sense that every single byte of data on the cd is secured (except the lead-in, lead-out and subchannel data, which are irrelevant).

To sum up the differences again between our method and the TOSEC one:

- TOSEC extracts the cooked 2048 data track from a raw image, dropping the 304 bytes of information per data sector that are on the cd (many systems like PSX require the complete main channel, i.e. all 2352 bytes/sector, to be ripped; there's no real point in having a 2048 bytes/sector data track for preservation, regardless of what other people say.. audio is 2352 bytes/sector and so is data).
- TOSEC leaves out the track02 pregap, because older EAC versions were not capable of dumping it. This pregap can contain relevant data which will be left out in the resulting dump.
- TOSEC adds to each audio track a 44-byte RIFF (wav) header that isn't on the cd.
- TOSEC corrects the read offset only. Redump corrects both the read and the write offset, allowing audio tracks to be put back into the position they were in BEFORE manufacturing. This has proven to be a more sensible way of dealing with audio than just correcting the read offset, because after read+write offset correction we have audio tracks that start exactly at the intended position (often right after the pregap, at byte 352800 for instance - see the arithmetic after this list) and that have matching checksums across different regions for discs with different write offsets (these tracks would have different checksums using the TOSEC method). Some examples: http://redump.org/disc/1777/ http://redump.org/disc/447/ , all Mortal Kombat Trilogy versions, and lots more.
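The arithmetic behind that byte position (and behind offset correction in general) is simple enough to check by hand:

[code]
# pregap_math.py - the numbers behind the 'byte 352800' figure above.
SECTOR_SIZE = 2352           # raw bytes per sector
PREGAP = 2 * 75              # standard 2-second pregap, at 75 sectors per second
print(PREGAP * SECTOR_SIZE)  # 352800 - where track 2 audio should begin

# a write offset of +222 samples means the audio sits 222 * 4 = 888 bytes
# too late in the image; write offset correction shifts it back by that much
print(222 * 4)               # 888
[/code]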

If you still think it's too much work for you (even with PerfectRip) then that's alright. I just wanted to explain why we believe that our method IS perfect. Perfect in the sense that the full contents of the cd are preserved in the best possible way. There is still room for improvement in the speed and difficulty of the dumping process, but the output that is achieved is final.