676

(55 replies, posted in General discussion)

chungy wrote:

Does IsoBuster not already re-read problem sectors? I haven't used it on discs with problem sectors yet, but I know that cdrdao in raw-reading mode (btw it can read the subchannel data, if it's of any interest) *will* re-read problem sectors multiple times until it gets a good copy (or correctable, either way...).

It appears that some drives do this and some don't. I know that my Plextor drive does it.

677

(55 replies, posted in General discussion)

Heya Snake.. Welcome to the forums.. I have to say you defended yourself quite well there. Here's my feedback tongue

Snake wrote:

Again, this doesn't apply to SegaCD. The first audio track pregap will contain silence.

Theoretically yes, but there's a chance that data was moved into the pregap during mastering, so if you don't correct the write offset, there's a chance that data is cut off if the pregap isn't included. (We've seen data in the pregap before write offset correction several times by now, just so you know)

Snake wrote:

Out of interest - is there any reason why you guys are NOT doing this?

Separate tracks have proven to be more convenient - at least for our project. There are a lot of games that share the same audio tracks across different regions and versions, and this wouldn't be visible if we only listed combined images. We've considered adding the combined checksums a couple of times, but we never got around to doing it.

Anyway, you got our reasons for dumping 2352 - full main channel, same size for all sector types, easier to combine.. You're right that CDMage should be used to check RAW tracks for errors (except maybe when the checksum matches after dumping twice).

678

(55 replies, posted in General discussion)

I think the most important reasons for dumping data tracks in RAW 2352 are:

- no Frankenstein dumps like gigadeath said.. the full main channel is dumped.
- you can easily combine the tracks of a dump that was made using the Redump method into a single image using copy /b. Then you have a proper image of the complete cd with correct gaps and offset. This wouldn't be possible with 2048 tracks without converting them to 2352 first.
- the extra error correction data can be used to verify that the track data is correct (without having to dump the track twice).

I really don't know why 2048 would be better than 2352 (it's smaller, yes, but what else?). Maybe you could include both 2352 and 2048 checksums in your database, but that would only cost you more time.
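To illustrate the third point: the extra 304 bytes in a raw Mode 1 sector include an EDC checksum that lets you verify a track without a second dump. Below is a minimal Python sketch of such a check, assuming the standard CD-ROM Mode 1 sector layout and the EDC polynomial used by tools like cdrdao; the function names are made up for illustration.

```python
# 12-byte sync pattern at the start of every raw CD-ROM sector
SYNC = bytes([0x00] + [0xFF] * 10 + [0x00])

def _edc_table():
    # Build the byte-wise lookup table for the CD-ROM EDC CRC
    # (polynomial 0xD8018001, bit-reversed, as in cdrdao/ecm)
    table = []
    for i in range(256):
        edc_val = i
        for _ in range(8):
            edc_val = (edc_val >> 1) ^ (0xD8018001 if edc_val & 1 else 0)
        table.append(edc_val)
    return table

_EDC_LUT = _edc_table()

def edc(data: bytes) -> int:
    """Compute the 32-bit EDC over a byte range."""
    crc = 0
    for b in data:
        crc = _EDC_LUT[(crc ^ b) & 0xFF] ^ (crc >> 8)
    return crc

def check_mode1_sector(sector: bytes) -> bool:
    """Verify a raw 2352-byte Mode 1 sector using its own redundancy."""
    if len(sector) != 2352 or sector[:12] != SYNC or sector[15] != 1:
        return False
    # For Mode 1, the EDC covers bytes 0..2063 and is stored
    # little-endian at offset 2064
    stored = int.from_bytes(sector[2064:2068], "little")
    return edc(sector[:2064]) == stored
```

A cooked 2048-byte track has none of this redundancy, so a single read error goes unnoticed unless you dump twice.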

679

(55 replies, posted in General discussion)

chungy wrote:

(Technically speaking, redump.org DVD images are all wrong since they don't include raw DVD sectors, which is far more difficult to access and not all DVD drives do it in the same manner (essentially every vendor has their own proprietary commands); who's to say that non-chipped PS2s actually check data that's not in the 2048-byte user data area of a DVD sector?)

You're right, RAW reading DVDs IS possible, but it's very difficult to accomplish: http://x226.org/?p=17 . I think the PS2 reads the DNAS ID in a special way (it should be in the user data area when extracting, but it's not). Anyway, there's no point in including the DNAS ID either, because it can be injected afterwards, and the images can't be verified when it's included (I think).

If anyone has some info on extracting DVD's 2064 bytes/sector using custom firmware, and what the advantages are, plz let us know smile


Offtopic:

ps. after some google'ing I came across this thread: http://assemblergames.com/forums/showth … p?p=253548
I have a Datel PS2 cd here that has unreadable sectors in the same region, maybe it's possible to extract them in d8 and create a bootable copy big_smile

edit: I managed to extract the sectors and get the same patterns as the other guy, but according to this thread http://club.cdfreaks.com/f52/how-datel- … on-147005/ this data has no real purpose after all

680

(55 replies, posted in General discussion)

daishadar wrote:

Dear lord, as someone who's spent a decent amount of time archiving TOSEC dumps, this information is very disturbing.

If you diff'd a TOSEC ISO dump and a Redump dump, any idea how different the two files would be?  If the difference is very small then it should be possible for someone with a full set of both dumps to create patch files to convert individual TOSEC dumps to Redump dumps.  This way, TOSEC dumps could be merged to more accurate Redump dumps over time.

It's amazing that it is this complex to create 1:1 backups of CDs.  It's just astounding.  Did the original creators of the ISO 9660 standard never see this coming?  From a high level perspective, CDs just store bits- just read all the bits off of it!  smile

In short, all data-only dumps (like 3DO) are easy to convert if you, for instance, mount the cuesheet in Daemon Tools and then do a raw extraction.

The problem with adjusting the audio is that you have to know the write offset of the original disc. With systems like SNK Neo-Geo CD this is often pretty easy to detect from the image itself, because the audio data always tends to start at a certain position in the track (for instance, if the audio data starts 2 samples after the track start, it's safe to conclude that the write offset is +2, which is also a common PSX write offset).

Other discs with larger write offsets have a pretty big chance of missing samples at the start or the end of the audio. When I once helped a TOSEC guy convert the SNK set to the Redump format, we came across several discs whose audio needed to be redumped because samples were cut off at the start or the end. I think other systems like Saturn will have similar issues. Of course it should be possible to create a patch, but it would only make sense for converting TOSEC dumps to Redump ones for collecting purposes (and for saving bandwidth).
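The detection heuristic above can be sketched roughly like this (a minimal Python sketch, assuming the audio data is preceded by pure digital silence; the function names are hypothetical). CD audio is 16-bit stereo, so one sample = 4 bytes.

```python
SAMPLE_BYTES = 4  # 16-bit stereo: 2 bytes x 2 channels

def first_nonzero_sample(track: bytes) -> int:
    """Index of the first sample that isn't digital silence, or -1."""
    for i in range(0, len(track), SAMPLE_BYTES):
        if track[i:i + SAMPLE_BYTES] != b"\x00" * SAMPLE_BYTES:
            return i // SAMPLE_BYTES
    return -1

def guess_write_offset(track: bytes, expected_start: int = 0) -> int:
    """If the data starts N samples later than expected, the write
    offset is likely +N (as in the +2 PSX example above)."""
    return first_nonzero_sample(track) - expected_start
```

This only works when you know where the audio is supposed to start; for most discs the offset has to be determined by other means.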

If a TOSEC dumper cares about the accuracy of his dumps, he can always come here and redump them using the better method. We will never convert/steal any dumps from other projects. I know that some of the TOSEC dumpers are aware of the differences, but they don't seem to care enough about them to start redumping. Maddog from TOSEC once gave me the same explanation as Eidolon did a couple of posts back: who really cares about 1/75th of a second?.. I think this thread makes clear that we are the only project atm that DOES care.

themabus wrote:

oh that would be great, even if for a short time while i get my pce cds done

short time? heh.. you're one of the biggest contributors.. I think moderator status is the least we could give you

slightly offtopic: do you also have SNK Neogeo CD discs?

edit: Dremora gave you moderator and added delete function, but I already deleted the dupe PCE for you smile

682

(55 replies, posted in General discussion)

I agree that there's no real point in switching to a poor man's way of dumping cd's just because the other one presumably takes a bit longer.. but I do like the 'smart checksumming' idea that you came up with (checking the data integrity of a cd or image on-the-fly by comparing blocks). However, I agree with themabus that without read+write offset correction you will soon run into problems where the start and end blocks of the audio have missing data, making crc comparison impossible. Also, if I recall correctly, the usual cue/bin tools out there (not sure if this includes cdrwin) don't append the gaps correctly and skip the track02 pregap just like old EAC versions did, so the chance of missing samples is even bigger.
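To see why offset correction matters for this kind of comparison: the same audio shifted by even one sample produces a completely different checksum, so dumps made on drives with different offsets would never match block-for-block. A quick Python illustration (the data here is just a stand-in, not real audio):

```python
import zlib

audio = bytes(range(256)) * 4        # stand-in for a chunk of audio data
shifted = b"\x00" * 4 + audio[:-4]   # same data, one sample (4 bytes) late

# Both buffers are the same length and hold (almost) the same audio,
# yet a straight block checksum comparison fails
match = zlib.crc32(audio) == zlib.crc32(shifted)
```

Without normalizing the offset first, per-block crc comparison can only confirm that two dumps came from drives with the same combined offset.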

As for scratches and not being able to dump audio correctly: EAC's error detection and rereading mechanism is pretty decent and got me through a lot of scratched cd's that would be impossible to dump correctly with conventional tools like cdrwin. And don't forget that C2 pointers can also be used to detect errors (PerfectRip uses C2 to check for corruption).

themabus wrote:

ok, i resubmitted everything +2 new cds, but only the crcs for the data tracks should change
i'll keep checking every cd for that exact data sector number in gaps, not assuming 150 <- scrap that, let's do it perfect smile

whoops.. now the db is a mess sad I wish you could have just edited the old entries.. Dremora should definitely give you moderator rights and perhaps remove -v- and cHr from this status, because they're never around (anymore)

684

(55 replies, posted in General discussion)

There is no such thing as "intelligent checksums". There has to be a standard for how reference checksums are calculated before you can disregard the data offset. We prefer to take both the read and the write offset into account when determining the reference, allowing audio tracks (when saved using the standard) to have identical checksums across different regions/games, rather than just checking data integrity the way you are planning to do. It makes me wonder if the speed benefit will really be that great, because even a minor scratch on any of your cd's will give you problems dumping and verifying them the 'fast' way. Sooner or later you will probably end up using EAC after all. Anyway, good luck with your projects.

As for GoodGen, I like No-Intro's dat better, because it's more accurate. And I'm not even talking about MAME and how they want to preserve the Sega Genesis roms (splitting the data into separate files exactly as they are stored on the actual rom chips). Most people also consider this 'pointless' while others don't (see the resemblance?).

Have you set it to Action > 'Append Gaps To Next Track'? And make sure you always do Action > 'Detect Gaps' after inserting a new disc

It shows a gap of 0 seconds instead of 2 for the audio track on the screenshot, so maybe it failed to detect.. Other discs shouldn't have this problem

Are you using the latest version of EAC?

686

(55 replies, posted in General discussion)

Also, the replies I got from the guys at redump.org leads me to the conclusion that it is simply too much hassle to go after "perfect" dumps. There is no such thing as a perfect dump.

Taking Eidolon's quote from his forums, I think he misinterpreted my post. We DO believe that our method creates perfect dumps. Perfect in the sense that every single byte of data on the cd is secured (except the lead-in, lead-out and subchannel data, which are irrelevant).

To sum up the differences again between our method and the TOSEC one:

- TOSEC extracts the cooked 2048 data track from a raw image, dropping 304 bytes of information from each data sector that are on the cd (many systems, like PSX, require the complete main channel - all 2352 bytes/sector - to be ripped; there's no real point in having a 2048 bytes/sector data track for preservation, regardless of what other people say.. audio is 2352 bytes/sector and so is data).
- TOSEC leaves out the track02 pregap, because older EAC versions were not capable of dumping it. This pregap can contain relevant data, which is then left out of the resulting dump.
- TOSEC adds a 44-byte RIFF (wav) header to each audio track that isn't on the cd.
- TOSEC corrects the read offset only. Redump corrects both the read and the write offset, allowing audio tracks to be put back into the position they were in BEFORE manufacturing. This has proven to be a more sensible way of dealing with audio than just correcting the read offset, because after read+write offset correction we have audio tracks that start exactly at the intended position (often right after the pregap, at byte 352800 for instance) and that have matching checksums across different regions for discs with different write offsets (these tracks would have different checksums using the TOSEC method). Some examples: http://redump.org/disc/1777/ http://redump.org/disc/447/ , all Mortal Kombat Trilogy versions, and lots more.
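The write offset correction in the last point boils down to shifting the audio data back by the offset. A hedged Python sketch (the helper name is made up; one 16-bit stereo sample = 4 bytes, and the byte position 352800 mentioned above is 150 pregap sectors x 2352 bytes/sector):

```python
SAMPLE_BYTES = 4  # 16-bit stereo: 2 bytes x 2 channels

def correct_write_offset(audio: bytes, offset_samples: int) -> bytes:
    """Shift audio data back to its pre-manufacturing position,
    keeping the total length unchanged."""
    shift = offset_samples * SAMPLE_BYTES
    if shift > 0:
        # positive offset: data was written late, so drop the leading
        # bytes and pad the end with silence
        return audio[shift:] + b"\x00" * shift
    if shift < 0:
        # negative offset: data was written early, so pad the start
        # and drop the tail
        return b"\x00" * -shift + audio[:shift]
    return audio
```

The shifted-out bytes are normally silence (or belong to the neighbouring track, which is why the correction is applied to the combined image rather than per track).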

If you still think it's too much work for you (even with PerfectRip), then that's alright. I just wanted to explain why we believe that our method IS perfect. Perfect in the sense that the full contents of the cd are preserved in the best possible way. There is still room for improvement in the speed and difficulty of the dumping process, but the output that is achieved is final.

687

(55 replies, posted in General discussion)

Welcome

We've tried working together with them in the past, because we too felt it would be better to have one database for all dumps. TOSEC, however, didn't want to give up on their dumping method, because redumping all their discs using our new method (raw data tracks, write offset correction on audio, no riff headers added to audio) wouldn't be possible.. (we were told that a lot of the discs that were dumped could not be redumped anymore because they were already sold again etc..) Also, there have been some differences of opinion on the best way to dump discs, which led to some arguments between certain members.

With that said, a decent number of people who dumped for TOSEC have also dumped PSX games for our project. We've added new systems over the past months not to try and replicate what TOSEC is doing, but because of demand from people who wanted to dump their discs with what they believe is the best method. This is because CD dumps from TOSEC (except the data-only ones like 3DO, for which the error correction data can be generated) sometimes have missing samples in the first and the last audio tracks (which couldn't be dumped due to EAC limitations and the lack of write offset correction.. TOSEC doesn't think of this as a big deal, but we do, because we want perfect 1:1 dumps, even if it's just a few bytes).

I'd still recommend people with Dreamcast discs to team up with TOSEC for that system, because they're better organized there (we have 4 dumps and they have hundreds). The only difference for that system is the write offset correction, which in the end only comes down to some shifted bytes.

As for automated tools, the only one we have so far is PerfectRip. The version that we are testing was abandoned, and the new version won't be out for approx. 6 months. Anyway, the version that we are testing works pretty well on Plextor drives, so if you have a Plextor drive I could send you a test version (unfortunately other brands don't seem to work that well).

It's not much use comparing the two dumping methods (Redump vs TOSEC), because (as harsh as it may sound) we consider their method to be obsolete. We started out with a method that was almost identical to TOSEC's. Then, soon after, we discovered how to detect the write offset of a disc and how to correct it. The current method is final, even though there's still room for improvement in the speed area. Once you get used to it, each disc only takes a couple of minutes to dump (unlike EAC, PerfectRip can automatically rip data and audio tracks with the correct gaps).

ps. for SegaCD dumps we work together with the no-intro project - http://gbadat.altervista.org/

ps.2 A google search for redump.org brought up this topic: http://forums.segaxtreme.net/showthread … amp;page=2 so I assume that's where you came from smile if you have any more questions let us know

688

(4 replies, posted in Fixes & additions)

gigadeath wrote:
pepsidrinker wrote:

Sorry, these games I dumped before I knew about cue and mode1 and that, sorry. Please change it from Mode 2 to Mode 1, thanks again I apologize.

Yeah PC games can be either Mode 1 or 2.

You have to put into the database the exact cue you get from actual CD. Did you submit the same cuesheet for every disc?

Obviously he didn't tongue

689

(4 replies, posted in Fixes & additions)

IBM PC discs aren't always Mode 1, so I hope you checked all of them

and what about this one?: http://redump.org/disc/2072/

690

(2 replies, posted in Fixes & additions)

Thanks, but the volume serial number is only needed if region, version etc are the same.. so the time I asked you for it was an exception

themabus wrote:

oh, i see smile

but can you make a suggestion, then? it's just that i thought about all of this and i guess there is no perfect way - the machine would still make a wrong assumption in some circumstances, no matter how good the algorithm is. so if they could make an interface to allow users to redefine the gap layout in PerfectRip, this would solve a lot.

the Delphi version that we are using isn't actively worked on anymore.. I'll pass it on as a suggestion for the future version

themabus wrote:
Vigi wrote:

We could always wait for the perfectrip successor and hope it will work on all drives

but i thought you're one of the developers, aren't you?

lol no way, I was just the one handing out all the betas..

We could always wait for the perfectrip successor and hope it will work on all drives

I made this green because it matches the checksum of the Return.to.Castle.Wolfenstein.Game.of.the.Year.Edition-CURE warez release..
same for soldier of fortune platinum (it matches Soldier_Of_Fortune_Platinum_Edition-iVEiSO)

http://redump.org/disc/2257/

hey, did you forget to remove the pregap at the end? because the only difference from the original version entry is 150 sectors in the data track: http://redump.org/disc/1573/

696

(1 replies, posted in Fixes & additions)

That may be.. but it has to be part of the filename, because it's different from the normal version (it's password-protected).. the edition field doesn't add anything to the filename

697

(5 replies, posted in News)

Merry Christmas everybody!

ps. the database is nearing 2000 dumps. A special thanks to all the people who made this possible. We hope 2008 will be a good year again for preservation!

698

(2 replies, posted in Fixes & additions)

I added them the way they were submitted.. if it's incorrect you can edit them yourself

ssjkakaroto wrote:

I don't know if I'm doing this correctly but here's my result:
Sector 1: 18201
Sector 0: 18200
Sector -1: 18174
Sector -2: 18173 <- The one I got with px_d8
...
Sector -140: 18010
Sector -141: Error, LBA out of range

Is there really a 2 sector difference making the factory offset 1176? And why could I go up to -140?

there are 150 pregap sectors before the start of the data track, but they aren't dumped (because they don't contain any real data and because most drives can't read the entire pregap)

was 'Apply YB scrambling' enabled in cdreader?.. and have you tried checking the offset the old way (if the disc has audio tracks) to see if it gives the same output of +1176?

I still can't be sure about the write offset, because +0 would make more sense than +1176 (is the accuraterip offset for your drive really +98?)
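For reference, the arithmetic behind the +1176 figure (a quick Python sketch; the helper name is made up): one CD sector holds 2352 bytes, and one 16-bit stereo sample is 4 bytes, so a sector holds 588 samples, and a 2-sector difference corresponds to 2 * 588 = 1176 samples.

```python
BYTES_PER_SECTOR = 2352
BYTES_PER_SAMPLE = 4  # 16-bit stereo
SAMPLES_PER_SECTOR = BYTES_PER_SECTOR // BYTES_PER_SAMPLE  # 588

def sectors_to_samples(sectors: int) -> int:
    """Convert a sector-level difference into a sample offset."""
    return sectors * SAMPLES_PER_SECTOR
```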

ssjkakaroto wrote:

Thanks Vigi, I don't know why I multiplied 24*8 instead of 16 hmm
But what about that 18173 thing that themabus mentioned?

You'll have to see for yourself if there's really a 2-sector difference with your drive+disc:

Vigi wrote:

The best way to figure out the sector correction for such discs is by using Truong's cdreader tool: http://www.cdtool.pwp.blueyonder.co.uk/ … 1_2b20.zip .

Use 'View Sectors' to go to the first sector of the data track, then enable the 'Apply YB scrambling' box. This will scramble the header (the sync/header is now the same as the px_d8 output). Then you can determine the offset in sectors by looking for the sector with the same sync/header in cdreader.