2,701

Thanks, it works! https://drive.google.com/file/d/1yv3qtg … sp=sharing

All my posts and submission data are released into Public Domain / CC0.

2,702

reentrant wrote:

Anyway, I have some good news. After some coding and experimenting with the F1 command, I can enable reading more sectors from the cache.

Great work!!

Added your code and made some changes. Please test. http://www.mediafire.com/file/eq80y20l9 … st.7z/file

2,703 (edited by reentrant 2020-11-07 01:53:52)

Something is not right. Cache:

19 Cache LBA 325224 (MSF 72:18:24) MODE 01
20 Cache LBA 325225 (MSF 72:18:25) MODE 01
21 Cache LBA 325226 (MSF 72:18:26) MODE 01
22 Cache LBA 325227 (MSF 72:18:27) MODE 01
23 Cache LBA 325228 (MSF 72:18:28) MODE 01 Lead-out
24 Cache LBA 325229 (MSF 72:18:29) MODE 01 Lead-out
25 Cache LBA 325230 (MSF 72:18:30) MODE 01 Lead-out
26 Cache LBA 325231 (MSF 72:18:31) MODE 01 Lead-out
27 Cache LBA 325232 (MSF 72:18:32) MODE 01 Lead-out
28 Cache LBA 325233 (MSF 72:18:33) MODE 01 Lead-out
29 Cache LBA 325234 (MSF 72:18:34) MODE 01 Lead-out

Image has following LBAs:
LBA[325222, 0x4f666], MSF[72:18:22], mode 1
LBA[325223, 0x4f667], MSF[72:18:23], mode 1
LBA[325224, 0x4f668], MSF[72:18:24], mode 1
LBA[325225, 0x4f669], MSF[72:18:25], mode 1 User data vs. ecc/edc doesn't match
LBA[325228, 0x4f66c], MSF[72:18:28], mode 1
LBA[325229, 0x4f66d], MSF[72:18:29], mode 1

325226 and 325227 are missing.
325228 and 325229 are lead-out sectors.

It looks like data is copied from the wrong cache sectors.

2,704

reentrant wrote:

It looks like data is copied from the wrong cache sectors.

I forgot to set the offset.
http://www.mediafire.com/file/eq80y20l9 … st.7z/file

2,705

Is this some kind of mastering error? I think we've seen it a couple of times before with CD-i.

logs: https://drive.google.com/file/d/1V_C0e_ … sp=sharing

LBA[136249, 0x21439]: Track[01]: This sector is data, but sync is invalid
========== LBA[136249, 0x21439]: Main Channel ==========
       +0 +1 +2 +3 +4 +5 +6 +7  +8 +9 +A +B +C +D +E +F
0000 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
LBA[136250, 0x2143a]: Track[01]: This sector is data, but sync is invalid
========== LBA[136250, 0x2143a]: Main Channel ==========
       +0 +1 +2 +3 +4 +5 +6 +7  +8 +9 +A +B +C +D +E +F
0000 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
LBA[136251, 0x2143b]: Track[01]: This sector is data, but sync is invalid
========== LBA[136251, 0x2143b]: Main Channel ==========
       +0 +1 +2 +3 +4 +5 +6 +7  +8 +9 +A +B +C +D +E +F
0000 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
LBA[136252, 0x2143c]: Track[01]: This sector is data, but sync is invalid
========== LBA[136252, 0x2143c]: Main Channel ==========
       +0 +1 +2 +3 +4 +5 +6 +7  +8 +9 +A +B +C +D +E +F
0000 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
LBA[136253, 0x2143d]: Track[01]: This sector is data, but sync is invalid
========== LBA[136253, 0x2143d]: Main Channel ==========
       +0 +1 +2 +3 +4 +5 +6 +7  +8 +9 +A +B +C +D +E +F
0000 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
LBA[136254, 0x2143e]: Track[01]: This sector is data, but sync is invalid
========== LBA[136254, 0x2143e]: Main Channel ==========
       +0 +1 +2 +3 +4 +5 +6 +7  +8 +9 +A +B +C +D +E +F
0000 : 00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00   ................
LBA[136249, 0x21439]: Track[01]: Invalid sync. Skip descrambling
========== LBA[136249, 0x21439]: Main Channel ==========
       +0 +1 +2 +3 +4 +5 +6 +7  +8 +9 +A +B +C +D +E +F
0000 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
0010 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
0020 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
0030 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
0040 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
0050 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20                   
0060 : 20 20 20 20 20 20 20 20  20 20 20 20 20 20 20 20     ...

2,706

user7 wrote:

Is this some kind of mastering error?

Maybe so.

2,707

sarami wrote:
reentrant wrote:

Anyway, I have some good news. After some coding and experimenting with the F1 command, I can enable reading more sectors from the cache.

Great work!!

Added your code and made some changes. Please test. http://www.mediafire.com/file/eq80y20l9 … st.7z/file

I'm not a coder, but it seems like you are trying to combine sectors from cache reads based on the assumption that they are all data sectors with correct sync/header? How will it handle audio sectors or mastering errors safely?

2,708

Third time's the charm? Quake101 https://mega.nz/file/fqgXTA4D#MH0KrCvhl … 2ab9rjcR5I

2,709

sarami wrote:
reentrant wrote:

It looks like data is copied from the wrong cache sectors.

I forgot to set the offset.
http://www.mediafire.com/file/eq80y20l9 … st.7z/file


Still something not right:

LBA[325218, 0x4f662], MSF[72:18:18], mode 1
LBA[325219, 0x4f663], MSF[72:18:19], mode 1
LBA[325220, 0x4f664], MSF[72:18:20], mode 1
LBA[325221, 0x4f665], MSF[72:18:21], mode 1
LBA[325222, 0x4f666], MSF[72:18:22], mode 1
LBA[325223, 0x4f667], MSF[72:18:23], mode 1 User data vs. ecc/edc doesn't match
LBA[325226, 0x4f66a], MSF[72:18:26], mode 1
LBA[325227, 0x4f66b], MSF[72:18:27], mode 1
LBA[325228, 0x4f66c], MSF[72:18:28], mode 1
LBA[325229, 0x4f66d], MSF[72:18:29], mode 1
[ERROR] Number of sector(s) where user data doesn't match the expected ECC/EDC: 1

2,710

reentrant wrote:

Still something not right:

Sorry, I couldn't test well because I thought I didn't have a big-offset disc, but I found one in my room (Free Software Collection 6 [FMT]).
Please wait until I fix the bug.

OK. A small suggestion: it's better to read two sectors from the cache on each F1 call and scan only the first 0x900 bytes for the 0x00 0xFF .. 0xFF 0x00 sync pattern, because there could be a case where the sector header starts at offset 0x8FF. If you read only 0x900 bytes, such a header won't be found.
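
The idea can be sketched as follows: keep two sectors' worth of cache data in the buffer but only scan the first chunk's offsets, so a sync that begins on the last scanned offset still has its remaining bytes in range. This is a minimal sketch, not DIC's actual code; `FindSyncOffset` is a hypothetical helper.

```cpp
#include <cstdint>
#include <cstddef>

// The 12-byte sync pattern that opens every raw CD data sector:
// 0x00, then ten 0xFF bytes, then 0x00.
static const uint8_t kSync[12] = {
    0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF,
    0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00
};

// Scan offsets [0, scanLen) of buf for the sync pattern.
// buf must hold at least scanLen + 12 bytes, so a sync that
// begins on the last scanned offset is still fully in the buffer.
// Returns the offset of the first match, or -1 if none found.
static int FindSyncOffset(const uint8_t* buf, size_t scanLen) {
    for (size_t x = 0; x < scanLen; ++x) {
        bool match = true;
        for (size_t i = 0; i < sizeof(kSync); ++i) {
            if (buf[x + i] != kSync[i]) { match = false; break; }
        }
        if (match) return (int)x;
    }
    return -1;
}
```

With two 0x900-byte chunks in the buffer, scanning offsets 0..0x8FF still finds a header that starts at 0x8FF, because its remaining 11 bytes fall inside the second chunk.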

2,712

Uploaded.
http://www.mediafire.com/file/eq80y20l9 … st.7z/file

reentrant wrote:

A small suggestion

Added.

Jackal wrote:

will it handle audio sectors

Added.

Jackal wrote:

mastering errors safely

I haven't tested it yet.

Agent47 wrote:

Third time's the charm?

Thanks. It works as I expected.

2,713 (edited by reentrant 2020-11-08 11:34:18)

Still something not right. Under the debugger I see that ReadCDForCheckingReadInOut returned FALSE and an exception was thrown. Only the cache was printed to the console.

2,714 (edited by Parotaku 2020-11-08 11:43:25)

Hello sarami

For some time now, I've noticed that when I dump something, using the same drive as before and recent DIC releases, everything works OK, but after the dump is over (the 'success' tune has played) I get:

All submission information gathered!
Error! Please check output directory as dump may be incomplete!

and the !submissionInfo.txt log is missing...

I don't understand why, as everything looks OK...
Here's a test dump of 'Castlevania Chronicle' for the JP PS1 (but I get the same error whatever format I dump)...
https://www.dropbox.com/s/5mg4gl1fllyyu … LA.7z?dl=0

Could you have a look and see if you can find the cause of that error message?

2,715 (edited by sarami 2020-11-08 12:59:33)

reentrant wrote:

Still sth not right.

Updated the test branch on GitHub. Please debug.

Parotaku wrote:

Could you have a look and see if you can find the cause of that error message?

It seems there is no error.

2,716 (edited by reentrant 2020-11-08 14:17:04)

The problem is here:

INT tmpLBA = MSFtoLBA(md, sd, fd) - 150;
if (tmpLBA == nLineNum ||
    (pDisc->SCSI.toc.TrackData[0].Control & AUDIO_DATA_TRACK) == 0) {

tmpLBA = 325226
nLineNum = 21

In the case of a data track, the code never goes inside the 'if'.

And in my case nLBA is always bigger than tmpLBA by 2.

It looks like the data in the cache has some offset.

I hacked it this way:

if (nLBA >= pDisc->SCSI.nAllLength) {
        for (DWORD x = 0; x < CD_RAW_SECTOR_SIZE * 16 - 16; ++x) {
            if (IsValidMainDataHeader(aMainBuf + x)) {
                bHeader = TRUE;
                BYTE m = (BYTE)(aMainBuf[x + 0x0c] ^ 0x01);
                BYTE s = (BYTE)(aMainBuf[x + 0x0d] ^ 0x80);
                BYTE f = (BYTE)(aMainBuf[x + 0x0e]);
                BYTE mode = (BYTE)(aMainBuf[x + 0xf] ^ 0x60);

                BYTE md = BcdToDec(m);
                BYTE sd = BcdToDec(s);
                BYTE fd = BcdToDec(f);
                INT tmpLBA = MSFtoLBA(md, sd, fd) - 150;
                if (tmpLBA == nLBA ||
                    (pDisc->SCSI.toc.TrackData[0].Control & AUDIO_DATA_TRACK) == 0) {
                    OutputLog(standardOut | fileDisc,
                        "-----------------------------------------------------\n"
                        "Cache SIZE: %u (This size is different every running)\n"
                        "-----------------------------------------------------\n"
                        , nLineNum
                    );
                    *lpbCached = TRUE;

                    break;
                }
            }
        }
    }

I read 16 sectors from the cache. On the 3rd iteration (offset 2) it entered the 'if'.

EDIT:
For audio, maybe parsing the subchannels would be better?
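
The header-to-LBA conversion used in the snippet (XOR the scrambled header bytes with 0x01/0x80 as shown, BCD-decode, then convert MSF to LBA and subtract the 150-sector pregap) can be checked in isolation. This is a minimal sketch; `BcdToDec` and `MSFtoLBA` here are stand-ins for DIC's helpers of the same name.

```cpp
#include <cstdint>

// Decode a BCD-coded byte, e.g. 0x72 -> 72.
static uint8_t BcdToDec(uint8_t b) {
    return (uint8_t)((b >> 4) * 10 + (b & 0x0f));
}

// MSF -> absolute LBA: 60 seconds per minute, 75 frames per second.
// Subtracting 150 afterwards removes the 2-second (150-sector) pregap.
static int MSFtoLBA(uint8_t m, uint8_t s, uint8_t f) {
    return ((int)m * 60 + s) * 75 + f;
}
```

For the sector in question, MSF 72:18:26 gives ((72 * 60) + 18) * 75 + 26 = 325376, and 325376 - 150 = 325226, exactly the tmpLBA seen in the debugger.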

2,717 (edited by Parotaku 2020-11-08 14:44:56)

sarami wrote:

It seems there is no error.

Hmm, strange...
Do you know of anything which could prevent !submissionInfo.txt from being created?

2,718

Parotaku wrote:

It seems there is no error.

Hmm, strange...
Do you know of anything which could prevent !submissionInfo.txt from being created?

Sarami does not make DICUI, only DIC.


2,719

reentrant wrote:

I hacked it this way:

Btw, when sync is detected, what is the value of 'x'?

2,720

x = 6468 (combined offset in samples * 4)

========== Offset (Drive offset refers to http://www.accuraterip.com) ==========
     Combined Offset(Byte)   6468, (Samples)  1617
    -   Drive Offset(Byte)     24, (Samples)     6
    ----------------------------------------------
           CD Offset(Byte)   6444, (Samples)  1611
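
The relationship between those numbers is simple arithmetic: one CD audio sample is 2 channels of 16 bits (4 bytes), and the CD (mastering) offset is the combined offset minus the drive's own read offset. A sketch of the bookkeeping, not DIC code:

```cpp
// One CD audio sample = 2 channels x 16 bits = 4 bytes.
static int SamplesToBytes(int samples) { return samples * 4; }

// Combined offset = drive offset + CD (mastering) offset,
// so the CD offset is the combined offset minus the drive offset.
static int CdOffsetSamples(int combined, int drive) { return combined - drive; }
```

So x = 6468 bytes is 1617 samples, of which 6 come from the drive and 1611 from the disc's mastering.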

2,721

Uploaded. http://www.mediafire.com/file/eq80y20l9 … st.7z/file
And pushed the source to my GitHub test branch.

2,722 (edited by reentrant 2020-11-09 19:12:10)

Wow, it works nicely now.

Offsets: 2000+
Data track: ok
Audio track: ok

I don't have a disc with a big offset and a mastering error, so I cannot test. Is there a way to search the comments for discs with big offsets and mastering errors?

http://redump.org/disc/27483/ - Nexy disc. Maybe he can check it?

2,723 (edited by ehw 2020-11-11 06:16:26)

Hello sarami,

I'm having issues dumping a PS1 proto burnt on a Philips CD-R from the '90s. I keep getting C2 errors in pretty much the same area no matter which drive I try, somewhere toward the edge of the disc. I was able to create a dump using CloneCD, which doesn't report issues with the recommended settings. I dumped the disc twice with Redump's old recommended CloneCD profile and the .img has the same checksum every time.

I was also somehow able to make a dump with DIC on my LG WH14NS40 flashed with ASUS BW-16D1HT 3.02 firmware, but the checksums on the .img didn't match CloneCD's. There were definitely many errors reported by DIC with the LG drive. The surface of the disc is spotless with very little on it.

I have another Philips CD-R with a slightly newer build of the same game (I believe it matches the PAL final exactly) and it also produces C2 errors. This kind of makes me believe this could be some kind of mastering issue?

Here are some logs:

PX-4012:
http://www.mediafire.com/file/pwp45r62g … ).zip/file

PX-760a:
http://www.mediafire.com/file/smadk5sts … ).zip/file

WH14NS40 (flashed with ASUS):
http://www.mediafire.com/file/3hnb026uc … ).zip/file

I used DIC version 20200921 via the latest DICUI. I tried setting the C2 reread to 1000 and it still couldn't get past the first C2 error.

By the way, I've been meaning to ask: are there recommended command parameters/settings for handling CD-Rs? CD-Rs in general are always tricky to dump with DIC, and I was curious whether you could recommend a setting that would work for most of them.

2,724

ehw wrote:

are there recommended command parameters/settings to handle CD-Rs?

No.

2,725 (edited by user7 2020-11-13 06:04:25)

Try dumping at 8x speed.
If CloneCD is outputting a reliable hash, it may be a DIC bug. Then again, CloneCD could be outputting the same errors over and over; it's worth checking with edccchk.
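
edccchk works by recomputing each sector's EDC checksum and comparing it against the value stored in the sector. For a Mode 1 sector the EDC is a 32-bit CRC over bytes 0..2063 (sync + header + user data), stored little-endian at offset 2064. A bit-by-bit sketch; the reflected polynomial 0xD8018001 is the form commonly used in ECC/EDC tools, assumed here rather than taken from edccchk's source:

```cpp
#include <cstdint>
#include <cstddef>

// Compute the CD EDC checksum over len bytes.
// For a Mode 1 sector, pass the first 2064 bytes of the raw sector
// (sync + header + 2048 data bytes); the expected value is stored
// little-endian at offset 2064.
static uint32_t EdcCompute(const uint8_t* src, size_t len) {
    uint32_t edc = 0;
    while (len--) {
        edc ^= *src++;
        for (int i = 0; i < 8; ++i) {
            // Reflected CRC step: shift right, XOR polynomial on carry-out.
            edc = (edc >> 1) ^ ((edc & 1) ? 0xD8018001u : 0u);
        }
    }
    return edc;
}
```

Note that a sector of all zero bytes has an EDC of 0, so zero-filled dummy sectors pass a naive EDC check; that's worth keeping in mind when judging whether CloneCD's stable hash actually means the data is good.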
