2,976

Hi Sarami.

Now that Unlicensed PS2 is working properly, I tried to verify this entry:
http://redump.org/disc/74019/

SuperG's dump: error count = 1036
My verification: error count = 518

1036 / 518 = 2. Is it double-counting the errors?


Logs:
https://mega.nz/file/2otxmSgC#cYpElrYYg … ccd3XfyXMo

2,977 (edited by user7 2021-08-22 03:27:28)

Is there any way to prevent DIC from hashing .scm and .img to save time? Is it necessary?

Thanks <3<3

All my posts and submission data are released into Public Domain / CC0.

2,978

user7 wrote:

Is there any way to prevent DIC from hashing .scm and .img to save time? Is it necessary?

Thanks <3<3

Agreed. This would be awesome for those of us stuck dumping with older hardware.

2,979 (edited by user7 2021-08-22 05:18:08)

sarami, is it possible to make the latest DIC test build a static link on GitHub (not Mediafire)? Darksabre wants to implement an "Update to latest test build" button in MPF to encourage more people to give you timely feedback, and having a static link directly to the latest test build file would allow him to do that. Many people still dump with old DIC builds in DICUI/MPF, and this would make it more likely that new dumps are made with the latest DIC.

All my posts and submission data are released into Public Domain / CC0.

2,980

wiggy2k wrote:
user7 wrote:

Is there any way to prevent DIC from hashing .scm and .img to save time? Is it necessary?

Thanks <3<3

Agreed. This would be awesome for those of us stuck dumping with older hardware.

http://forum.redump.org/post/57032/#p57032

user7 wrote:

is it possible to make the latest dic test build a static link on GitHub (not Mediafire)?

Why is the Mediafire link bad?

2,981 (edited by user7 2021-08-22 16:08:05)

sarami wrote:
wiggy2k wrote:
user7 wrote:

Is there any way to prevent DIC from hashing .scm and .img to save time? Is it necessary?

Thanks <3<3

Agreed. This would be awesome for those of us stuck dumping with older hardware.

http://forum.redump.org/post/57032/#p57032

I hereby veto Fireball. It's just a waste of time for anyone dumping a lot of discs, and it wasn't worth adding just because someone thought the feature would be neat, imo.


sarami wrote:
user7 wrote:

is it possible to make the latest dic test build a static link on GitHub (not Mediafire)?

Why is the Mediafire link bad?

Because it requires going through a website interface rather than a direct file link that could be pulled down by an automated process. MPF could add a "push button to automatically update to the latest test build" feature; then more people could use your test version and give you feedback.

Please please tyty <3 <3

All my posts and submission data are released into Public Domain / CC0.

2,982

Agreed re: hashing .scm/.img: we don't store those hashes on the site; they are pretty much useless and just a waste of time.

PX-4824TA (offset +98), PX-755SA (offset +30), ASUS BW-16D1HT (offset +6)

2,983

user7 wrote:

it requires going through a website interface rather than a direct file link that could be pulled down by an automated process.

I tried AppVeyor, but I have no idea how to get the build to resolve the external library dependencies.

2,984

sarami, DiscImageCreator has problems properly dumping a Hasbro VideoNow disc.

As noted by F1ReB4LL, it's supposed to have 18032 bytes of zeroes + 71808 bytes that were cut at the start of the first track. This seems to be a DIC problem.

My logs are below.

https://cdn.discordapp.com/attachments/ … s_logs.zip

My Drives: Plextor PX-W5224TA (white, with Syba 5.25" HDD enclosure), ASUS BW-16D1HT (with OWC Mercury Pro enclosure), Plextor Premium-U, Plextor PX-760A (with ByteCC enclosure), Lite-On LH-20A1HX, Kreon TS-H352C, Plextor PX-891SAF, Wii, Wii U

Other Hardware: JVC HR-XVC27U (DVD/VHS VCR), Sony CFD-S50 (CD/Cassette/Radio Boombox), Canon LiDE110 (Scanner)

2,985

user7 wrote:

Is there any way to prevent DIC from hashing .scm and .img to save time? Is it necessary?

Thanks <3<3

Absolutely necessary. Get a new PC if yours can't do fast calculations. Since we're using the descrambled images for redump checksums, the .scm checksum is the only evidence of the original data. And the .img checksum is important to compare with the calculated-by-the-site one + needed for the quick db-vs-log verifications.

2,986

F1ReB4LL wrote:

Since we're using the descrambled images for redump checksums, the .scm checksum is the only evidence of the original data.

Do we need such evidence? Besides, the descrambling process is 100% reversible and independent of actually reading the data from the drive.
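
For reference, CD-ROM scrambling is just an XOR of bytes 12-2351 of every 2352-byte sector with a fixed bit stream generated by the ECMA-130 Annex B shift register, so applying the same operation twice gives back the original data. A minimal sketch of the idea (the function names are made up, and it ignores the per-sector decisions a real dumper has to make):

#include <cstdint>
#include <cstddef>

// Build the 2340-byte scrambling sequence from the ECMA-130 Annex B LFSR
// (polynomial x^15 + x + 1, seeded with 0x0001, output bits taken LSB-first).
static void BuildScrambleTable(uint8_t table[2340])
{
    uint16_t lfsr = 0x0001;
    for (size_t i = 0; i < 2340; ++i) {
        uint8_t b = 0;
        for (int bit = 0; bit < 8; ++bit) {
            b |= (uint8_t)((lfsr & 1) << bit);
            uint16_t fb = (lfsr ^ (lfsr >> 1)) & 1;       // feedback = bit0 XOR bit1
            lfsr = (uint16_t)((lfsr >> 1) | (fb << 14));
        }
        table[i] = b;
    }
}

// XOR bytes 12..2351 of a raw sector with the table; the 12-byte sync pattern
// is untouched. Running it twice restores the input, so the same routine
// scrambles and descrambles.
static void ScrambleSector(uint8_t sector[2352], const uint8_t table[2340])
{
    for (size_t i = 0; i < 2340; ++i)
        sector[12 + i] ^= table[i];
}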

F1ReB4LL wrote:

And the .img checksum is important to compare with the calculated-by-the-site one + needed for the quick db-vs-log verifications.

Then just calculate the CRC32 (the same way that the site does) and skip the useless MD5 and SHA1.
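
If it helps, the site's checksum is the standard zlib/IEEE CRC-32 (reflected polynomial 0xEDB88320, initial value and final XOR 0xFFFFFFFF). A minimal bitwise sketch just to show how small it is - a table-driven version would be used for speed in practice, and the function name is made up:

#include <cstdint>
#include <cstddef>

// Standard zlib/IEEE CRC-32. Pass the previous return value back in as 'crc'
// to hash a file incrementally, chunk by chunk.
static uint32_t Crc32(const uint8_t* data, size_t len, uint32_t crc = 0)
{
    crc = ~crc;
    for (size_t i = 0; i < len; ++i) {
        crc ^= data[i];
        for (int bit = 0; bit < 8; ++bit)
            crc = (crc >> 1) ^ ((crc & 1) ? 0xEDB88320u : 0u);
    }
    return ~crc;
}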

PX-4824TA (offset +98), PX-755SA (offset +30), ASUS BW-16D1HT (offset +6)

2,987 (edited by user7 2021-08-23 05:37:55)

F1ReB4LL wrote:

Absolutely necessary. Get a new PC if yours can't do fast calculations.

Says the guy who has undumped discs marked on the wiki for years...

No sympathy for people that dump stacks of discs at a time I see. I've got a good build bro, thanks.

All my posts and submission data are released into Public Domain / CC0.

2,988

Larsenv wrote:

sarami, DiscImageCreator has problems properly dumping a Hasbro VideoNow disc.

As noted by F1ReB4LL, it's supposed to have 18032 bytes of zeroes + 71808 bytes that were cut at the start of the first track. This seems to be a DIC problem.

Use this.

        /vn     Search specific bytes
                        For VideoNow
                        val     Combined offset is shifted for negative direction if positive value is set
        /vnc    Search specific bytes
                        For VideoNow Color
        /vnx    Search specific bytes
                        For VideoNow XP

2,989

user7 wrote:

Says the guy who has undumped discs marked on the wiki for years...

Huh? I dumped the "stacked" Saturn discs years ago. And what's the connection between "stacking" discs and the importance of having the scrambled checksums? We have lots of discs descrambled in unusual ways and lots of discs with mastering errors; if they don't have logs with .scm checksums, they will be nuked sooner or later.

user7 wrote:

I've got a good build bro, thanks.

I will personally nuke all the dumps made with unofficial/hacked tools; we don't need them. We don't accept dumps from non-Plextor drives, so why should we accept dumps from some shady builds?

What's the problem with making a 4 GB RAM disk and doing all the dumping there? All the calculations would be practically instant.

2,990

F1ReB4LL wrote:
user7 wrote:

Says the guy who has undumped discs marked on the wiki for years...

Huh? I dumped the "stacked" Saturn discs years ago. And what's the connection between "stacking" discs and the importance of having the scrambled checksums? We have lots of discs descrambled in unusual ways and lots of discs with mastering errors; if they don't have logs with .scm checksums, they will be nuked sooner or later.

You don't dump large amounts of discs. I don't think you even dump discs anymore, bro. So yeah, just because you have a neurosis doesn't mean other people want their time wasted.

F1ReB4LL wrote:
user7 wrote:

I've got a good build bro, thanks.

I will personally nuke all the dumps made with unofficial/hacked tools, we don't need them. We don't accept the dumps from non-plextor drives, why should we accept dumps from some shady builds?

What's the problem to make a 4GB RAMdisk and do all the dumping stuff there? All the calculations will be instant.

Go for it. You're threatening years of my work over a shit fit; I'm already done here.

All my posts and submission data are released into Public Domain / CC0.

2,991

Come on guys. It's as simple as implementing some command-line switch to perform the extra hash calculations...

2,992

reentrant wrote:

Come on guys. It's as simple as implementing some command-line switch to perform the extra hash calculations...

I've already explained that those hashes are important and shouldn't be disabled. DIC takes up to 15-20 minutes to dump a disc at 4x, while the hash calculation takes maybe a minute (when reading the data from an HDD; from a RAM disk it should be faster). If "the years of your work" don't deserve that minute, better not to do any dumping at all.

2,993 (edited by sadikyo 2021-08-23 21:20:06)

F1ReB4LL wrote:

I will personally nuke all the dumps made with unofficial/hacked tools, we don't need them. We don't accept the dumps from non-plextor drives, why should we accept dumps from some shady builds?

I just want to add my 2 cents here. Please be aware of how this comes across. I understand we have to have certain standards, such as the Plextor requirement for standard submissions, but there are ALWAYS going to be special exception cases, or types of discs that can't be dumped via the normal methods. And if we put a blanket block on any such submission, a lot of content will be missing.

I'm not sure if that was your intention, but the statement does come across as a bit of a threat to nuke large batches of content. I'm going to assume that is not the actual intention. We certainly wouldn't make a decision that large without a lot of discussion and hashing out the details.

I think both you and user7 have made plenty of great contributions to this project over the years. Let's not let a debate over one issue cause any actual contention.

Oh, btw... I think his comment about a good build was talking about his computer, not a build of DIC.

2,994 (edited by hiker13526 2021-08-23 19:58:58)

If the descrambling isn't reversible for discs with mastering errors or unusual scrambling, won't it have to be redumped regardless of what the log shows?

If the .scm hash is the One True Hash, why not store it in the database?

Plextor PX-708A | sarami's DiscImageCreator | CloneCD | CDManipulator | Protection ID v6.8.5 | edccchk v1.26
PVD_Dumper.py

2,995 (edited by bsbt 2021-08-23 20:59:32)

The hashing phase is slow at least in part due to it only reading 2K at a time from the file. There's even a comment to that effect.

https://github.com/saramibreak/DiscImag … l.cpp#L557

Changing it to something like this (reading 512K at a time) should be noticeably quicker (from my experience writing something similar in C#; I haven't tested this change in DIC):

#define CHUNK_SIZE 524288

        // 512 KiB buffer (note: this lives on the stack; a heap allocation may be safer)
        BYTE data[CHUNK_SIZE] = {};
        UINT64 ui64Remaining = ui64FileSize;
        OutputString("Hashing: %s\n", pszFnameAndExt);
        while (ui64Remaining > 0) {
            UINT uiChunkSize = ui64Remaining < CHUNK_SIZE ? (UINT)ui64Remaining : CHUNK_SIZE;
            if (fread(data, sizeof(BYTE), uiChunkSize, fp) < uiChunkSize) {
                OutputErrorString("Failed to read: read size %u [F:%s][L:%d]\n", uiChunkSize, _T(__FUNCTION__), __LINE__);
                return FALSE;
            }
            nRet = CalcHash(&crc32, &context, &sha, data, uiChunkSize);
            if (!nRet) {
                break;
            }
            ui64Remaining -= uiChunkSize;
        }

2,996

hiker13526 wrote:

If the descrambling isn't reversible for discs with mastering errors or unusual scrambling, won't it have to be redumped regardless of what the log shows?

It is reversible in normal cases, but some dumps are a mess of descrambled and scrambled data. Also, some scrambling/descrambling tools handle the data differently (themabus's descrambler adds weird headers to zeroed sectors, for example).

hiker13526 wrote:

If the .scm hash is the One True Hash, why not store it in the database?

Ask iR0b0t. The .scm contains what was read from the disc; the .img and .bin files are conversions.

bsbt wrote:

The hashing phase is slow at least in part due to it only reading 2K at a time from the file. There's even a comment to that effect.

Probably also worth not hashing .img and .bin separately for single-track dumps.

2,997

F1ReB4LL wrote:
user7 wrote:

I've got a good build bro, thanks.

I will personally nuke all the dumps made with unofficial/hacked tools, we don't need them. We don't accept the dumps from non-plextor drives, why should we accept dumps from some shady builds?

IMO, this is not acceptable behaviour from a redump mod. I would strongly recommend actually making sure you understand someone before immediately taking harsh and counterproductive actions like this. user7 was clearly referring to his PC, not a build of DIC, and if you were unsure, you should have asked him what he meant first. Even if he wasn't, you could have talked things over in a much less hostile manner.

PX-4824TA (offset +98), PX-755SA (offset +30), ASUS BW-16D1HT (offset +6)

2,998

F1ReB4LL wrote:

but some dumps are a mess of descrambled and scrambled data

I'm not clear on the benefit of the .scm hash. What's the process for discs with a mess of descrambled and scrambled data? Is the .scm hash used, or is the .bin hash used?

Plextor PX-708A | sarami's DiscImageCreator | CloneCD | CDManipulator | Protection ID v6.8.5 | edccchk v1.26
PVD_Dumper.py

2,999

RibShark wrote:

user7 clearly was refering to his PC, not a build of DIC, and if you were unsure, you should have asked him what he meant first

Then he has nothing to worry about, so what's the problem? I just warned people not to use untested/hacky builds; we have lots of problems with specific official & test versions of DIC, and we don't need additional problems from hacky ones.

RibShark wrote:

Even if he didn't, you could have talked things over in a much less hostile manner.

No hostility at all; I haven't said anything about his dumps, and the warning was neutral.

hiker13526 wrote:
F1ReB4LL wrote:

but some dumps are a mess of descrambled and scrambled data

I'm not clear on the benefit of the .scm hash. What's the process for discs with a mess of descrambled and scrambled data? Is the .scm hash used, or is the .bin hash used?

What do you mean by "used"? In the db? The db only uses cue-bin images, of course. The .scm hash is needed when/if you decide to verify some descrambled image by scrambling it back. Messy discs may give a different result, depending on the case. Also, there are known mastering errors where the scrambled CD image contains unscrambled sectors and the descrambled data contains scrambled sectors - again, if you don't have the .scm checksums and don't have the .scm image anymore, you can't really confirm the image is correct.
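
To make that concrete, re-verifying a descrambled image against a logged .scm checksum is roughly: rebuild the ECMA-130 scramble table, XOR every 2352-byte sector back, and hash the result. A rough self-contained sketch (it repeats the table and CRC-32 code from the sketches earlier in the thread, assumes every sector in the image was descrambled - real dumps leave audio and some damaged sectors scrambled - and the file name is only a placeholder):

#include <cstdint>
#include <cstdio>

// ECMA-130 Annex B scrambling sequence (x^15 + x + 1, seed 0x0001).
static void BuildScrambleTable(uint8_t table[2340])
{
    uint16_t lfsr = 0x0001;
    for (size_t i = 0; i < 2340; ++i) {
        uint8_t b = 0;
        for (int bit = 0; bit < 8; ++bit) {
            b |= (uint8_t)((lfsr & 1) << bit);
            uint16_t fb = (lfsr ^ (lfsr >> 1)) & 1;
            lfsr = (uint16_t)((lfsr >> 1) | (fb << 14));
        }
        table[i] = b;
    }
}

// Standard zlib/IEEE CRC-32, incremental.
static uint32_t Crc32(const uint8_t* p, size_t n, uint32_t crc)
{
    crc = ~crc;
    for (size_t i = 0; i < n; ++i) {
        crc ^= p[i];
        for (int b = 0; b < 8; ++b)
            crc = (crc >> 1) ^ ((crc & 1) ? 0xEDB88320u : 0u);
    }
    return ~crc;
}

int main()
{
    uint8_t table[2340];
    BuildScrambleTable(table);

    FILE* fp = fopen("image.img", "rb");   // placeholder name for the descrambled image
    if (!fp) return 1;

    uint8_t sector[2352];
    uint32_t crc = 0;
    while (fread(sector, 1, sizeof(sector), fp) == sizeof(sector)) {
        for (size_t i = 0; i < 2340; ++i)
            sector[12 + i] ^= table[i];            // scramble the sector back
        crc = Crc32(sector, sizeof(sector), crc);  // running CRC over the re-scrambled data
    }
    fclose(fp);

    // Compare this against the .scm CRC-32 recorded in the dump log.
    printf("re-scrambled CRC-32: %08X\n", (unsigned)crc);
    return 0;
}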

3,000

F1ReB4LL wrote:

What do you mean by "used"? In the db? The db only uses cue-bin images, of course. The .scm hash is needed when/if you decide to verify some descrambled image by scrambling it back. Messy discs may give a different result, depending on the case. Also, there are known mastering errors where the scrambled CD image contains unscrambled sectors and the descrambled data contains scrambled sectors - again, if you don't have the .scm checksums and don't have the .scm image anymore, you can't really confirm the image is correct.

Would you advise that, ideally, the database should store .scm hashes?

Plextor PX-708A | sarami's DiscImageCreator | CloneCD | CDManipulator | Protection ID v6.8.5 | edccchk v1.26
PVD_Dumper.py