1

(50 replies, posted in General discussion)

Well done modifying the source and making a pkg. I didn't expect anyone to modify it, otherwise I might have made it a little cleaner tongue

The PIC is a bit of a question mark. 3k3y gets 0x73 bytes, redump gets 0x84, and things can always change. The full PIC is dumped just in case we want more of it in the future.

GOTY edition is hard to categorise as it could mean many things; in my limited experience it means little more than a re-release with the latest patch, which is what I meant by it. When something gets re-released as a game + major expansion pack/major DLC combo on the same disc, then I think it should be in its own group (i.e. up to 3 groups: standalone game, expansion pack, combo). Some things will be borderline; they'll have to be discussed on a case-by-case basis.

It's all subject to opinion, and some of it will be hard to sort out (particularly PC; consoles normally play much nicer). Basically I think the golden rule should be: if a disc could go into multiple groups, put it in a new group. If we follow that we won't go far wrong.

My ideal:

  • Follow a region order but make the generic name as generic as possible (Resident Evil > Biohazard, no additional tags like region)

  • Group re-releases of the same game (GOTY, Dual Shock etc.); keep expansions (Dynasty Warriors Xtreme Legends, Empires) and collections (Deus Ex Complete) separate. At first glance it might seem like a good idea to group some of these situations, but it creates too many complications and introduces as many negatives as positives

  • Have the backend use a numeric id, with no parent as such (the dat creation code decides which release to grab the generic name from). clone-clone-clone... seems easier to implement and edit than parent-clone-clone..., and more flexible

The dat could store an id instead of a generic name. That makes post-processing with a datter mandatory for many situations, so it's not ideal, but a clone dat of any sort is good.
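To make the id idea concrete, a rough sketch of what an entry might look like (hypothetical only: standard clrmamepro-style dats use cloneof="Parent Name", and the group attribute and the id 152 below are invented to illustrate the numeric-id backend):

<game name="Biohazard (Japan)" group="152"> ... </game>
<game name="Resident Evil (USA)" group="152"> ... </game>

Every release in group 152 is an equal clone of the others; the dat creation code would pick whichever release supplies the generic name.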

4

(17 replies, posted in General discussion)

root@ps3-linux:/# mount /temp/petitboot/mnt/sda1 -o rw/remount
warning: can't open /etc/fstab: No such file or directory
mount: can't find /temp/petitboot/mnt/sda1 in /etc/fstab or /etc/mtab

If there's a problem with /etc/fstab I can't help; I'm not overly qualified with Linux. It could just be that the stick is no longer sda1, it might have changed (it can change just by unplugging and replugging). Or maybe you've typed temp when it should be tmp (mine isn't set up so I can't check).
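One more guess: the option syntax looks off too; mount expects remount,rw (comma, not slash). Something like this might work, assuming the stick really is sda1 (untested here; giving both the device and the mount point avoids the fstab lookup):

mount -o remount,rw /dev/sda1 /tmp/petitboot/mnt/sda1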

edit:

Another question that popped in my head this morning, am I able to move to a different CFW, like Rogero?

Doubt it, haven't tried. To be honest, as soon as I got it working I didn't dare mess with it further; the ps3 is on its last legs anyway and I don't want to push it over the edge.

5

(17 replies, posted in General discussion)

gamecaptor wrote:

1. Is there a way to eject the Blu-ray from the drive so I do not need to go back into the PS3 OS (I notice the eject button doesn't work in PS3 Linux).

My eject button works fine on a fat PS3. Try holding the eject button for 20 seconds; it should do an eject cycle.

gamecaptor wrote:

2. When I do a "reboot" command the PS3 complains "The system was not turned off properly the last time it was used". Is this normal?

Yes. I get that half the time, since I turn the ps3 off by holding the power button. I don't think it's anything to worry about, unless perhaps you have to reboot for every disc (I do them in batches).

gamecaptor wrote:

3. How will this be used to verify ISO dumps; Is there a way for the .dats to verify this info in ISO?

I don't understand the question, so have some scattershot answers that might help wink

  • It doesn't verify isos; it makes them useful by being able to decrypt them. The PS3 section stagnated for years because the dumps weren't usable; now, if the disc key is present, they are (decrypt and re-encrypt with PS3Dec; see the example after this list)

  • Currently the dats do not contain the key. What we have now is not the most automated way to do things, maybe something can be done in the future (like a small download with the disc key, or putting the disc keys in the dat)

  • The metadata does not have a confirmation orb on its page, so technically it does not need to be verified, although really it should be. Someone else can get the metadata with their disc to check it, or decrypt the image with the key on the redump page to check it.
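For reference, a PS3Dec decryption call looks roughly like this (quoting from memory, so double-check the argument order against the program's own usage text; the key placeholder is the 32-hex-character disc key):

PS3Dec d key <32-hex-char disc key> encrypted.iso decrypted.iso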

6

(17 replies, posted in General discussion)

Yes, you're good to go  big_smile

You'll know if something went wrong because it will say WARNING instead of SUCCESS at the end.

7

(17 replies, posted in General discussion)

Try reading this thread, ol had the same problem (the usb stick was mounted as read only): http://forum.redump.org/post/43338/#p43338

edit: And you should write ./GetKey, not GetKey. You need to execute it by path; ./ means relative to where you are now. GetKey can't be run bare like cd and ls because it's not in the PATH.
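To make the difference concrete (a minimal illustration, assuming the stick is mounted at sda1):

cd /tmp/petitboot/mnt/sda1
GetKey 3Dump.bin      # fails with "not found": the shell only searches the PATH
./GetKey 3Dump.bin    # works: ./ points at the binary in the current directory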

8

(17 replies, posted in General discussion)

When in petitboot, exit to the shell. You're now at a basic Linux command line. You need to navigate to the directory containing GetKey, which is on a usb stick. The two commands you need are ls (to list the files in the current directory) and cd (to change the current directory); read up on them if unfamiliar.
Do this:

cd /tmp/petitboot/mnt
ls

Some files are listed, one of which should be the usb stick (likely names are something like sda1, sdb1, etc). Try them all by going in and listing them until you find your stick. Then just put a disc in and do this:

./GetKey 3Dump.bin >output.txt

This dumps the disc key, disc id and pic, and puts them into the file output.txt in the same directory as GetKey. You could also grab the full pic if you'd like by doing this:

./GetKey 3Dump.bin big.pic >output.txt

output.txt is the same as before; what's new is big.pic, which is the full pic. It's not required by redump, but it doesn't take much to store it, and who knows, it may be useful later.

But before you do any of that, you need 3Dump.bin placed in the same directory as GetKey (easiest is both in the root of the usb stick); get it by running a pkg as described in the guide.
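Put together, a typical session might look like this (assuming the stick turns out to be sda1):

cd /tmp/petitboot/mnt
ls
cd sda1               # or sdb1 etc, whichever turns out to be the stick
ls                    # confirm GetKey and 3Dump.bin are both here
./GetKey 3Dump.bin big.pic >output.txt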

I'm going to be adding some to the database soon, so how do we want them named? Various sources refer to the magazine and disc in various ways:

On the disc itself in /PS3_GAME/PKGDIR/PARAM.SFO
Generic title: OPSM BD 2012 / 05 (Disc 70 / May 2012)

The on-disc logo used in the XMB
http://i45.tinypic.com/21n3zf8.png

On the disc label
Official PlayStation Magazine Issue 70 (actually it's all in caps)

On the slip cover
PlayStation Official Magazine - UK (no reference to number)

Typically I think the magazine is referred to as OPM (because OPSM is the official ps1 magazine).

My naming preference would be to include the word Issue, the disc number and the date (done in the same style as xbox and 360 demos). So, this:
"Official PlayStation Magazine Issue 70: May 2012"
Or if we want coverdiscs to have similar naming across systems:
"Official PlayStation Magazine Demo Disc 70: May 2012"

If no one responds or a decision hasn't been made by the time I come to add them, I'll do it as "Official PlayStation Magazine Issue 70: May 2012", using whatever word they use for the xmas edition (if there is one).

discs -> new disc, where discs is in the menu on redump.org >> http://redump.org/newdisc/

edit: You have to be logged in.

Dumper status means you can fill in a form instead of submitting info to the forums. It still has to be added by a mod; it just streamlines the process a little (discs -> new disc).

How long it takes from the internal queue to actually being in the database depends. Some mods only add certain systems; ps1 and ps2 are probably quickest. If it's a popular system, or the right mod is about, it could be in the same day.

12

(50 replies, posted in General discussion)

I think the problem is due to Red Ribbon; I have the same problem. It works fine in debian and petitboot; I don't think it's been tested elsewhere.

Process images from certain systems for efficient archival (at present Gamecube, PS3, Wii, Xbox).

Features:

  • Can merge multiple releases from the same system into one set of files

  • For systems with padding (Gamecube, Wii, Xbox), the padding is split into a separate file. GParse can recreate the original image (with the padding file) or scrubbed image (without the padding file) on decode

  • For systems with encryption (PS3, Wii), it is decrypted

  • All of these factors can increase compression efficiency

Notes:

  • The game data (dec_file) should be compressed after processing; the padding (pad_file) shouldn't, as it is (or at least should be) incompressible

  • Supported systems: Gamecube, Wii, Xbox, PS3

  • This is not a compressor. It can be a pre-processor to a compressor

help wrote:

GParse r27

Usage:

Encode (all except ps3):
GParse e <system> pad_file dec_file grp_file game_1 game_2 game_3 ...

Encode (ps3):
GParse e <system> pad_file dec_file grp_file game_1 key_1 game_2 key_2 ...

Decode:
GParse d pad_file dec_file chk_file game_1 game_2 game_3 ...

<system> : wii for wii, ngc for gamecube, xbox for xbox, ps3 for ps3 (all lowercase)
pad_file : The file the padding will be/is located in
dec_file : The file the decrypted game data will be/is located in
grp_file : Stores names, sizes, hashes
chk_file : Allows external source to check status of execution
game_#   : The next game to process
key_#    : The key used to decrypt/encrypt game_# (disc_key, not d1)

Note: Even systems without padding (ps3) have to include the pad_file in the
      commandline. It keeps things consistent and simple. Specify it as NULL

'-' in place of the file path indicates read from stdin (or write to stdout)
'NULL' in place of the file path indicates ignore file

On encode, pad_file can be stdout, NULL or a file path
On encode, dec_file must be a file path
On encode, grp_file can be NULL or a file path
On decode, pad_file can be NULL or a file path
On decode, dec_file can be stdin or a file path
On decode, chk_file can be NULL or a file path
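A few example invocations based on the usage above (file names are placeholders; whether keys are passed as hex strings or files isn't stated here, so key_1/key_2 are stand-ins):

GParse e wii pad.bin dec.bin grp.bin game1.iso game2.iso
GParse e ps3 NULL dec.bin grp.bin game1.iso key_1 game2.iso key_2
GParse d pad.bin dec.bin NULL game1.iso game2.iso

The first merges two wii images into one pad/dec pair; the second does the same for ps3 (pad_file must be NULL, per the note above); the third rebuilds the original images from the pad/dec pair (pass NULL for pad.bin to get scrubbed images instead).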

Changes from r25 to r26

  • Fixed Xbox processing, which treated some data as padding (which means all existing Xbox processing outputs garbage when decoding as scrubbed, so existing stuff needs to be redone)

  • Breaking change to Xbox processing (which I figured didn't matter as all Xbox processing to date has been wrong anyway) to allow current (3629408 sector) and potentially new (3820880 sector) redump images to be processed together (and leave the door open for future non-breaking changes for other sizes)

Changes from r26 to r27

  • Added ps3 as a supported system (inputs are original, aka redump, aka encrypted images, which get decrypted)

  • OpenMP is a new requirement to allow multi-threading (for windows this means vcomp100.dll should be supplied with the program)

  • PS3 is currently the only system which uses multi-threading

GParse-r27

Thank you very much, you're a godsend. The src folder even contains the source files  big_smile

Does anyone have the latest version of the source code? If so, can you upload it please?

Basically, multiple discs of the same game are needed for a green light. A single disc for a game is only good for a blue light, even if you dump it from a thousand drives.

17

(2 replies, posted in General discussion)

New version of DatSplit with a category option. Barely tested, but it works with the ps2 dat I tried:
DatSplit v0.1.7

edit: Updated to 0.1.7

If the offset is incorrect, how is it I'm able to get some correct tracks?

Are the matching tracks just filled with zeroes? That's the only likely way an incorrect offset could yield matching tracks, so it's worth checking.
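A quick way to check (prints 0 if the track is nothing but zero bytes; the track name is a placeholder):

tr -d '\0' < track02.bin | wc -c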

19

(5 replies, posted in General discussion)

Renaming without decompression sounds impressive. Did you have to get your head stuck in the 7z format spec or was it not so bad to figure out?

Is it possible to dump the CDs with an audio trap disc, or is even reading them as audio not possible?

21

(7 replies, posted in News)

Thank you for all the raw data. The graph below shows the average number of dumps per day at each milestone, counting the start of the database from the time the first dump was entered (assumed to be Biohazard psx on Jul 28 2007, 22:20).

http://img265.imageshack.us/img265/3531/redumpdumpsperday.png

22

(7 replies, posted in News)

Can someone plot all the milestones against time? It might be a nice graph to show whether progress is accelerating or staying constant.

23

(3 replies, posted in General discussion)

Depends on what .raw is. Where did it come from, and what does it represent?

Ok. It might be best to use only t7z for both the old and new files for comparison purposes (or at least the same compressor for both), otherwise the comparison doesn't mean much. The new version uses t7z to create the .7z file.

Also, zip should not be smaller than 7z, unless maybe zip deals with incompressible material better and the stuff being compressed is mostly incompressible (perhaps xbox padding).

Are you sure you used the same compressor + settings as the one you used before?

If so, how much bigger is the result?

Are both an h<a>.diff.<b> and an h<a>.<b> present after program completion, where <a> and <b> are numbers?

If true, then there was a problem with cleanup midway through program execution. 'h2.2048' (an example of h<a>.<b>) is the sector store for the second image. 'h2.diff.2048' would be the diff from 'h1.2048' to 'h2.2048'. Only one of these needs to be stored to be able to rebuild the second image.

Otherwise, is an h<a>.diff.<b> present after program execution?

If false, then the diff stage was found to not be beneficial. In this case, at worst the merged files should only be marginally bigger.

If true, then for at least one image, diffing was deemed beneficial before compression (the 'diff' file was at most 7/8 the size of the 'to' file). It is possible for the compressed 'diff' file to be bigger than the compressed 'to' file, but given the initial size difference that seems unlikely (it may happen when using FreeArc on a smallish set of data, as gaijin did before, but again that's unlikely to scale to bigger sets/images).
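Either way, listing the leftover files after a run shows which of the above cases applies:

ls -l h*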

In any event, please be more descriptive in your testing. Let me know the os, files used in the test, files present after execution had completed, the text output to screen during execution, etc. In my tests, the resulting size is at worst about the same as the old result. At best the file size is much smaller.