FileForums > Game Backup > PC Games > PC Games - CD/DVD Conversions > Conversion Tutorials

  #61  
Old 07-08-2019, 15:20
Razor12911 is offline
Away

Join Date: Jul 2012
Location: South Africa
Posts: 3,557
Thanks: 1,941
Thanked 9,930 Times in 2,140 Posts
Update available

1908_R2
- Updated history size feature
- Added save/load history file feature
- Improved processing speed

The benefits of the usage history database files

I'll use GTAV as an example, it's a huge game and has a lot of highly compressed streams.

So if you were compressing this game and messed up the srep/lolz settings, or your PC shut down unexpectedly (power cuts, Windows forcing updates, etc.), and xtool has already stored a database, it will load it up with all the information used in the previous precompression session on the same input. That speeds up the process, because xtool already knows what needs to be done and which settings to use.

Here's an example on update.rpf (GTAV)

on first run:
Code:
Compressed 1 file, 814,551,040 => 1,616,722,602 bytes. Ratio 198.48%
Compression time: cpu 1.78 sec/real 77.43 sec = 2%. Speed 10.52 mB/s
Since I have processed this input before, xtool loaded up the history file:
Code:
Compressed 1 file, 814,551,040 => 1,616,722,602 bytes. Ratio 198.48%
Compression time: cpu 2.50 sec/real 26.25 sec = 10%. Speed 31.03 mB/s
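For anyone curious how such a history file could work, here is a minimal sketch of the idea in Python: key the detected settings by a fingerprint of each stream, then look them up on the next run instead of re-detecting. xtool's real on-disk format isn't documented in this thread, so every name here (`HistoryDB`, `record`, `lookup`) is hypothetical.

```python
import json
import os
import zlib

class HistoryDB:
    """Maps a stream fingerprint to the codec settings that worked for it."""

    def __init__(self, path):
        self.path = path
        self.db = {}
        if os.path.exists(path):
            with open(path) as f:
                self.db = json.load(f)

    def key(self, raw):
        # Fingerprint the compressed stream; crc32 keeps the file small.
        return str(zlib.crc32(raw))

    def lookup(self, raw):
        # None means a full detection pass is still needed for this stream.
        return self.db.get(self.key(raw))

    def record(self, raw, settings):
        self.db[self.key(raw)] = settings

    def save(self):
        with open(self.path, "w") as f:
            json.dump(self.db, f)
```

On a second run over the same input, every `lookup` hit skips the expensive level detection entirely, which matches the 77 s vs 26 s difference in the logs above.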
The reason I decided to add this feature is that games compressed with ridiculously high compression settings make precompression very slow. I wanted people to save time: at least one of us can process the game and then share their database file with everyone, so that others can reuse it when precompressing the same game.

This xtool is already faster than the old one and I just had to find more ways to give you guys more speed.

Note: it doesn't matter if the input is different from the one used to create a specific history file. For example, if someone used GTAV with the latest update to make a history file, a person who has the first release of GTAV without any updates can still use that same history file without problems, and vice versa: if a game got updated and you want to compress it again with the new files, you can use the old history file.
__________________
Got a lot going on in my life... (Comeback date 20-11-2021)
I occasionally upload stuff here
Razor12911#6134
  #62  
Old 08-08-2019, 00:02
Harsh ojha is offline
Registered User

Join Date: May 2019
Location: INDIA
Posts: 78
Thanks: 468
Thanked 71 Times in 36 Posts
Test 1908_R2

Game: GTA V, using x64w.rpf
Size : 893 MB


Creating archive

#1 rzlib

FreeArc 0.67 (March 15 2014) Using additional options: --logfile=_Compression.log
Creating archive: rzlib_GTAV.arc using rzlib
Memory for compression 0b, decompression 0b, cache 16mb
Compressed 1 file, 937,340,928 => 937,341,048 bytes. Ratio 100.00%
Compression time: cpu 1.78 sec/real 91.65 sec = 2%. Speed 10.23 mB/s
All OK
-------------------------------------------------------------------------------------------------------

#2 zlib

FreeArc 0.67 (March 15 2014) Using additional options: --logfile=_Compression.log
Creating archive: zlib_GTAV.arc using rzlib
Memory for compression 0b, decompression 0b, cache 16mb
Compressed 1 file, 937,340,928 => 937,341,048 bytes. Ratio 100.00%
Compression time: cpu 1.67 sec/real 107.49 sec = 2%. Speed 8.72 mB/s
All OK
-----------------------------------------------------------------------------------------------------------

#3 reflate

FreeArc 0.67 (March 15 2014) Using additional options: --logfile=_Compression.log
Creating archive: reflate_GTAV.arc using reflate
Memory for compression 0b, decompression 0b, cache 16mb
Compressed 1 file, 937,340,928 => 937,341,048 bytes. Ratio 100.00%
Compression time: cpu 1.80 sec/real 61.87 sec = 3%. Speed 15.15 mB/s
All OK

---------------------------------------------------------------------------------------------------------------------------------

Extracting archive
#1 reflate

FreeArc 0.67 (March 15 2014) Using additional options: --logfile=_Compression.log
Extracting archive: reflate_GTAV.arc
Extracted 1 file, 937,341,048 => 937,340,928 bytes. Ratio 100.00%
Extraction time: cpu 3.22 sec/real 27.71 sec = 12%. Speed 33.83 mB/s
All OK

-----------------------------------------------------------------------------------------------------------------------------------

#2 rzlib

FreeArc 0.67 (March 15 2014) Using additional options: --logfile=_Compression.log
Extracting archive: rzlib_GTAV.arc
Extracted 1 file, 937,341,048 => 937,340,928 bytes. Ratio 100.00%
Extraction time: cpu 3.19 sec/real 59.93 sec = 5%. Speed 15.64 mB/s
All OK

-------------------------------------------------------------------------------------------------------------------------------------

#3 zlib

FreeArc 0.67 (March 15 2014) Using additional options: --logfile=_Compression.log
Extracting archive: zlib_GTAV.arc
Extracted 1 file, 937,341,048 => 937,340,928 bytes. Ratio 100.00%
Extraction time: cpu 3.06 sec/real 38.95 sec = 8%. Speed 24.07 mB/s
All OK

end
__________________
Video Creator
Discord - Harsh_Ojha_748 #8782
  #63  
Old 08-08-2019, 10:34
Sergey3695 is offline
Registered User

Join Date: Mar 2013
Location: Russia
Posts: 81
Thanks: 41
Thanked 71 Times in 38 Posts
reflate didn't work with t100p when packing. Please add a level setting for reflate, and an option to disable level autodetect. xD
  #64  
Old 08-08-2019, 11:28
elit is offline
Registered User

Join Date: Jun 2017
Location: sun
Posts: 208
Thanks: 150
Thanked 277 Times in 94 Posts
There is a much better way to detect the cmp level than any current tool does (including precomp), and it is simple to implement:

When you find the first potential stream, first try all levels until a match (as you probably do), but instead of fixing that level for the rest of the data as granted, test at least, say, 9 more streams (up to 10). If 80% (or 8 of 10) of those streams show the same specific level (say -9 or -6), keep that level for the rest of the data.

If less than 80%, run the all-levels check on another 10 streams and average with the previous 10. If the average (of 20 streams) is 80% one specific level, all is good for the rest of the data; otherwise do a full check on another 10 streams, and so on.

You can lower the threshold to 70% if 80% is too strict, but I think it should be fine. In other words, with this simple-to-implement algorithm you never need an option to manually set a specific level: it automatically determines whether the streams are too mixed and each one needs testing on all levels (extremely unlikely); more likely it eventually settles on the right level, the one in the majority. This also prevents false detection unless the majority of the first 10 streams detect wrong levels; if that turns out to be common, you could work in steps of 100 instead of 10, and so on.
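The batch-and-threshold scheme described above could be sketched like this in Python. `try_all_levels` stands in for whatever brute-force detector the tool already has, and all the names are illustrative, not anything from xtool:

```python
from collections import Counter

def detect_level(streams, try_all_levels, threshold=0.8, batch=10):
    """elit's proposal: test streams in batches of `batch`, trying all
    levels on each, until one level accounts for `threshold` of all
    results; then fix that level for the rest of the data."""
    seen = Counter()
    tested = 0
    for s in streams:
        seen[try_all_levels(s)] += 1
        tested += 1
        if tested % batch == 0:
            level, hits = seen.most_common(1)[0]
            if hits / tested >= threshold:
                return level  # clear majority: use it for the rest
    # Streams too mixed: caller must keep testing each stream individually.
    return None
```

Note the averaging across batches falls out naturally here, since the counter accumulates over all streams tested so far rather than resetting per batch.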
  #65  
Old 08-08-2019, 12:35
Sergey3695
Really? lol. I also look at unpack time. Example: which is better, a lower level that saves 1 minute of my time, or ~4 MB better compression?

  #66  
Old 08-08-2019, 15:37
Razor12911
Quote:
Originally Posted by Harsh ojha
[the GTA V x64w.rpf test logs from post #62, quoted in full]
It doesn't seem like the libraries are loaded.

Quote:
Originally Posted by Sergey3695
reflate didn't work with t100p when packing. Please add a level setting for reflate, and an option to disable level autodetect. xD
Well, that's up to you, if you know how to use reflate levels.
  #67  
Old 08-08-2019, 15:41
Razor12911
Quote:
Originally Posted by Sergey3695
Really? lol. I also look at unpack time. Example: which is better, a lower level that saves 1 minute of my time, or ~4 MB better compression?
And you want me to remove autodetect.

This is why I added it in the first place, especially for reflate, because most people don't know how it works. A low level does not mean you save time, and if you choose incorrect levels you can end up with a negative ratio. GTAV is an example: people used level 1 just because it was faster and got horrible results, so at some point people decided to use level 9 on everything because "most" games are highly compressed. Then came Mad Max, which was compressed at level 1, and they got bad results again. So yes, if you want level settings, I guess it's not a problem to add them, but I added autodetect for a reason: people kept messing it up.
  #68  
Old 08-08-2019, 16:09
Razor12911
Quote:
Originally Posted by elit
[the level-detection proposal from post #64, quoted in full]
I'd like to see this method fly

especially in the case of xtool, because of multithreading. precomp might find this useful because it is single-threaded, but xtool? No. It's not something I haven't thought about before, but look at it from this perspective: you're processing DiRT Rally, a game with large streams. To detect the level the way you describe, you process the first stream and check it level by level. Note that with deflate there are not just 9 options (1-9) but 81: each level also has memory settings (9 of them as well), which also affect the checksum of the stream if you are doing a trial-and-error approach.

So you're telling me it's best to try 81 combinations on the first stream so that the next 9 streams can reuse the result. During those 81 trials, what would the other threads in xtool be doing? A user selected 8 threads, but it's still just 1 thread figuring out which options to use, and only once it's done can the others proceed. And as those threads process streams using the level found from the first stream, if it's incorrect they all have to fall back to trial and error anyway. Whichever way you look at it, this method has drawbacks: the time the other threads spend idle, just waiting, and the fact that none of the 81 combinations is guaranteed to give back a perfect checksum, so you could be making the other threads wait for something that will never come.
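To make the 81-combination trial and error concrete (9 levels x 9 memory settings), here is a rough sketch using Python's zlib module. This is not xtool's code, just the brute-force idea; it assumes zlib-wrapped streams with the default 32 KB window (wbits=15), and real game streams may be headerless raw deflate or use other strategies that this simple version ignores:

```python
import zlib

def find_deflate_params(raw, compressed):
    """Brute-force search Razor describes: recompress the decompressed
    data with every (level, memLevel) pair until the output matches the
    original stream byte for byte. 9 levels x 9 memLevels = 81
    combinations, and none of them is guaranteed to reproduce a stream
    made by a different zlib build or strategy."""
    for level in range(1, 10):
        for mem_level in range(1, 10):
            c = zlib.compressobj(level, zlib.DEFLATED, 15, mem_level)
            if c.compress(raw) + c.flush() == compressed:
                return level, mem_level
    return None  # no combination reproduced the stream
```

The cost he objects to is visible here: every failed candidate recompresses the whole stream, which on a large DiRT Rally-sized stream keeps one thread busy for a long time while the others wait.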

xtool is fast not just because of multithreading; you can even compare it with precomp using 1 thread. I've thought about level detection and so on.

Here's how xtool works. The scanner itself can give hints about which level was used: zlib has a 2-byte header that is generated based on the level used to compress a specific stream, but sometimes this is not enough because some streams are headerless. This is where the statistics system I added comes in. It stores information as all threads process streams, which lets xtool determine which cmp levels popped up most often and which were rare. So in a trial-and-error case it wouldn't try levels 1-9 in order; it would start with the level that appeared most often in previous stream data (the order could even be 6, 5, 9, 8, 1, 2, 3, 4, ...). There are more strategies I've added for cmp level detection, and they are all quick because they use stats.
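The 2-byte header hint Razor mentions is standard zlib: the second header byte carries a 2-bit FLEVEL field that narrows the level down to a coarse class. A small sketch of reading that hint (this is plain zlib format knowledge, not xtool's actual code; the bucket for FLEVEL 0 also covers level 0, which is rarely seen in practice):

```python
import zlib

def level_hint(stream):
    """Read the zlib header's FLEVEL bits to shortlist candidate levels.
    zlib's deflate maps level < 2 -> 0, 2-5 -> 1, 6 -> 2, 7-9 -> 3."""
    if len(stream) < 2:
        return None
    cmf, flg = stream[0], stream[1]
    # A valid zlib header uses the deflate method (CM=8) and has
    # CMF*256+FLG divisible by 31; headerless streams fail this check.
    if (cmf & 0x0F) != 8 or (cmf << 8 | flg) % 31 != 0:
        return None
    flevel = flg >> 6
    return {0: [1],           # fastest (levels 0-1)
            1: [2, 3, 4, 5],  # fast
            2: [6],           # default
            3: [7, 8, 9]}[flevel]
```

When the hint is available it shrinks the trial space from 9 levels to at most 4, and when it returns `None` (a headerless stream), the stats system described above has to carry the whole load.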
  #69  
Old 08-08-2019, 20:51
Sergey3695
Quote:
Originally Posted by Razor12911
And you want me to remove autodetect.
Just add a setting to turn it off.
  #70  
Old 09-08-2019, 15:17
elit
Razor,

you mentioned xtool does auto detection on the first stream. Does it iterate through all 81 combinations until a match is found, or only the few known to be used most often?

As for my idea,
it doesn't need to go through all 81 every time, only until a match is found; then the next stream can first be tested with the level found in the previous stream. This means that if the streams are not mixed there is no slowdown, because the first try is already a match. Also, your own detection on the first stream seems fast; why not use the same technique, whatever it is doing?

I have never noticed ztool/xtool being slow at the beginning; it seems to start immediately. Would there really be such a big difference if it had to try 10 streams instead of 1, especially with the optimization I mentioned above (starting with the previously found level)? Perhaps up to a few seconds of delay would not be much of a problem in the scope of thousands of streams and GBs of data?

As for MT, that's not a big deal: we don't want to test the whole data, only the first couple of streams (again assuming that should not take long).

You mentioned using stats; that actually sounds very similar to what I am talking about. In fact, if I understand correctly, you may have already implemented what I'm describing here. But interestingly, you mentioned collecting stats from threads, which pretty much amounts to collecting cmp levels. So you are in fact collecting from multiple streams, not just the first?

  #71  
Old 09-08-2019, 16:22
Razor12911
"you mentioned xtool does auto detection on the first stream. Does it iterate through all 81 levels until match is found, or only few known to be used most often?"
- It does this for all streams, all 81 combinations, but very quickly: it takes only milliseconds per stream to go through all 81.

"It don't need to go through all 81 every time"
- It doesn't go through all 81; sometimes even the first option it tries is correct and it stops. If that isn't correct, the 2nd one is most likely correct because of the stats. XTool requests the mode (the value that appears most often); if that option fails, it requests the mode in 2nd order (the value that appears 2nd most often), and so forth. This is why trials on all streams are necessary: they keep the stats function filled with up-to-date information about the current input.

"next stream can be tested with that level found in previous stream first."
- This method is the same as stats, except it stores information limited to one stream, which happens to be the previous stream. xtool's approach stores information on a number of previous streams; the default is 256 streams.
Let's say, for example,
the first stream gave back level 9, so the next stream is assumed to be level 9, roughly as you suggested. But what if the third stream is level 6? You have to go through all the trials. Fine, now you assume the next stream is level 6 because of the previous stream, but what if that level 6 stream was just a png image within the game data, while the entire game is level 9? You try level 6 on the next stream, it fails, and again you have to go through all 81 options. With stats, the xtool approach has stored level 9, level 9, level 6, so for the next stream it doesn't matter that the last one was level 6: it tries level 9 first because it was detected twice, compared to level 6 once. If it isn't level 9, it tries level 6 before trying all 81 options, because level 6 was at least detected once. If that also fails, it tries the remaining 81 - 2 combinations (6 and 9 from stats have already failed). It does this over the past 256 streams, trying options in mode order.

"I have never noticed ztool/xtool being slow at beginning, it seems to start immediately"
- It's because of stats, plus other methods I've developed, that this is possible.

"Perhaps up to few seconds of delay should not be much problem in scope of thousand streams and GB's of data?"
- You'll notice not just "a few seconds of delay": yes, a few seconds for one set of streams, but minutes of overall delay if you used this method on big streams. Try it on DiRT Rally or DOOM 2016 and you'll see incredibly low CPU utilization when using a lot of threads, because the other threads are obliged to wait for the first stream.

"As for MT, that's not a big deal we don't want to test whole data only first couple of streams(again assuming it should not take long)."

- With large streams it is guaranteed to take long, as mentioned before, which is why xtool faces so many problems with multithreading. I once instructed the threads to try 81 levels while using stats from all the streams they could find; as a result they ended up fighting and crashing each other. It's hilarious, but you get the idea: it's best to have all streams try all levels, but let the stats decide in what order the options are presented, to speed up the process.

"You mentioned using stats, that sounds actually very similar to what I am talking about."
- Yes, with the exception that it's not just the previous stream's option that is considered, but the previous 256 stream options. If among those 256 options there are 128 level 6s, 10 level 3s, 1 level 5 and 20 level 9s, it tries them in the order 6 >> 9 >> 3 >> 5. If all of those fail, it begins trials 1-9, skipping 3, 5, 6 and 9 because those have already failed. After that, the newly discovered level is added to the stats: if it was 2, the new order becomes 6 >> 9 >> 3 >> 5 >> 2. The reason all streams have to try all options is to age out options that stop appearing. Say the data out of nowhere starts coming back as level 4: as new stats are added, the count of level 9s and 6s shrinks in favor of the new level 4s until they cross a threshold, and the order becomes 4 >> 6 >> 9 >> 3 >> 5 >> 2. As you can see, level 4 overtook level 6, but level 6 is still considered, because before all the level 4s there were a lot of level 6s. So yes, I really did think deeply about how to speed up xtool.
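The 256-stream stats window from the example above (128 level-6s, 20 level-9s, 10 level-3s, 1 level-5 giving the trial order 6 >> 9 >> 3 >> 5) could look roughly like this in Python; the class and method names are made up, not xtool's:

```python
from collections import Counter, deque

class LevelStats:
    """Sliding window over the last 256 detected levels (xtool's stated
    default); candidate levels are tried in descending frequency order."""

    def __init__(self, window=256):
        self.recent = deque(maxlen=window)

    def record(self, level):
        # Oldest entries fall off automatically, which is what lets a new
        # dominant level (e.g. 4) eventually overtake the old one.
        self.recent.append(level)

    def trial_order(self):
        counts = Counter(self.recent)
        seen = [lvl for lvl, _ in counts.most_common()]
        # Levels never seen in the window are still tried, but last.
        return seen + [lvl for lvl in range(1, 10) if lvl not in counts]
```

Because the deque is bounded, a burst of a new level gradually displaces the old counts, reproducing the 4 >> 6 >> 9 >> ... reordering described in the post.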

"I understand correctly you may have already implemented what I am talking about here"
- Yep

"So you in fact are collecting from multiple streams, not just first?"
- Yes. I had to do this carefully, otherwise the threads crash while fighting for the same stream or the same information. It causes headaches, but I find it funny when it happens. This is what causes most problems in pzlib/ztool and the old xtool, but if you implement it properly you get to enjoy all the speed that comes with it.
  #72  
Old 09-08-2019, 18:05
elit
Finally, now(!) I'm in the picture. Thanks for taking the time; it makes a difference when I understand the internals.

Believe it or not, until now I thought (z)xtool simply scanned the first stream and... that's it, used the same info for *all* the rest of the data lol. The reason I thought so was that game packs likely use a single specific level, and in theory that could be enough (to pack the game, which is what zxtools were designed for).

Since that is not the case, and your approach is clearly much better than mine, we can rest this case. Thanks again.
  #73  
Old 10-08-2019, 15:25
Simorq is offline
Registered User

Join Date: Mar 2014
Location: Iran
Posts: 642
Thanks: 3,602
Thanked 1,284 Times in 463 Posts
1908_R2_x86 > ERROR: write error (disk full?) in compression algorithm
1908_R2_x64 > good work

decode has a bug.

Code:
decode:t1
Testing archive: data.arc
Tested 2 files, 4,023,182,817 => 2,147,912,601 bytes. Ratio 187.31%        
Testing time: cpu 1.09 sec/real 23.73 sec = 5%. Speed 90.52 mB/s
All OK

decode:t100p
Testing archive: data.arc
Tested 2 files, 4,023,182,817 => 2,147,912,601 bytes. Ratio 187.31%
Testing time: cpu 1.16 sec/real 22.47 sec = 5%. Speed 95.61 mB/s
All OK
Even with t1 it uses all cores!

  #74  
Old 11-08-2019, 08:50
doofoo24 is offline
Registered User

Join Date: Nov 2016
Location: canada
Posts: 408
Thanks: 138
Thanked 454 Times in 227 Posts
Quote:
Originally Posted by Simorq
1908_R2_x86 > ERROR: write error (disk full?) in compression algorithm
1908_R2_x64 > good work
Also with x64 you can get "ERROR: write error (disk full?) in compression algorithm"...
I tested on Doom with reflate t100p with c384mb and got (ERROR: write error (disk full?))...
I changed to t4 and it works...

Decode works; I tested with t1 and t100p. t1 uses 11% of the CPU and t100p uses all cores...
***It's important to emphasize: do not use t100p for decode if you don't have a good CPU cooler; better to use t50p...

  #75  
Old 11-08-2019, 11:16
Razor12911
Guys, I'll repeat: if you encounter an error, use xtool without fa so it produces an exception message and I can see what the problem is.