  #151  
Old 23-05-2020, 22:40
Gupta is offline
Registered User
 
Join Date: Aug 2016
Location: India
Posts: 393
Thanks: 131
Thanked 675 Times in 225 Posts
Gupta is on a distinguished road
Quote:
Originally Posted by FitGirl
You may store some rare/large duplicated streams in a temp file, while storing small/frequent dupes in RAM - this way the excessive HDD load won't happen,
Maybe he can introduce a second phase in compression, where he stores a forward reference count for each stream; that should ideally bring the memory requirement below the window size, and it should improve compression too.

HDDs are very slow; I recently upgraded to NVMe-based storage and I can really feel the speed difference.
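
A rough Python sketch of the forward-reference-count idea above, assuming a simple fixed-size chunking scheme; the chunk size, hashing and record format are made up for illustration and this is not srep's actual design. A first pass counts how often each chunk will be referenced, and the second pass evicts a chunk from RAM as soon as its last reference has been emitted, so memory is bounded by the chunks still needed rather than by the window size.

Code:
# Two-pass dedup sketch with forward reference counts (illustrative only).
import hashlib
from collections import Counter

CHUNK = 64 * 1024  # hypothetical fixed chunk size


def chunks(path):
    with open(path, "rb") as f:
        while True:
            block = f.read(CHUNK)
            if not block:
                break
            yield block


def dedup_two_pass(path):
    # Pass 1: count how many times each chunk occurs in the whole input.
    refs = Counter(hashlib.sha1(b).digest() for b in chunks(path))

    cache = {}   # chunks kept in RAM only while future references remain
    out = []     # ('raw', data) or ('dup', digest) records

    # Pass 2: emit records, evicting a chunk once its last reference is seen.
    for block in chunks(path):
        h = hashlib.sha1(block).digest()
        refs[h] -= 1
        if h in cache:
            out.append(("dup", h))
        else:
            out.append(("raw", block))
            if refs[h] > 0:          # cache only chunks that will be needed again
                cache[h] = block
        if refs[h] == 0:             # last reference emitted: free the memory
            cache.pop(h, None)
    return out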
__________________
XD

Last edited by Gupta; 23-05-2020 at 22:46.
  #152  
Old 28-05-2020, 11:48
panker1992 is offline
Registered User
 
Join Date: Oct 2015
Location: Always Somewhere
Posts: 470
Thanks: 111
Thanked 676 Times in 270 Posts
panker1992 is on a distinguished road
Dedup

Quote:
Originally Posted by FitGirl
Thanks for returning to the project; deduplication is a very useful feature.
I have an idea which will reduce the required RAM for dedup. You may store some rare/large duplicated streams in a temp file, while storing small/frequent dupes in RAM - this way the excessive HDD load won't happen, because reads will be rare and the RAM won't be used that much. 1-2 GB is a pretty big amount even for machines with 8 GB. And for users with 4 GB, installation will be almost impossible, considering srep and lolz/lzma, even with a page file. So reduction/control of the RAM used is a must, I think.

I'd recommend Halo Reach for testing dedup; it has tons of duplicate streams of different sizes.
There is also a sorting trick that can reduce the RAM needed, and it works as follows.

srep does a very good job finding matches that are located far away; to make that happen it builds a dictionary. If you sort the files you feed to srep, you can actually reduce the RAM it needs and improve its speed.

Sorting as a preprocessing step can speed up the process and cost less RAM, and it removes the I/O overhead because there are no temp files!
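
As a concrete (and simplified) illustration of that sorting step, here is a small Python sketch that orders the input files by extension and then by size before handing the list to the packer; the sort key and the "filelist.txt" output are only assumptions about how such a front-end might work, not panker1992's actual tool.

Code:
# Sorting preprocessing sketch: order files so similar data ends up adjacent.
import os


def sorted_file_list(root):
    files = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            files.append(os.path.join(dirpath, name))
    # Likely-similar files (same type, similar size) end up next to each other,
    # so srep can find their matches within a smaller window and use less RAM.
    files.sort(key=lambda p: (os.path.splitext(p)[1].lower(), os.path.getsize(p)))
    return files


if __name__ == "__main__":
    # Write a list file that the archiver / srep front-end can consume in this order.
    with open("filelist.txt", "w", encoding="utf-8") as out:
        for path in sorted_file_list("game_data"):
            out.write(path + "\n")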
__________________
My projects : Masked Compression, lzma2(xz) on Freearc, Zstd compressor for windows
My optimizations : packjpg.exe, zstd, lzham, precomp-dev-0.45.
  #153  
Old 29-05-2020, 01:23
Razor12911 is offline
Programmer
 
Join Date: Jul 2012
Location: South Africa
Posts: 3,357
Thanks: 1,817
Thanked 8,954 Times in 1,989 Posts
Razor12911 is on a distinguished road
Quote:
Originally Posted by FitGirl
Thanks for returning to the project; deduplication is a very useful feature.
I have an idea which will reduce the required RAM for dedup. You may store some rare/large duplicated streams in a temp file, while storing small/frequent dupes in RAM - this way the excessive HDD load won't happen, because reads will be rare and the RAM won't be used that much. 1-2 GB is a pretty big amount even for machines with 8 GB. And for users with 4 GB, installation will be almost impossible, considering srep and lolz/lzma, even with a page file. So reduction/control of the RAM used is a must, I think.

I'd recommend Halo Reach for testing dedup; it has tons of duplicate streams of different sizes.
Quote:
Originally Posted by Gupta
Maybe he can introduce a second phase in compression, where he stores a forward reference count for each stream; that should ideally bring the memory requirement below the window size, and it should improve compression too.


HDDs are very slow; I recently upgraded to NVMe-based storage and I can really feel the speed difference.
Quote:
Originally Posted by panker1992
There is also a sorting trick that can reduce the RAM needed, and it works as follows.

srep does a very good job finding matches that are located far away; to make that happen it builds a dictionary. If you sort the files you feed to srep, you can actually reduce the RAM it needs and improve its speed.

Sorting as a preprocessing step can speed up the process and cost less RAM, and it removes the I/O overhead because there are no temp files!

Believe me, I have several ideas for how to reduce memory usage before even relying on virtual memory. Optimisation is my middle name.
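
For reference, FitGirl's suggestion quoted above - keep small/frequent duplicate streams in RAM and spill rare/large ones to a temp file - could look roughly like the following Python sketch; the 1 MiB threshold and the class layout are made-up assumptions, not Razor12911's implementation.

Code:
# Hybrid duplicate-stream store sketch: small/frequent streams stay in RAM,
# rare/large ones are spilled to a temp file and re-read on demand.
import os
import tempfile

LARGE_STREAM = 1 << 20   # assumed threshold: streams >= 1 MiB are spilled to disk


class HybridDedupStore:
    def __init__(self):
        self.ram = {}                       # digest -> bytes, small/frequent streams
        self.spill = tempfile.TemporaryFile()
        self.on_disk = {}                   # digest -> (offset, length) in temp file

    def put(self, digest, data):
        if len(data) >= LARGE_STREAM:
            # Rare/large stream: append to the temp file, keep only its location.
            offset = self.spill.seek(0, os.SEEK_END)
            self.spill.write(data)
            self.on_disk[digest] = (offset, len(data))
        else:
            # Small/frequent stream: keep it in RAM so repeated hits never touch disk.
            self.ram[digest] = data

    def get(self, digest):
        if digest in self.ram:
            return self.ram[digest]
        offset, length = self.on_disk[digest]
        self.spill.seek(offset)
        return self.spill.read(length)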
__________________
Welp...
The Following 8 Users Say Thank You to Razor12911 For This Useful Post:
78372 (29-05-2020), BLACKFIRE69 (29-05-2020), devil777 (30-05-2020), DiCaPrIo (29-05-2020), ffmla (29-05-2020), Gupta (29-05-2020), L0v3craft (29-05-2020), shazzla (29-05-2020)
  #154  
Old 29-05-2020, 09:13
bunti_o4u is offline
Registered User
 
Join Date: Aug 2017
Location: India
Posts: 147
Thanks: 19
Thanked 105 Times in 44 Posts
bunti_o4u is on a distinguished road
Quote:
Originally Posted by Edison007
I had time, and I added support for these files.
I also added parsing of the FAT from watch_dogs, but without (de)compression yet.
It takes time to deal with the xcompress library.
Does it support stdio?

If you add stdio support, it would be great.
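
"stdio support" here means the tool can read its input from stdin and write its output to stdout, so it can sit in a pipe with srep and the final compressor instead of going through temp files. A minimal Python sketch of such a filter (a plain pass-through placeholder, not Edison007's tool):

Code:
# Minimal stdio filter sketch: binary data flows stdin -> stdout in blocks.
import sys


def main():
    stdin = sys.stdin.buffer      # binary stdin
    stdout = sys.stdout.buffer    # binary stdout
    while True:
        block = stdin.read(1 << 20)
        if not block:
            break
        stdout.write(block)       # a real tool would decode/encode here
    stdout.flush()


if __name__ == "__main__":
    main()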
  #155  
Old 29-05-2020, 12:30
panker1992 is offline
Registered User
 
Join Date: Oct 2015
Location: Always Somewhere
Posts: 470
Thanks: 111
Thanked 676 Times in 270 Posts
panker1992 is on a distinguished road
xcompress is pure Windows 10 compression, and I think it's supported by default.
__________________
My projects : Masked Compression, lzma2(xz) on Freearc, Zstd compressor for windows
My optimizations : packjpg.exe, zstd, lzham, precomp-dev-0.45.