flat assembler
Message board for the users of flat assembler.

Index > Heap > sync changes only for a big text/binary file > 2GB

sleepsleep



Joined: 05 Oct 2006
Posts: 8975
sleepsleep
hi friends,
i was thinking something like this,

a file is created empty and zero-filled to 2 GB or 8 GB, with the objective of storing data through application-coded add/edit/delete functions,

what is the proper or most efficient method to sync this 8 GB data file across the internet?

so any change to the master 8 GB file would result in the same change to another 8 GB copy located across the Pacific Ocean.

how do you track and send only the binary/text changes inside the 8 GB, instead of syncing the whole 8 GB?
Post 07 Jan 2014, 14:31
View user's profile Send private message Reply with quote
HaHaAnonymous



Joined: 02 Dec 2012
Posts: 1180
Location: Unknown
HaHaAnonymous
[ Post removed by author. ]


Last edited by HaHaAnonymous on 28 Feb 2015, 18:35; edited 1 time in total
Post 07 Jan 2014, 14:49
RIxRIpt



Joined: 18 Apr 2013
Posts: 50
RIxRIpt
Haha D:

I guess it's a stupid & obvious idea:
If you have enough RAM/storage you can keep a copy of this file. When the original file changes, find the differences & their offsets within the file, and send them (then update the copy you keep for comparing against the original).
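A minimal sketch of this shadow-copy approach in Python (the block size and function names are my own choices, not from this post). It compares in fixed-size blocks rather than byte-by-byte, and assumes both files are the same size, as with the pre-allocated zero-filled file from the first post:

```python
BLOCK = 4096  # compare in 4 KiB blocks (illustrative size)

def diff_blocks(master_path, shadow_path):
    """Return a list of (offset, new_bytes) for every block that differs."""
    patches = []
    with open(master_path, "rb") as m, open(shadow_path, "rb") as s:
        offset = 0
        while True:
            a = m.read(BLOCK)
            b = s.read(BLOCK)
            if not a and not b:
                break  # both files exhausted
            if a != b:
                patches.append((offset, a))
            offset += BLOCK
    return patches

def apply_patches(target_path, patches):
    """Write only the changed blocks into the replica, in place."""
    with open(target_path, "r+b") as f:
        for offset, data in patches:
            f.seek(offset)
            f.write(data)
```

The `patches` list is what would actually cross the wire; after applying it to the far side (and to the local shadow copy), both replicas match the master again.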
Post 07 Jan 2014, 16:14
cod3b453



Joined: 25 Aug 2004
Posts: 619
cod3b453
If you track/extract* the file deltas, you can accumulate the changes into a transcript of operations that you compress and send over the net; this can be a lot smaller than the data itself and more efficient than searching for diffs.

*If you don't have such information, you could also use divide-and-conquer hashing on decreasing regions, or a suitably smart differencing engine, to find the differing regions by exchanging and comparing hashes from both sides.
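A toy sketch of the divide-and-conquer hashing described above (region threshold and function names are my own; both "sides" are local buffers here, whereas in practice each digest would come from the opposite end of the link): hash the whole region, and only when the digests disagree split it in half and recurse, until the changed ranges are small enough to ship directly.

```python
import hashlib

MIN_REGION = 1024  # stop splitting below this size and just send the bytes

def region_hash(buf, start, end):
    """Digest of one region; in practice this is what crosses the wire."""
    return hashlib.sha256(buf[start:end]).digest()

def find_diffs(local, remote, start, end, out):
    """Collect (start, end) ranges where local differs from remote."""
    if region_hash(local, start, end) == region_hash(remote, start, end):
        return  # identical region: cost was one digest exchange, no data
    if end - start <= MIN_REGION:
        out.append((start, end))  # small enough: mark range for transfer
        return
    mid = (start + end) // 2
    find_diffs(local, remote, start, mid, out)
    find_diffs(local, remote, mid, end, out)
```

For a small localized change, the recursion discards most of the file after a handful of digest comparisons and narrows the transfer down to one `MIN_REGION`-sized range.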
Post 07 Jan 2014, 17:09
malpolud



Joined: 18 Jul 2011
Posts: 344
Location: Broken hippocampus
malpolud
cod3b453 wrote:
*If you don't have such information, you could also use divide-and-conquer hashing on decreasing regions, or a suitably smart differencing engine, to find the differing regions by exchanging and comparing hashes from both sides.


+1

_________________
There's nothing special about it,
It's either there when you're born or not.
Post 07 Jan 2014, 17:12
typedef



Joined: 25 Jul 2010
Posts: 2913
Location: 0x77760000
typedef
What's 8 GB?

Ask console gamers. 10 to 50 GB of data to download.
Post 07 Jan 2014, 18:21
HaHaAnonymous



Joined: 02 Dec 2012
Posts: 1180
Location: Unknown
HaHaAnonymous
[ Post removed by author. ]


Last edited by HaHaAnonymous on 28 Feb 2015, 18:35; edited 1 time in total
Post 07 Jan 2014, 18:43
RIxRIpt



Joined: 18 Apr 2013
Posts: 50
RIxRIpt
cod3b453 wrote:
*If you don't have such information, you could also use divide-and-conquer hashing on decreasing regions, or a suitably smart differencing engine, to find the differing regions by exchanging and comparing hashes from both sides.

As far as I know, hashing requires all bytes to be involved in the calculation. Wouldn't it be faster to compare the two files byte-by-byte?
Post 07 Jan 2014, 20:37
cod3b453



Joined: 25 Aug 2004
Posts: 619
cod3b453
RIxRIpt wrote:
cod3b453 wrote:
*If you don't have such information, you could also use divide-and-conquer hashing on decreasing regions, or a suitably smart differencing engine, to find the differing regions by exchanging and comparing hashes from both sides.

As far as I know, hashing requires all bytes to be involved in the calculation. Wouldn't it be faster to compare the two files byte-by-byte?
Only if your internet connection is as fast as or faster than your disk, in which case you'd simply copy the whole file.
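Back-of-the-envelope arithmetic behind that trade-off (region and digest sizes below are illustrative assumptions, not measurements): a remote byte-by-byte compare has to move every byte across the link, while exchanging one digest per region moves only kilobytes on the first pass.

```python
FILE_SIZE = 8 * 1024**3   # the 8 GiB file from the first post
REGION = 1024**2          # hash in 1 MiB regions (assumed size)
DIGEST = 32               # bytes per SHA-256 digest

# Byte-by-byte comparison across the link: every byte must cross it.
bytes_for_byte_compare = FILE_SIZE

# Hash exchange: one digest per region crosses the link on the first pass.
bytes_for_hash_compare = (FILE_SIZE // REGION) * DIGEST

# 8192 regions * 32 bytes = 256 KiB of digests versus 8 GiB of raw data.
```

So the digest exchange is cheaper by several orders of magnitude whenever the link is slower than local disk reads, which is exactly the case being discussed.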
Post 07 Jan 2014, 20:46
HaHaAnonymous



Joined: 02 Dec 2012
Posts: 1180
Location: Unknown
HaHaAnonymous
[ Post removed by author. ]


Last edited by HaHaAnonymous on 28 Feb 2015, 18:35; edited 1 time in total
Post 07 Jan 2014, 23:50
typedef



Joined: 25 Jul 2010
Posts: 2913
Location: 0x77760000
typedef
HaHaAnonymous wrote:
Quote:

10 to 50 GB of data to download.

And there are people who support the idea of an online OS, with all your personal data and programs on "cloud servers". This is ridiculous.

This just makes the NSA's life easier, nothing more.

In my opinion.

I don't know if you mean that downloading 50 GB of data is ridiculous, or cloud services in general.
Post 08 Jan 2014, 03:19
sid123



Joined: 30 Jul 2013
Posts: 340
Location: Asia, Singapore
sid123
Quote:
And there are people who support the idea of online OS,

The entire concept is lame. A "Web OS" sounds utterly stupid: PC gamers would hate it till the end, and as for coders, they wouldn't even use it, since I bet it would be a Linux rip-off. In fact I'd rather use MS-DOS than some crap WebOS.

_________________
"Those who can make you believe in absurdities can make you commit atrocities" -- Voltaire https://github.com/Benderx2/R3X
XD
Post 08 Jan 2014, 09:22
typedef



Joined: 25 Jul 2010
Posts: 2913
Location: 0x77760000
typedef
sid123 wrote:
Quote:
And there are people who support the idea of online OS,

The entire concept is lame. A "Web OS" sounds utterly stupid: PC gamers would hate it till the end, and as for coders, they wouldn't even use it, since I bet it would be a Linux rip-off. In fact I'd rather use MS-DOS than some crap WebOS.


That's the only way to combat piracy I guess.

But think about this: you want to burn a 10 GB ISO onto a DVD. The ISO is on the server and the writer is in your PC. You'd still have to download it onto your disk. And you wouldn't even know what they're doing with your data.


Like this : http://i.imgur.com/mvb6HsY.png
Post 08 Jan 2014, 11:50
sid123



Joined: 30 Jul 2013
Posts: 340
Location: Asia, Singapore
sid123
@typedef
That's the reason I think WebOSes are lame. I think it's easier and much more trustworthy to use Notepad++ to edit a file on your HDD than to use an online editor or something similar. And what's going to happen to the 1,000,000,000,000+ programs that were written for
Windows/Linux/Mac? Are they going to be dead because of WebOSes? The answer is NO.
WebOSes are a fun project, but making them mainstream would be a nightmare.

_________________
"Those who can make you believe in absurdities can make you commit atrocities" -- Voltaire https://github.com/Benderx2/R3X
XD
Post 08 Jan 2014, 13:09


Copyright © 1999-2020, Tomasz Grysztar. Also on YouTube, Twitter.

Website powered by rwasa.