Flawed Architecture: The Code that the Patches are Applied to, is Expected to be Available at a Single URL #144
Comments
It is impossible to have a flawed architecture when there is no architecture at all. An architect's position is available. Run your scheme by the mailing list and, in all likelihood, start pushing. The only requirement is that there may not be usability regressions (especially for non-developers).
The Flaw/Bug
Currently the patching Makefiles seem to contain some
mechanism for downloading the source from a single URL.
For example, at least one of the 2017_08_18 versions
of the libevent package
contains the following 5 lines:
If, for whatever reason, the single URL is not available,
the collection of rumprun-packages, which is meant to remain
available over a long time period, becomes broken.
If there are dependencies between the rumprun-packages, then
the lack of a package at the "rootlike level" of the dependency tree
makes the whole tree broken and unavailable.
Proposed System
A smarter solution is to identify the patchable code by a
SHA-256 (or other secure) hash of a tar file that contains the code.
That way it does not matter where the code gets downloaded from,
and different users can use "warez-like" file sharing networks for
downloading the packages, using BitTorrent-like solutions.
The packaging system only needs to record the size and secure hash
of the tar file that contains the patchable code.
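The verification step the scheme relies on can be sketched in a few lines. This is a minimal illustration, not part of rumprun-packages; the function name and parameters are my own for the example. The point is that the check depends only on the recorded size and hash, never on where the file came from:

```python
import hashlib

def verify_tarball(path, expected_size, expected_sha256):
    """Accept a downloaded tar file only if its size and SHA-256
    match the values recorded by the packaging system."""
    with open(path, "rb") as f:
        data = f.read()
    # Cheap size check first, then the secure hash.
    if len(data) != expected_size:
        return False
    return hashlib.sha256(data).hexdigest() == expected_sha256
```

A mirror, a BitTorrent swarm, or a USB stick are then all equally valid sources: any copy that passes this check is the right file.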
A Simple Bootstrapping System
The central repository (at this day and age, the very Git repository that
this bug report is attached to) should contain a plain text file
with at most one URL per line.
Those URLs refer to other text files all around the world, on the servers
of different volunteers, who serve the files over plain HTTP/HTTPS.
The text files at the volunteers' servers (hereafter: local_list_of_files) contain
RELATIVE FILE PATHS of the tar files that contain the patchable code.
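Combining a volunteer's list URL with the relative paths it contains is a one-liner with the standard library. A minimal sketch, assuming the hypothetical function and variable names below (the example URL and file names are made up for illustration):

```python
from urllib.parse import urljoin

def parse_list(text):
    # One entry per line; blank lines are ignored.
    return [line.strip() for line in text.splitlines() if line.strip()]

def candidate_urls(local_list_url, local_list_text):
    # Combine the URL of a volunteer's local_list_of_files with each
    # RELATIVE FILE PATH listed inside it, yielding tar-file URLs.
    return [urljoin(local_list_url, rel) for rel in parse_list(local_list_text)]
```

Running the same derivation against every volunteer URL in the central text file gives many independent download locations for the same hash-named tar file.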
Demo
My Declaration of Interests/Biases
I'm very biased in writing this bug report, because
I have my own small project called Silktorrent, which
ended up being my personal self-education endeavor, and in which I
tried to develop base technology for totally censorship-proof web
applications, including software package distribution. Part of the Silktorrent
use case is that the Internet is totally offline and people can exchange
files only with USB sticks, possibly with "mail-pigeon-drones" that
transport the USB sticks. The concept of defining a file through
its location, a URL, does not make sense in that scenario,
and, as it turns out, P2P file sharing systems identify files
not by their location (URL) but by the secure hash of the file.
The end result is one, efficiency-wise, terribly inefficiently written Bash-and-Ruby script (archival copy;
if no command line arguments are given, the script prints usage instructions)
that, despite all the "bloat", does essentially only one thing:
it creates a tar file and renames the tar file according to its secure hash.
(Actually, the script can also "unpack" the tar files,
verify the file name format without reading the file, and
salt the tar file at its creation. The salting makes it possible to "package" the same "payload" into
tar files that have different hashes, which forces the censors to
download at least parts of the tar files to see whether a tar file contains censored
material. At some point the downloading and checking should
overwhelm the censoring system.)
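The create-salt-rename step described above can be sketched independently of the Silktorrent script. This is not the script's actual implementation, just an illustration of the idea, with made-up names: add a random salt file to the archive, then name the archive after the SHA-256 of the resulting tar file, so the same payload yields differently named archives each time:

```python
import hashlib
import io
import os
import tarfile

def pack_salted(payload_path, out_dir):
    """Package the payload together with a random salt file, then
    rename the tar file according to its own SHA-256 hash."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        tar.add(payload_path, arcname=os.path.basename(payload_path))
        # The random salt guarantees a fresh hash for the same payload.
        salt = os.urandom(16)
        info = tarfile.TarInfo(name="salt.bin")
        info.size = len(salt)
        tar.addfile(info, io.BytesIO(salt))
    data = buf.getvalue()
    digest = hashlib.sha256(data).hexdigest()
    out_path = os.path.join(out_dir, digest + ".tar")
    with open(out_path, "wb") as f:
        f.write(data)
    return out_path
```

Packing the same payload twice produces two archives whose names (hashes) differ, which is exactly the property that prevents a censor from blocking a payload by blacklisting one hash.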
There's no need to use the Silktorrent script for the rumprun-packages,
because a simpler tar-file creation and renaming implementation
will probably work just fine, but I use my script for this demo.
The Core of the Demo
At some "central" location (quotes, because it does not need to be, and should not be, central)
there is a list of URLs to text files that list relative file paths.
An example (archival copy) of one such URL:
By combining that URL with the relative file paths
in the "list_of_relative_file_paths.txt",
the URLs of the tar files can be derived.
Thank You for reading my comment.