On Sun, 7 Jul 2019 at 00:43, steve-ALUG@hst.me.uk wrote:
I don't think it helps the current situation, but I think most people doing something like this would host a git repository to store their scripts & track changes to them. The same idea can be used to cope with changes to configurations for files in /etc/. I'd think it would be easier to keep a manual list of top level packages added/removed using apt-get, than to try and work it out backwards from a file list.
I've never found a good way to treat an entire OS as a repository (or just the customisations, given that they can be anywhere in the system). I do maintain a list of changed files which I use to create tar based backups (different files from the subject of this thread), as well as an init script listing all the packages I need installing, but inevitably I sometimes forget to add something to my list.
(It would be nice to think that a development workflow is "I need this, so add it to my list and install it". But reality is more like "I wonder if this will work, I'll install it and try" or "Let's install several options and see which works out best". With the best will in the world, being at least able to validate the list afterwards is useful.)
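Validating the list afterwards can be sketched with `comm(1)`. The two input lists below are fabricated with `printf` so the example is self-contained; on a live Debian-ish system the "actual" side would come from `apt-mark showmanual` (the command in the comment is the real-world substitution, not run here):

```shell
# Fabricated stand-ins: on a real system the first list would be
#   apt-mark showmanual | sort > actual.txt
printf '%s\n' scrot netpbm rsync | sort > actual.txt   # what's actually marked manual
printf '%s\n' rsync scrot | sort > wanted.txt          # the hand-maintained list
# comm -13: suppress lines unique to wanted.txt and lines common to both,
# leaving packages that were installed but never added to the list.
comm -13 wanted.txt actual.txt
```

Here that prints `netpbm`, the package that slipped past the manual list.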
[...] "apt doesn't remember which reverse dependency caused it to install rsync, but it does log all its actions in /var/log/apt, so you might find the dependency there: [...] That may help a bit, but it depends on what period the log-rotated apt log files you have.
/var/log/apt/history.log is useful but isn't exhaustive, for the reason you allude to. I have certainly made use of it in the past though.
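Searching across the rotated logs (plain and gzipped alike) can be sketched as below. The log file here is fabricated so the pipeline is reproducible; on a real system the files live in /var/log/apt/, and the "automatic" marker against a package in an Install: line shows it was pulled in as a dependency:

```shell
# Fabricate one rotated, gzipped history log (real path: /var/log/apt/).
mkdir -p aptlogs
printf 'Start-Date: 2019-06-01\nCommandline: apt-get install scrot\nInstall: scrot:armhf (0.8), netpbm:armhf (2:10.0, automatic)\n' \
  | gzip > aptlogs/history.log.1.gz
# zcat -f decompresses .gz files and passes plain files through unchanged,
# so the same loop covers history.log and every history.log.N.gz.
for f in aptlogs/history.log*; do zcat -f "$f"; done | grep netpbm
```

The matching line shows netpbm flagged "automatic", i.e. installed as a dependency of scrot rather than directly.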
A bigger problem for me here though is that /var/log is excluded from my tarball backups...
Regarding dependencies: it would be nice to know why I installed a package, but for now just merging 400+ file differences into one package install is a big step forward. The example I gave some way back was netpbm - I never installed it directly; it turns out it was pulled in because I installed a screenshot package, but given that any report will also list the screenshot package I can likely work that out later.
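On a live system `aptitude why netpbm` answers this directly; offline, the same information can be approximated by scanning the dpkg status database for installed packages that depend on it. A minimal sketch, using a fabricated two-package status file (the real one is /var/lib/dpkg/status):

```shell
# Fabricated miniature of /var/lib/dpkg/status.
cat > status <<'EOF'
Package: scrot
Status: install ok installed
Depends: libc6, netpbm

Package: rsync
Status: install ok installed
Depends: libc6
EOF
# Remember the current Package: stanza; print it when its Depends: line
# mentions the target (simple substring match - fine for a sketch).
awk -v pkg=netpbm '/^Package:/{p=$2} /^Depends:/ && index($0, pkg){print p}' status
```

This prints `scrot`: the screenshot package is the reverse dependency that dragged netpbm in.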
It seems to me it would be easier to work this out using apt/aptitude etc. on a live system (see the search I mentioned earlier - lots and lots of lovely links, methods etc.), and also easier to keep a manual list of top-level changes.
Easier, sure, but where's the fun in that!
But actually, "easier" here means restoring a multi-GB backup to an SD card, booting a Pi from it, running some commands to get the package list, copying the result back to my desktop, and then repeating the entire process with the second backup, before I can diff anything. So easier, but not quicker :-)
PS: Perhaps inevitably most of my reply is "yes, but that won't work because..." which obscures the genuinely useful information you gave me, a lot of it new to me. Thank you very much for that, it helps more than the above reply might suggest.
Not sure if this approach would work, but it's an idea:
1) get or create a clean Raspberry Pi virtual machine image
2) mount it in VirtualBox or similar
3) patch it with your tarball(s)
4) reboot (?)
5) use the "live" package-management stuff to query it
If it works (I guess it would at least rely on the tarballs containing the apt repo stuff, but there are probably loads of other reasons it might not work) then it might not be fast, but at least it would be scriptable!
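One scriptable shortcut that avoids booting anything at all: if the backup tarballs happen to include var/lib/dpkg, the package database can be extracted and queried directly on the desktop. A sketch with a fabricated backup (all paths and package names here are made up; on a real extraction, `dpkg-query --admindir=extracted/var/lib/dpkg -W` would give the full package list):

```shell
# Fabricate a "backup" containing a one-package dpkg status database.
mkdir -p backup/var/lib/dpkg
printf 'Package: netpbm\nStatus: install ok installed\nVersion: 2:10.0\n\n' \
  > backup/var/lib/dpkg/status
tar czf backup.tar.gz -C backup var            # stand-in for the real backup tarball

# Extract just the dpkg database and query it offline.
mkdir -p extracted
tar xzf backup.tar.gz -C extracted var/lib/dpkg/status
# On a real system: dpkg-query --admindir=extracted/var/lib/dpkg -W
grep '^Package:' extracted/var/lib/dpkg/status
```

Everything happens on the desktop, so comparing two backups is just running this twice and diffing the output.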
Cheers, Simon
On 08/07/2019 09:09, Mark Rogers wrote:
<snip>
On Tue, 9 Jul 2019 at 11:36, Simon Ransome simon@nosher.net wrote:
Not sure if this approach would work, but it's an idea:
- get or create a clean Raspberry Pi virtual machine image
- mount it in VirtualBox or similar
Will VirtualBox work with a non-x86 image? (As far as I know it only virtualises x86/amd64.) I assume QEMU would work, though slowly.
- patch it with your tarball(s)
- reboot (?)
- use the "live" package-management stuff to query it
Interesting approach, thanks.
What it did suggest to me, though, is to start with a clean install, install as many packages as I think I might ever need (I can always return to it sometime if needed), and from that get a list of all files and the packages they came from, which I can store for use offline.
But this information is presumably available without going through that process. That led me to http://archive.raspbian.org/raspbian/dists/buster/ and its Contents-armhf.gz, which is a list of all package files and the packages they come from...
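The offline lookup against that index can be sketched as follows. The index here is fabricated with two entries so the example is reproducible (the real file comes from the archive.raspbian.org URL above); note that paths in Contents files have no leading slash, and the second column is section/package:

```shell
# Fabricated two-entry stand-in for the real Contents-armhf.gz.
printf 'usr/bin/pnmtopng                graphics/netpbm\nusr/bin/rsync                   net/rsync\n' \
  | gzip > Contents-armhf.gz
# Exact-match the path (no leading slash) and print the owning package.
zcat Contents-armhf.gz | awk '$1 == "usr/bin/pnmtopng" {print $2}'
```

This prints `graphics/netpbm`, so a file list from a backup can be mapped to packages entirely offline, one `awk` lookup per file.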
Thanks!
Mark