Not sure if this approach would work, but it's an idea:
1) get or create a clean Raspberry Pi virtual machine image
2) mount it in VirtualBox or similar
3) patch it with your tarball(s)
4) reboot (?)
5) use the "live" package-management stuff to query it
If it works (it would presumably rely on the tarballs containing apt's package database, and there are probably plenty of other reasons it might not) then it wouldn't be fast, but at least it would be scriptable!
Cheers, Simon
On 08/07/2019 09:09, Mark Rogers wrote:
On Sun, 7 Jul 2019 at 00:43, steve-ALUG@hst.me.uk wrote:
I don't think it helps the current situation, but I think most people doing something like this would host a git repository to store their scripts and track changes to them. The same idea can be used to cope with changes to configuration files in /etc/. I'd think it would be easier to keep a manual list of the top-level packages added/removed with apt-get than to try to work it out backwards from a file list.
I've never found a good way to treat an entire OS as a repository (or even just the customisations, given that they can be anywhere in the system). I do maintain a list of changed files which I use to create tar-based backups (different files from the subject of this thread), as well as an init script listing all the packages I need installed, but inevitably I sometimes forget to add something to my list.
(It would be nice to think that a development workflow is "I need this, so add it to my list and install it". But reality is more like "I wonder if this will work, I'll install it and try" or "let's install several options and see which works out best". With the best will in the world, at least being able to validate the list afterwards is useful.)
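Validating the list afterwards is basically a set difference between the maintained list and what the system actually reports (apt-mark showmanual is the usual way to get the latter). A minimal sketch, with hypothetical package names:

```python
# Sketch: diff a hand-maintained package list against what the system
# reports, e.g. the output of `apt-mark showmanual`.
def compare_package_lists(maintained, actual):
    """Return packages installed but never recorded, and packages
    recorded but not actually installed."""
    maintained, actual = set(maintained), set(actual)
    return {
        "missing_from_list": sorted(actual - maintained),  # forgot to record
        "not_installed": sorted(maintained - actual),      # recorded but absent
    }

# Hypothetical example:
result = compare_package_lists(
    maintained=["git", "vim"],
    actual=["vim", "curl"],   # e.g. apt-mark showmanual output
)
print(result)  # -> {'missing_from_list': ['curl'], 'not_installed': ['git']}
```

Run periodically, the "missing_from_list" side catches exactly the "installed it to try and forgot to write it down" case described above.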
<snip>