I could use Scrapbook to save web pages but is there something I can use to save multiple pages or entire sites? It's just for personal use. Any ideas?
Bev.
On 13/02/13 17:35, Bev Nicolson wrote:
I could use Scrapbook to save web pages but is there something I can use to save multiple pages or entire sites? It's just for personal use. Any ideas?
I once asked a similar question, and 'Putty' was the answer.
There may be a Linux version by now - or run it in Virtual Box.
On 13/02/13 19:38, Anthony Anson wrote:
On 13/02/13 17:35, Bev Nicolson wrote:
I could use Scrapbook to save web pages but is there something I can use to save multiple pages or entire sites? It's just for personal use. Any ideas?
I once asked a similar question, and 'Putty' was the answer.
There may be a Linux version by now - or run it in Virtual Box.
Eh? Putty? Really?? AFAIK, PuTTY is a program (mainly of use on Windows) that provides SSH terminals to connect securely to other computers. I think there is a Linux version, but it's not much use as you can just type ssh to get an SSH session.
To get an entire website, my thought would be wget, e.g.
http://www.linuxjournal.com/content/downloading-entire-web-site-wget
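Something like this (off the top of my head and untested here; the URL is just a placeholder, and the flags are standard wget options) should pull a site down for offline browsing:

    # mirror the site, rewrite links for local viewing, and grab images/CSS too
    wget --mirror --convert-links --adjust-extension --page-requisites \
         --no-parent --wait=1 https://example.com/

--no-parent stops it wandering above the starting directory, and --wait=1 is just to be polite to the server.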
If you download someone's website, you're almost certainly falling foul of copyright rules at the very least, and possibly the Computer Misuse Act (or whatever it's called). If you try to access part of a website that the owner didn't intend for you to access (e.g. a hidden part they haven't given you permission to see), then you're probably breaking the law.
However, IANAL!
For less than a whole site, I've used the Opera web browser to save pages (File/Save (or Save As?)). Firefox has File/Save Page As.
That might help.
Good luck! Steve
On 13/02/13 23:05, steve-ALUG@hst.me.uk wrote:
On 13/02/13 19:38, Anthony Anson wrote:
On 13/02/13 17:35, Bev Nicolson wrote:
I could use Scrapbook to save web pages but is there something I can use to save multiple pages or entire sites? It's just for personal use. Any ideas?
I once asked a similar question, and 'Putty' was the answer.
There may be a Linux version by now - or run it in Virtual Box.
Eh? Putty? Really??
IIRC my informant was pretty clued-up: if it was who ISTR it was, he's a Debian Developer and otherwise a pretty techie individual. If I am wrong, it would have been a Sheddi, and any misinformation (Miss Information?) there is pretty quickly pounced on.
In any case, rather than dissect sites to see how they were constructed, I bought a book (HTML In Easy Steps) and did it from the bottom up (oo-er!), so I never tried Putty and cannot comment on how effective it is.
I'd recommend HTML In Easy Steps to almost anyone. While it doesn't talk down to you, it doesn't assume you know a lot, which suits me down to the ground.
<aside> Having a lovely time at the Carvery, being waited on hand and foot. They've let me get up and sit in a chair today - T/b semi-frozen - back in a bit.
AFAIK, PuTTY is a program (mainly of use on Windows) that provides SSH terminals to connect securely to other computers. I think there is a Linux version, but it's not much use as you can just type ssh to get an SSH session.
On 14/02/13 09:42, Anthony Anson wrote:
T/b semi-frozen - back in a bit.
It wouldn't let me maximise a minimised pane lurking on the bottom bar, or minimise T/bird - nuffin' - though it was good enough to let me post the item. Once posted, the low-lurking minimised pane became accessible.
There's absolutely no chance of my making the Norwich meeting tonight - sorry - N&NUH did a maximum job rather than a patching like last time, and I'm still attached to a vacuum pack (for surface application of graft and dressing, and as a drain...) and a drip at night. (No, they haven't supplied me with their most inept nurse to keep me quiet. Now there's - - - - - <no carrier> )
See you later.
Oh, before I disappear, where on this box (MSi U-100) am I likely to find instructions for killing the mousepad/left&right mouseclick typing-trap? Get anywhere near the thing while typing and the cursor can end up anywhere.
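(For what it's worth, a sketch of what usually tames this - assuming the box runs X with the synaptics driver and its command-line tools, which I haven't tried on a U-100 itself:

    # disable the touchpad while keys are being pressed, for 1 second of idle time
    syndaemon -i 1 -d -t -k
    # or just turn one-finger tap-to-click off altogether
    synclient TapButton1=0

syndaemon's -t option only blocks tapping and scrolling, so deliberate pointer movement still works.)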
On Wed, 13 Feb 2013 23:05:23 +0000 steve-ALUG@hst.me.uk allegedly wrote:
To get an entire website, my thought would be wget, e.g.
http://www.linuxjournal.com/content/downloading-entire-web-site-wget
If you download someone's website, you're almost certainly falling foul of copyright rules at the very least, and possibly the Computer Misuse Act (or whatever it's called). If you try to access part of a website that the owner didn't intend for you to access (e.g. a hidden part they haven't given you permission to see), then you're probably breaking the law.
wget respects the robots.txt exclusion standard. So you should be OK scraping the site for off-line viewing. After all, most websites get indexed by search engines and the web owner must expect the site to be viewed. (But IANAL either!)
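If in doubt, you can always look at the site's robots.txt yourself before scraping - wget fetches and honours it automatically during a recursive grab (the URL below is a placeholder):

    # see what the site owner allows crawlers to fetch
    wget -q -O - https://example.com/robots.txt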
Mick
---------------------------------------------------------------------
blog: baldric.net
gpg fingerprint: FC23 3338 F664 5E66 876B 72C0 0A1F E60B 5BAD D312
---------------------------------------------------------------------
On 14 February 2013 15:45, Brett Parker <iDunno@sommitrealweird.co.uk> wrote:
On 13 Feb 17:35, Bev Nicolson wrote:
I could use Scrapbook to save web pages but is there something I can use to save multiple pages or entire sites? It's just for personal use. Any ideas?
wget in spider mode...
or httrack, which is easier to use.
Cheers,
Brett Parker
Thanks all. Perhaps I should have been more specific as I was thinking that if a program could save an entire site, then it could deal with a discussion thread (which is actually what I'm after.)
Wrestling with the 'Save Page As' feature suggested by Steve. (It seems to save one page but not the pages it links to.)
Bev.
On 14 Feb 16:48, Bev Nicolson wrote:
On 14 February 2013 15:45, Brett Parker <iDunno@sommitrealweird.co.uk> wrote:
On 13 Feb 17:35, Bev Nicolson wrote:
I could use Scrapbook to save web pages but is there something I can use to save multiple pages or entire sites? It's just for personal use. Any ideas?
wget in spider mode...
or httrack, which is easier to use.
Cheers,
Brett Parker
Thanks all. Perhaps I should have been more specific as I was thinking that if a program could save an entire site, then it could deal with a discussion thread (which is actually what I'm after.)
httrack can do that...
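A minimal sketch (the thread URL, output directory, and filter are placeholders):

    # mirror just the one thread into ./thread-copy, staying inside the forum
    httrack "http://example.com/forum/thread-42" -O ./thread-copy "+example.com/forum/*" -r2

The +filter keeps it inside the forum, and -r2 limits the link depth so it doesn't wander off over the whole site.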
Wrestling with the 'Save Page As' feature suggested by Steve. (It seems to save one page but not the pages it links to.)
Bev.