Not specifically linux - hope no-one is bothered by that. You people are my front-line support for any computer advice.
I've just bought a T-series Thinkpad. It has Lithium-Ion batteries. AFAIR these do not have the 'memory' effect so does that really mean in practice that it is perfectly OK to give short booster charges no matter what the state of charge/discharge?
Also, I think I've seen somewhere that keeping the laptop on mains power with the battery installed will lead to reduced battery life. Is this the case?
TIA Syd
On Fri, Aug 01, 2003 at 08:29:38AM +0100, syd wrote:
Not specifically linux - hope no-one is bothered by that. You people are my front-line support for any computer advice.
I've just bought a T-series Thinkpad. It has Lithium-Ion batteries. AFAIR these do not have the 'memory' effect so does that really mean in practice that it is perfectly OK to give short booster charges no matter what the state of charge/discharge?
You can give booster charges, though it would probably be best to avoid them if you can.
Also, I think I've seen somewhere that keeping the laptop on mains power with the battery installed will lead to reduced battery life. Is this the case?
Yes, at least in my experience. I had to buy an expensive replacement battery for my Vaio earlier this year because I left the battery plugged in all the time, and the old crap laptop I own shows the same problem. Of course, if you leave the battery disconnected from the laptop while on mains power, then when the inevitable power failure happens you don't have a built-in UPS...
Adam
Hi,
After a discussion in a private IRC chat that turned a little sour I felt compelled to write this. I'm posting it to the ALUG list because there's a point I'd like to get across about how I think Linux should be accessible to everyone in a way that it currently isn't - although it's something that is getting dramatically better.
Once again, I managed to cause some upset, offend people and generally unintentionally cause trouble because of my inability to get across the point I'm trying to make. When I was trying to get things to work in KDE, and then Fluxbox I reacted badly to the answers I got (and the answers I didn't get) because I was really looking at the problem from an almost hypothetical point of view.
Having tried to introduce Linux to my school, I have come under pressure to justify the change in practical advantages for users, and one of the main places my arguments fall down is how easy the interface is to use. This is why, when I ask questions to anyone about Linux and get a reply that includes manually editing a config file or writing some code by hand, my negative response often surprises people and has in the past been taken as a personal insult. (Perhaps Linux is suffering from Mac-syndrome whereby its users so religiously defend it that they take any criticism personally - or perhaps I'm just offensive ;) )
The fact is, I see myself as the kind of person who would be perfectly willing to spend the time fiddling with code, command-line commands and config files to get something to work, hacker style, and indeed I do - but not everyone wants to (or indeed should have to) and my frustrations stem from trying to simultaneously look at the problem from a completely non-technical point of view. Although I have a background in ergonomics and graphic design in my academic studies as well as Computing, Science and Maths, I don't see myself as the kind of person who would be able to *solve* the user-friendly problems involved.
After having had this "discussion" in IRC I happened to finish reading "The Cathedral and The Bazaar" by Eric S. Raymond from the ALUG library, and was amazed how relevant the last section was to my current train of thought.
This extract, the very last nine paragraphs of the main text of the book, explains very competently the points I almost completely failed to get across in IRC:
------------------
"...there is one such question that is worth pondering: Will the Linux Community actually deliver a good end-user-friendly GUI interface for the whole system?
In the 1999 first edition of this book, I said the most likely scenario for late 2000/early 2001 has Linux in effective control of servers, data centres, ISPs, and the Internet, while Microsoft maintains its grip on the desktop. By November 2000 this prediction had proved out pretty completely except in large corporate data centres, and there it looks very likely to be fulfilled within months.
Where things go from there depends on whether GNOME, KDE or some other Linux-based GUI (and the applications built or rebuilt to use it) ever get good enough to challenge Microsoft on its home ground.
If this were primarily a technical problem, the outcome would hardly be in doubt. But it isn't; it's a problem in ergonomic design and interface psychology, and hackers have been notoriously poor at these things. That is, while hackers can be very good at designing interfaces for other hackers, they tend to be poor at modeling the thought processes of the other 95% of the population well enough to write interfaces that J. Random End-User and his Aunt Tillie will pay to buy.
Applications were 1999's problem; it's now clear we'll swing enough ISVs to get the ones we don't write ourselves. I believe the problem for 2001 and later is whether we can grow enough to meet (and exceed!) the interface design quality standard set by the Macintosh, combining that with the virtues of the traditional Unix way.
As of mid-2000, help may be on the way from the inventors of Macintosh! Andy Hertzfeld and other members of the original Macintosh design team have formed an open-source company called Eazel with the explicit goal of bringing the Macintosh magic to Linux.
We half-joke about 'world domination', but the only way we will get there is by *serving* the world. That means J. Random End-User and his Aunt Tillie; and *that* means learning how to think about what we do in a fundamentally new way, and ruthlessly reducing the user-visible complexity of the default environment to an absolute minimum.
Computers are tools for human beings. Ultimately, therefore, the challenges of designing hardware and software must come back to designing for human beings, *all* human beings.
This path will be long, and it won't be easy. But I think the hacker community, in alliance with its new friends in the corporate world, will prove up to the task. And, as Obi-Wan Kenobi might say, "the Source will be with us"."
------------------
I hope that in the future I will be able to contribute to bringing Linux to everyone - so that the freedom and quality of an Open Source operating system will not be held back by the hurdles of the interface between human and computer.
I'm sure Eric won't mind me quoting this large chunk from one of his essays; the book, btw, is well worth the read for anyone who hasn't already done so!
To those who I upset: Hope this explains things a bit better than I did!
-- Ben "tola" Francis
On Fri, 1 Aug 2003, Ben Francis wrote:
This is why, when I ask questions to anyone about Linux and get a reply that includes manually editing a config file or writing some code by hand, my negative response often surprises people and has in the past been taken as a personal insult. (Perhaps Linux is suffering from Mac-syndrome whereby its users so religiously defend it that they take any criticism personally - or perhaps I'm just offensive ;) )
The fact is, I see myself as the kind of person who would be perfectly willing to spend the time fiddling with code, command-line commands and config files to get something to work, hacker style, and indeed I do - but not everyone wants to (or indeed should have to)
I think the manually-editing-config-file problem arises from the fact that a lot of people (on this list anyway[1]) use distros that don't have the GUI front ends for doing a lot of system stuff. Personally, I started off with Linux on SuSE 8.0 but moved on to linuxfromscratch after about 5 months. I've never touched Mandrake (apart from a quick install to see what it was like), so if someone asked me how to do something on a Mandrake system I would have to resort to 'edit this file' since I don't know all the whizzy tools Mandrake has to do stuff. SuSE is a different matter since I've used YaST2, but I would guess that someone who has used SuSE more recently than me would be able to suggest a solution for that which didn't involve manually editing files.
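To make the contrast concrete, here's a sketch of the distro-neutral "edit this file" style of answer, using the common task of adding a DNS server as an example. The nameserver address is made up, and the sketch works against a scratch copy of resolv.conf so it can be tried without root:

```shell
# The "edit this file" answer: append a nameserver line to resolv.conf.
# Working on a scratch copy here so this is safe to try as any user.
cp /etc/resolv.conf ./resolv.conf.demo 2>/dev/null || touch ./resolv.conf.demo

echo "nameserver 192.168.1.1" >> ./resolv.conf.demo

# Check the line went in.
grep nameserver ./resolv.conf.demo
```

On a real system the same one-liner against /etc/resolv.conf (as root) works on any distro, which is exactly why it's the answer geeks reach for, whizzy tools or not.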
Maybe linux geeks should install mandrake once in a while to see how things are done there?
BenE
[1] I would say this is true for 'experienced' linux users everywhere but I don't have any evidence. (and I put 'experienced' in inverted commas because I'm not suggesting that only debian/slackware/etc. users are experienced (was that a bit too PC?))
---------------------------------
Love is all I bring, in me crass t-shirt 'n' ting
BenE wrote:
On Fri, 1 Aug 2003, Ben Francis wrote:
[SNIP]
I think the manually-editing-config-file problem arises from the fact that a lot of people (on this list anyway[1]) use distros that don't have the GUI front ends for doing a lot of system stuff. Personally, I started off with Linux on SuSE 8.0 but moved on to linuxfromscratch after about 5 months. I've
[SNIP]
I started seriously using Linux with SuSE 5.3, and progressed through various versions up to 7.3, the last one I installed. As my knowledge increased, my frustration with GUIs and their attendant restrictions grew. The final straw was RPMs and dependencies, and messing about with RPM libraries etc., etc. It simply became too difficult to manage the systems practically, especially with self-compiled software (however many programs SuSE put on their CDs/DVDs there will always be something else one finds and needs, and that'll clash with the stuff the distro needs).
I fell into the arms of Gentoo about a year ago, and have never looked back.
In conclusion, there is and will be a place/case for packaged GUI-based distros on the desktop, (although if I were rolling a lot out I'd build one in G2 and clone it), but the lack of a GUI, and text based config files is a major, major advantage Linux has over GUI-based server systems (accepting that on Linux the GUIs edit text files).
Cheers, Laurie.
Laurie Brown wrote:
I started seriously using Linux with SuSE 5.3, and progressed through various versions up to 7.3, the last one I installed. As my knowledge increased, my frustration with GUIs and their attendant restrictions grew. The final straw was RPMs and dependencies, and messing about with RPM libraries etc., etc. It simply became too difficult to manage the systems practically, especially with self-compiled software (however many programs SuSE put on their CDs/DVDs there will always be something else one finds and needs, and that'll clash with the stuff the distro needs).
I fell into the arms of Gentoo about a year ago, and have never looked back. <<<
I've also been a SuSE user for some years. I too would like very much to try Gentoo, as more than one user has said such good things about it. So when I upgraded the hard drive in my (Windows) Dell laptop I thought it might be a good place to start. I gave it a 10GB W2k partition (yeah, rank cowardice) and left the remaining 50GB for Linux. Stuck the Gentoo CD in the drive and powered up.
The first thing it asked was for me to load some network drivers. How? I have two devices, both CardBus: an Edimax 10/100 and an Actiontec 802.11b. Neither of these is listed, and there's no indication on either of them what chipsets they use (Intel, 3Com or what?), so without the correct driver Gentoo told me they weren't supported. That's my progress; fallen at the first hurdle. Much the same thing happened when I tried Knoppix, I seem to recall; the very first question required knowledge I simply didn't have and had no idea where to find or even what question to ask.
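For what it's worth, there is a way to answer the "what chipset is it?" question from an already-running Linux system (a rescue CD or an installer shell, say), assuming the pciutils package is present - though the catch-22, of course, is that you need a working Linux to hand in the first place:

```shell
# List PCI/CardBus devices and pick out the network controllers.
# The chip name shown (e.g. "Realtek RTL-8139") is what the kernel
# driver menus refer to, not the brand printed on the card.
if command -v lspci >/dev/null 2>&1; then
    lspci | grep -i -E 'ethernet|network' || true
else
    echo "lspci not installed"
fi
```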
Using a GUI-based distro everything comes up sweetness and light with no baffling questions to answer. My impression is there's a huge gulf between the two types of distro. People like me have neither the time nor the patience to trawl endless newsgroups or cryptic manuals trying to figure out what the software is expecting; they just give up and go back to Red Hat / Mandrake / SuSE / Windows. We have work to do and the operating system is pretty much of an irrelevance as long as it runs the applications we need. It'd be nice to learn more about Linux but not if it means stopping work for a month just to even get the distro installed.
-- GT
Hi,
Thanks for all the replies, and after having read back over my original post I feel I must apologise for its bloated length :)
Maybe linux geeks should install mandrake once in a while to see how things are done there?
BenE
This would be great! But at the end of the day the majority of the "geeks" probably don't have the time to do this. Forgive me if I'm off the mark here, but I feel the secret to the success of the open source model is that everything that happens is mutually beneficial for all involved. A point that is stressed in The Cathedral and the Bazaar is that none of the rules or social norms in the hacker culture have really been written down and followed as such; the beauty of the whole system is that it just *works*, often without conscious implementation.
However, this does pose some problems. The decentralised peer review process works fantastically to a point, but when a product needs something that is outside the requirements of the peers themselves, it becomes unlikely to happen. Because the developers are perfectly capable of using a command line interface, they're unlikely to count it as a fault in an application that it has no intuitive GUI that could be picked up and used by anyone.
Now obviously this generalisation does not apply without exceptions, far from it - or we simply wouldn't have X Windows or desktop environments at all! But to really produce something ergonomic and intuitive we need continual evaluation by outsiders, non-geeks if you like. The question is, how do you get people with little deep interest in computers to contribute to a programming project without motivations like, say, money? And in doing so, would the development model itself suffer or be held back as a result?
(this email is going to get horribly long again, I can feel it)
The big challenge for open source software writers going forward is to go out and actively seek input and advice from graphics designers and similar people, as well as the many people for whom the computer is just a piece of office equipment, entertainment centre, etc. People who, basically, couldn't give a toss about the software as such but just want a tool to do the job, and will ask the awkward questions like "why can't I just switch it on and use it?".
<snip>
Keith
Again, what is the motivation for these people?
In conclusion, there is and will be a place/case for packaged GUI-based distros on the desktop, (although if I were rolling a lot out I'd build one in G2 and clone it), but the lack of a GUI, and text based config files is a major, major advantage Linux has over GUI-based server systems (accepting that on Linux the GUIs edit text files).
Cheers, Laurie.
You use the word "server" here. If you mean server as in web server, file server, print server... then these are all things that will be used by the geeks themselves - people who will perhaps find their productivity slowed rather than quickened by a GUI interface. If that's not what you mean at all, sorry ;)
I'd also like to stress that what I'm suggesting is definitely not that the flexibility of a Linux system be compromised by a GUI; in fact everything about the traditional ways of using Unix and all the command line interfaces and source code et al should remain - or it simply wouldn't be Linux. However, it shouldn't be there by *default* when trying to aim the product at the other 95% of the population who don't mind having it all hidden away behind point and click interfaces. What I'm saying is that there should be an easy, intuitive, graphical *option* for those who want/need it.
Here's what I propose. If you're on this list, and you see a text-based answer to a user's problem, and think "that's new, I would have done it with such-and-such a set of dialog boxes/menus instead" (or vice versa,) don't hold back, post your version as an alternative.
What do people think? Might this work? Or am I talking complete rubbish?
Great idea Dan, there's usually much more than one way of solving a problem in Linux and where there are alternatives they should be listed. Of course the risk is that by offering too many options you begin to make the original problem look more complicated than it perhaps is! What do people think on this?
Using a GUI-based distro everything comes up sweetness and light with no baffling questions to answer. My impression is there's a huge gulf between the two types of distro. People like me have neither the time nor the patience to trawl endless newsgroups or cryptic manuals trying to figure out what the software is expecting; they just give up and go back to Red Hat / Mandrake / SuSE / Windows. We have work to do and the operating system is pretty much of an irrelevance as long as it runs the applications we need. It'd be nice to learn more about Linux but not if it means stopping work for a month just to even get the distro installed.
-- GT
I think that this is a really fundamental point. It may be that Linux can solve a problem a million times better than the alternatives, but people like *easy* solutions.
To me, if someone has to ask a lot of questions to simply get something to work in the first place, either a) the documentation is bad or too long, or b) the program itself has not been built with the user enough in mind. Perhaps there's also c) the problem is itself by nature a very complicated one that requires a complicated answer. When this is the case, no matter how many pretty buttons you put on top of something, its use is still going to be complicated because there are so many variations on the answer to the problem. However, this does *not* mean that we can't do our darned hardest to make it as stress-free and systematic as possible - the question is, who are the best people to be doing this? If it's not the people who are traditionally involved in developing software, how do we go about getting an outsider to contribute?
Sorry again for the length of my post.
One other thing - I'm very new to the whole Open Source idea myself, and if I'm missing something here then please don't hesitate to contradict me. My opinions are only based on my own experiences and the experiences of a few people around me - but they *have* been experienced so I see no reason why they shouldn't be discussed.
-- Ben "tola" Francis
Thanks for all the replies, and after having read back over my original post I feel I must apologise for its bloated length :) <<<
I for one don't have a problem; I think you're making a lot of very good points. So carry on.
However, this does pose some problems. The decentralised peer review process works fantastically to a point, but when a product needs something that is outside the requirements of the peers themselves, it becomes unlikely to happen. Because the developers are perfectly capable of using a command line interface, they're unlikely to count it as a fault in an application that it has no intuitive GUI that could be picked up and used by anyone.
I suspect that, given the huge number of people out there, all with differing needs, pretty much everything that's needed gets written. The problem isn't with the code; it's usually with the documentation. Geeks (on the whole) aren't the most literary-minded people and they don't (again on the whole) much like writing manuals for non-geeks. Many years ago I discovered that Unix documentation was very useful as a reference but not much help as a user guide. Once I'd been through the pain of discovery, most of the documentation could be summarised as
"So that's what it meant!"
And not a lot has changed. This is as true for Windows as for Linux; does the Windows on-line help really tell you from scratch how to install a network card? Don't think so.
If Linux is to really beat Windows to the totally-non-geek market we need a lot more literate types to get down in the software and describe to the rest of us how it all works. The Linux Documentation Project is good - very good in places - but I feel there's still a long way to go.
-- GT
Graham Trott wrote:
And not a lot has changed. This is as true for Windows as for Linux; does the Windows on-line help really tell you from scratch how to install a network card? Don't think so.
Whilst the GUI interface is heralded as intuitive, the typical GUI application is more of a toolbox where the selection of tools is chosen to be useful for the task for which the application was designed, which, unfortunately, doesn't help a beginner who has no idea what tools to expect or how to use each of them. This is clearly the reason many Windows applications now include a "Wizard" to guide the user through a task.
Documentation is much the same. Typically the on-line help that comes with a GUI application is a reference guide to the set of available tools, describing each menu, the options on it and the dialog boxes that will pop up. This is exactly equivalent to the Unix man pages which describe each available command and its options. Neither of these help the novice as they don't explain how to do a task.
The Linux Documentation Project is a huge step forward because the HOW- TO documents describe how to do a task. Even if the HOW-TO is not perfect it is a tremendous help, not least because it tells you which tools you are likely to find that would be applicable to the task in question so if you want to do something slightly different from what is described you know what other documentation you need to consult.
Documentation is only part of the picture though because, however good documentation is, people are reluctant to consult it. Perhaps the biggest difference is for an application designer to be able to think like the kind of user who just regards the computer as an appliance, so if the user thinks "Copy CD" don't make the user worry about "Extract CD to file on hard disk, Write CD from file on hard disk".
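Steve's "Copy CD" example is a good one. Here's a sketch of the two steps the hacker's mental model expands it into - the device names and the cdrecord dev address are illustrative only and vary from machine to machine:

```shell
# "Copy CD", as the expert thinks of it: two steps via an image file.
copy_cd() {
    dd if="${1:-/dev/cdrom}" of=/tmp/disc.iso   # 1. extract the disc to an image
    cdrecord dev="${2:-0,0,0}" /tmp/disc.iso    # 2. burn the image to a blank disc
}
# An appliance-style "Copy CD" button would hide both steps (and the
# intermediate image file) behind a single click.
```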
Much has been made of clever hardware detection and installer technology and yes, this is an important area, but it is worth bearing in mind that even setting up Windows on some hardware is too complicated for many people, requiring the user to know who made what and to download extra drivers (or fixed ones). For many users the whole thing is taken care of because Windows comes pre-installed. I am very pleased to see Dell offering Linux pre-installed on some of their servers, and I guess there are others too - it would be nice for pre-installed desktop systems to be available as well.
Finally, we must bear in mind that however much effort we go to, someone will still find things too hard. I am thinking at this point of the father of a guy at work who complained to his son that whenever he turned off his PC it forgot the text in his word processor. It turns out he hadn't discovered the "Save" button and, because he just turned off the PC rather than shutting it down, nothing prompted him to save!
From: Steve Fosdick on Sunday, August 03, 2003 11:46 AM
Finally, we must bear in mind that however much effort we go to, someone will still find things too hard.
Someone once said "You can never make anything foolproof because fools are far too ingenious!" (I think it might have been Edison?)
Keith ____________ Life is painful, suffering is optional. Sylvia Boorstein
Ben Francis wrote:
[SNIP]
You use the word "server" here. If you mean server as in web server, file server, print server... then these are all things that will be used by the geeks themselves - people who will perhaps find their productivity slowed rather than quickened by a GUI interface. If that's not what you mean at all, sorry ;)
[SNIP]
That's exactly what I meant. A server needs a GUI like a fish needs a bicycle: it's just an overhead that wastes resources, and many server-based tasks (well, the set-up for a combination of tools) are too complex for a GUI to help without it being horribly complex. In fact, I wonder if it's possible at all in some circumstances.
Desktops are a different matter entirely, and I think a GUI positively helps there.
Cheers, Laurie.
On Saturday 02 Aug 2003 11:03 pm, Ben Francis wrote:
However, this does pose some problems. The decentralised peer review process works fantastically to a point, but when a product needs something that is outside the requirements of the peers themselves, it becomes unlikely to happen. Because the developers are perfectly capable of using a command line interface, they're unlikely to count it as a fault in an application that it has no intuitive GUI that could be picked up and used by anyone.
This is the role of the different distros. There are distros for command line lovers and ones for GUI lovers. Distros like Mandrake and Lindows, for example, aim to do precisely what you ask. The best thing about Linux is that there is always a choice.
Ian
On Sun, 3 Aug 2003, Ben Francis wrote:
Here's what I propose. If you're on this list, and you see a text-based answer to a user's problem, and think "that's new, I would have done it with such-and-such a set of dialog boxes/menus instead" (or vice versa,) don't hold back, post your version as an alternative.
What do people think? Might this work? Or am I talking complete rubbish?
Great idea Dan, there's usually much more than one way of solving a problem in Linux and where there are alternatives they should be listed. Of course the risk is that by offering too many options you begin to make the original problem look more complicated than it perhaps is! What do people think on this?
In Debian 2.2, when I typed "linuxconf" in a terminal window, I used to get a pretty GUI that did lots of Linux administration tasks. I never used it much (I'm one of the unhelpful people who prefer text config files), so can't vouch for its functionality, but I figured it might be something like what you wanted. Unfortunately, although the manpage still claims this is available with "linuxconf --gui", I can't get anything but a text (ncurses) interface to it in Debian 3.0.
From: Ben Francis on Saturday, August 02, 2003 11:04 PM
The big challenge for open source software writers going forward is to go out and actively seek input and advice from graphics designers and similar people, as well as the many people for whom the computer is just a piece of office equipment, entertainment centre, etc. People who, basically, couldn't give a toss about the software as such but just want a tool to do the job, and will ask the awkward questions like "why can't I just switch it on and use it?".
<snip>
Keith
Again, what is the motivation for these people?
Aye, there's the rub! Motivation.
As many posts have pointed out, the majority of Open Source software is written by people who are scratching their own personal itches of various kinds. They have a problem and they solve it in a manner that works for them. Up until now most of the software writers have been enthusiasts who are happy with command-line and text config file solutions. I don't see this changing anytime soon and I think it would be wrong (immoral, actually) to suggest any sort of pressure should be put on people to write software in any particular way.
My hope is that, as the take-up of Open Source software spreads outside of the original community, there will be people who become involved who are motivated to make it more accessible to others. Rather than creating software that extends the underlying functionality, they might prefer to encapsulate or package the work of others in ways that make it more accessible to non-technically minded people who just want it to work. A sort of 2nd generation Open Source movement, if you will. I think this is starting to happen with things like Knoppix, Morphix, Mandrake, etc.
Regards,
Keith ____________ The material thing before you, that is IT. Huang-po
On Mon, 4 Aug 2003 10:02:02 +0100 "Keith Watson" kpwatson@ukfsn.org wrote:
they might prefer to encapsulate or package the work of others in ways that make it more accessible to non-technically minded people who just want it to work. A sort of 2nd generation Open Source movement, if you will. I think this is starting to happen with things like Knoppix, Morphix, Mandrake, etc.
Regards,
Keith
I think that's where I came into Linux: with the KDE betas I finally felt I could move forward into Linux rather than just have it as a learning environment for C/C++. Now that GNOME and KDE have made it to mature(ish) desktops, I believe they are passable as desktop systems for non-technical users, provided they have experienced sysadmins administering the machines.
I must admit prolonged exposure to Linux has turned me into a command line fan.
I still use X though, and recommend these GUIs:
icewm sylpheed rox-filer gimp nedit mozilla dillo abiword alsaplayer xmms grip gv gqview
but my favourite GUI of all is xterm - all the power of a terminal, but stretchable.
I don't use them all very often but they are good when I use them.
Linux is getting very acceptable and many free software projects are now showing serious signs they will take over the desktop.
One of the strengths of Linux is that it's not exclusively targeted at people who have money to spend on computers. This means that until very recently I still had a 200MHz CPU as my main box at home, and found my choice of applications growing with time rather than shrinking, as they did not drop support for old hardware. The impact of Linux will, I predict, be lowest in our rich first-world countries. I am just surprised at the bold European direction Linux is taking, e.g. NGOs and across government.
Regards
Owen Synge
PS is that the same Keith I used to work with at Kewill? If so where are you working now?
On Fri, Aug 01, 2003 at 11:25:10PM +0100, BenE wrote:
Maybe linux geeks should install mandrake once in a while to see how things are done there?
The only times I ever installed Mandrake it gave me a horribly broken system that didn't work. I even purchased a box set once upon a time, when I was a newbie, to help me get along, and they had managed to ship a CD with broken init scripts. Their fix was to download 4 RPMs and install them at the command line, which didn't exactly fill me with confidence.
Anyhow, a quick question for everyone: "do you find the Windows GUI intuitive?" It seems to me that people have complaints about Linux etc. (some of the problems are of course valid), but I find that the "market leader" is a stinking pile of doggy doo when it comes to the GUI. After watching my dad trying to do simple tasks on a Windows box the other day (he wanted to print some .jpg photos that I took with my digicam from a CD and make them fill an A4 sheet and not look bad) and really struggling, I think the problems people have may not just be a Linux problem. Also don't get me started on Mac OS.... ;)
Adam
On Sunday 03 Aug 2003 8:37 am, abower@thebowery.co.uk wrote:
On Fri, Aug 01, 2003 at 11:25:10PM +0100, BenE wrote:
Maybe linux geeks should install mandrake once in a while to see how things are done there?
The only times I ever installed Mandrake it gave me a horribly broken system that didn't work. I even purchased a box set once upon a time, when I was a newbie, to help me get along, and they had managed to ship a CD with broken init scripts. Their fix was to download 4 RPMs and install them at the command line, which didn't exactly fill me with confidence.
I used Mandrake from 7.0 to 7.2, abandoned it in favour of Red Hat 7.2 when Mandrake reached 8.0, and have now gone back to Mandrake now that it has reached 9.1.
Over the years I have tried many distros and the installation grief has varied enormously. However, in general they are getting better as time goes on. I recently needed to install Win2k on my Linux laptop and it was nothing but trouble. Even after installing Service Pack 3 it still only boots about one time in five.
Ian
On Fri, 01 Aug 2003 16:01:25 +0200 Ben Francis ben@franci5.fsnet.co.uk wrote:
[about how the current state of Linux doesn't make it easily accessible to a lot of people]
Forgive me for summarising your post Ben but I wanted to keep my comments as short as possible and I've all sorts of ideas sloshing round in my head.
I agree with you 100%, both about the current state of Linux and the Cathedral and the Bazaar :o)
I use Linux for many reasons but primarily because I'm a geek/techie/whatever. I like the fact I can get the source and tinker with it. Most of the people I meet are motivated similarly. Not all of them it's true but then not all generalisations are correct (not even this one! :o) )
Which means that most of the software is designed and built by technically minded software enthusiasts who enjoy the intellectual challenge and stimulation it affords.
However, after 25+ years designing and building software, it's my experience that I and my peers are not very adept at creating GUIs (or HCIs or HMIs, or whatever you want to call them). This becomes apparent to me on the rare occasions I work on a project that's perceptive enough to employ a specialist to do this work (the best ones seem to be graphic designers with an interest in computers but not software per se). In these cases the level of acceptance of, and commitment to, the final products among the end-users is noticeably higher (and often this is true even when I feel the underlying software engineering is somewhat lacking in quality).
Which means that, in their current form, most GNU/Linux projects are technically excellent but are often "as user friendly as a cornered rat". However, over the past few years I do think there's been a gradual but noticeable improvement, so perhaps a cornered hamster might be a better analogy. :o)
The references to the Mac are very apt. I've often been struck by how enthusiastic people who've used Macs are about them. The interface neatly encapsulates the underlying complexity and allows them to get on with using it as a tool. Just as you don't have to be an electronic enthusiast to use a home hi-fi system or television, or a mechanical specialist to drive a car.
I think a lot of software designers and writers fear that a simple and friendly user interface means that the underlying software engineering will have to be simplified and that they will have to compromise functionality or design integrity (to the extent where even to use the words "user friendly software" evokes a hostile reaction from some).
But this is not true. We do it all the time when creating software. We deliberately create object models to encapsulate and hide complexity and present a simpler interface to other software components. There's no suggestion (well not that I've heard - yet) that in doing this we're sacrificing any engineering quality.
The big challenge for open source software writers going forwards is to go out and actively seek input and advice from graphic designers and similar people, as well as from the many people for whom the computer is just a piece of office equipment, an entertainment centre, etc. People who, basically, couldn't give a toss about the software as such but just want a tool to do the job, and who will ask the awkward questions like "why can't I just switch it on and use it?".
So if you've offended some people Ben perhaps it's no bad thing. Someone once said "Sometimes our job is to oil the wheels so that things work smoothly but sometimes we need to increase friction until traction takes place" (I think it was Sir John Harvey-Jones).
If you think you have a valid point (and I think you do) then don't worry about upsetting a few people. Another quote, George Bernard Shaw said "Reasonable people adapt to the world; unreasonable people persist in trying to adapt the world to themselves. Therefore all progress depends on unreasonable people." After all where would we be if RMS was a more reasonable person (tongue firmly in cheek! :o) )
Keith
On Fri, 1 Aug 2003, Ben Francis wrote:
The fact is, I see myself as the kind of person who would be perfectly willing to spend the time fiddling with code, command line commands and config files to get something to work, hacker style, and indeed I do - but not everyone wants to (or indeed should have to) and my frustrations stem from trying to simultaneously look at the problem from a completely non-technical point of view. Although I have a background in ergonomics and graphic design in my academic studies as well as Computing, Science and Maths, I don't see myself as the kind of person who would be able to *solve* the user-friendly problems involved.
I'll give my standard apology that, not having been in the IRC discussion, I might be going over old ground. I've recently discussed some of these issues off-list with another ALUG member, and have what may or may not be some useful insights.
I agree with the principle that it's great to have GUIs available for those users who have a personal preference for them (and equally great to have command lines and text config files for those users who prefer to do things this way.) However, I'd be rather surprised if your end users needed to do anything for which there was no GUI already available in Linux distros. (I wouldn't be quite so surprised if you as the sysadmin were occasionally left with only command line/text config file solutions for some of your tasks, but if I've read you aright, you don't mind.) I'm going to hazard a guess that the reason you hear about the non-GUI solutions for end user problems, when you ask in ALUG, is that the people answering have a personal preference for command-line/text config solutions, and therefore this is what they've (we've) practiced, and what they (we) know how to explain. Once again, it doesn't necessarily mean the GUI solutions don't exist.
Here's what I propose. If you're on this list, and you see a text-based answer to a user's problem, and think "that's new, I would have done it with such-and-such a set of dialog boxes/menus instead" (or vice versa,) don't hold back, post your version as an alternative.
What do people think? Might this work? Or am I talking complete rubbish?
Of course, sometimes there really won't be a GUI. Then someone's got to write one ;-), if people want to use it. When I was discussing this stuff recently, I came up with the (possibly rather cryptic) assertion "In the absence (mostly) of the profit motive, you have to find more subtle ways of getting the community to respond to customer demand." I understand you've just read _The Cathedral and the Bazaar_ (as have I,) and this might give you some idea what these ways are.
Dan Hatton wrote:
[SNIP]
Of course, sometimes there really won't be a GUI. Then someone's got to write one ;-), if people want to use it. When I was discussing
A recent case in point... The other day I set up* a mail system using Postfix, SASL, Courier IMAP and POP3, certificates, authenticating virtual users against a MySQL database, tied in with amavisd, with Sophos/NAI AV and SpamAssassin, and offering SquirrelMail. That'd be some GUI!
Cheers, Laurie.
* a non-trivial task, I might add...
Dan Hatton dan.hatton@btinternet.com wrote:
[...] non-GUI solutions for end user problems, when you ask in ALUG, is that the people answering have a personal preference for command-line/text config solutions [...]
I'm not sure that's the whole story. Other reasons are:
1. It's easier to explain "comment out the line saying foo in file bar" than "move the mouse to this, click that, set foo and click OK" in a text email, and we don't always have time to lovingly prepare web pages with screenshots for list queries.
2. Often people asking questions don't even say which GUI they use... Mandrake has a control centre, Gnome has one, GNUstep has Preferences.app, and so on. Nothing wrong with that -- it just takes some getting used to. (Debian's reportbug tool adds some info automatically because of this sort of problem.)
3. Even if they do, they're more likely to find someone who knows the underlying text-file solution rather than their particular GUI, and so they say that. Nothing wrong with that -- it's just an indication of ALUG's age and of the fact that the text files have been around longer.
4. ...and sometimes the question is too specific -- "how do I do X in app Y" -- when they really mean "how do others do task X".
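Taking point 1 as an example, "comment out the line saying foo in file bar" really is a one-liner. The file name and pattern below are invented for the demonstration:

```shell
# Make a throwaway config file (names invented for the example)
printf 'keep this\nfoo = 1\nkeep that\n' > /tmp/bar.conf

# "Comment out the line saying foo in file bar", in one sed command
sed -i 's/^.*foo.*$/# &/' /tmp/bar.conf

cat /tmp/bar.conf
```

Explaining the GUI route to the same edit takes a paragraph and a screenshot, which is rather the point.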
(or vice versa,) don't hold back, post your version as an alternative.
Oh yes, the more ways to skin the moggies the better.
On Sunday 03 Aug 2003 9:26 pm, MJ Ray wrote:
Dan Hatton dan.hatton@btinternet.com wrote:
[...] non-GUI solutions for end user problems, when you ask in ALUG, is that the people answering have a personal preference for command-line/text config solutions [...]
I'm not sure that's the whole story. Other reasons are:
- Easier to explain "comment out line saying foo in file bar" than
"move mouse to this, click that, set foo and click OK" in text email and we don't always have time to lovingly prepare web pages with screenshots for list queries.
And often a text-based solution is the only one. There is a perennial problem with the PCMCIA config.opts file and Dell laptops which can only be fixed by editing the file (as root).
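For anyone who hits this, the fix is a one-line addition to the file, made as root, followed by restarting card services. The resource range below is only a placeholder (the real clashing range shows up in dmesg or syslog), and the snippet works on a copy so it is safe to run anywhere:

```shell
# On the laptop itself the file is /etc/pcmcia/config.opts (edit as root);
# we use a copy here so the example is harmless to run.
conf=/tmp/config.opts
printf 'include port 0x100-0x4ff, port 0x800-0x8ff\n' > "$conf"

# Placeholder range: exclude whatever region the card clashes over
echo 'exclude port 0x800-0x8ff' >> "$conf"

grep '^exclude' "$conf"
# then, on the real machine: /etc/init.d/pcmcia restart
```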
Ian
As Adam says, booster charges are fine. A lithium-ion pack should be good for about 300-400 cycles, but any change from discharge to charge state counts as a cycle, so while booster charges don't actually shorten the life of the pack, they do knock one cycle off the battery's life. So in terms of total runtime from the pack you will get a shorter life with boost charges, but in terms of number of charges you won't.
Another tip is that you should try to avoid leaving the pack in a discharged state (i.e. below 20% capacity remaining) for long periods of time. If you are storing a pack for an extended period (say it's a spare), it's worth charging it once every 3 months or so.
The reason for this is that lithium-ion (like many other battery technologies) has a self-discharge behaviour. This is not a problem with a single cell, but in a pack of series-connected cells (like in your laptop), if one cell dips to a level below that of the others, then when you try to use the pack that cell gets reverse-charged by the current flowing through the series circuit. This is called cell reversal, and it's not good for them.
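A back-of-envelope sketch of that runtime trade-off, with made-up figures (a 350-cycle pack and 3 hours per full charge):

```shell
# Rough illustration only: real packs and runtimes vary a lot
cycles=350      # assumed pack lifetime in charge cycles
hours=3         # assumed runtime per full charge

# Best case: every cycle is a full discharge
echo "all full discharges: $(( cycles * hours )) hours total"

# If half the cycles are spent on 20% top-ups, the cycle count is
# used up just as fast but the total runtime delivered is much lower
echo "half 20% top-ups:    $(( cycles / 2 * hours + cycles / 2 * hours / 5 )) hours total"
```

Same number of cycles either way, but far fewer total hours of use out of the pack with frequent top-ups.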
On Friday 01 August 2003 07:29, syd wrote:
I've just bought a T-series Thinkpad. It has Lithium-Ion batteries. AFAIR these do not have the 'memory' effect so does that really mean in practice that it is perfectly OK to give short booster charges no matter what the state of charge/discharge?
** Wayne Stallwood wayne.stallwood@btinternet.com [2003-08-01 09:37]:
On Friday 01 August 2003 07:29, syd wrote:
I've just bought a T-series Thinkpad. It has Lithium-Ion batteries. AFAIR these do not have the 'memory' effect so does that really mean in practice that it is perfectly OK to give short booster charges no matter what the state of charge/discharge?
As Adam says, booster charges are fine. A lithium-ion pack should be good for about 300-400 cycles, but any change from discharge to charge state counts as a cycle, so while booster charges don't actually shorten the life of the pack, they do knock one cycle off the battery's life. So in terms of total runtime from the pack you will get a shorter life with boost charges, but in terms of number of charges you won't.
Another tip is that you should try to avoid leaving the pack in a discharged state (i.e. below 20% capacity remaining) for long periods of time. If you are storing a pack for an extended period (say it's a spare), it's worth charging it once every 3 months or so.
The reason for this is that lithium-ion (like many other battery technologies) has a self-discharge behaviour. This is not a problem with a single cell, but in a pack of series-connected cells (like in your laptop), if one cell dips to a level below that of the others, then when you try to use the pack that cell gets reverse-charged by the current flowing through the series circuit. This is called cell reversal, and it's not good for them.
** end quote [Wayne Stallwood]
OK, I'm a bit slow on this thread as I've not had a chance to keep up with the list, but is 300-400 cycles typical for a laptop battery? I've been told this by Dell now that my two batteries have given up, and was quite disgusted. Traditional lithium-ion batteries are quoted at around 1000 cycles, iirc. That's not particularly good value for money when a battery costs around 77 quid plus VAT and delivery for my Dell (or the 249 quid I've seen quoted for one). It's not cheap to keep my laptop going if I have to buy a new battery each year - on current evidence. That, coupled with the fragility of the power connector for the charger, has put me right off Dell kit, which I always used to think highly of (oh, and an increasingly dodgy connection to the screen).
On Tuesday 05 August 2003 22:14, Paul Tansom wrote:
is 300-400 cycles typical for a laptop battery? I've been told this by Dell now that my two batteries have given up, and was quite disgusted. Traditional lithium-ion batteries are quoted at around 1000 cycles, iirc. That's not particularly good value for money when a battery costs around 77 quid plus VAT and delivery for my Dell (or the 249 quid I've seen quoted for one). It's not cheap to keep my laptop going if I have to buy a new battery each year - on current evidence. That, coupled with the fragility of the power connector for the charger, has put me right off Dell kit, which I always used to think highly of (oh, and an increasingly dodgy connection to the screen).
Seems pretty typical in my experience.
The 1000-cycle figure assumes ideal circumstances. Things like the current drawn and the operating temperature of the battery can affect the expected life of any battery technology, as does the design of the charger.
For example, in my youth I used to race scale model radio-controlled cars. Your batteries could win or lose you a race, so serious competitors bought the very best and matched the cells carefully. If I got 30 races out of a £50 pack I thought I was doing well. But then the batteries were being used under harsh conditions (heat and charge/discharge rate), and at the time we were only allowed to use NiCd technology.
As to Dell laptops, I too am rarely impressed by them. The price/performance ratio isn't too bad, but IMO the build quality of the machines leaves a lot to be desired. In my previous job we had a pool of 10 Inspiron A400s; after 18 months the only one working properly was the one I completely rebuilt after a Coke spill. It was not unusual for me to have to go around every 6 months tightening case screws that were nearly falling out and replacing floppy screen hinges, dead batteries and broken keyboards.
IMO Sonys aren't too bad, Toshibas are good, Compaq/HP have improved a great deal and are currently my favourite, and Thinkpads (apart from the Acer-built ones) are king (at least for x86 laptops, anyway).
Mail me off list if you want as I can get Laptop spares and I may be able to source you a new screen connector (depending on the model it's reasonably easy to fit)
On Wed, Aug 06, 2003 at 09:14:02AM +0000, Wayne Stallwood wrote:
As to Dell laptops, I too am rarely impressed by them. The price/performance ratio isn't too bad, but IMO the build quality of the machines leaves a lot to be desired. In my previous job we had a pool of 10 Inspiron A400s; after 18 months the only one working properly was the one I completely rebuilt after a Coke spill. It was not unusual for me to have to go around every 6 months tightening case screws that were nearly falling out and replacing floppy screen hinges, dead batteries and broken keyboards.
IMO Sonys aren't too bad, Toshibas are good, Compaq/HP have improved a great deal and are currently my favourite, and Thinkpads (apart from the Acer-built ones) are king (at least for x86 laptops, anyway).
Urgh! Sony suck. If your laptop breaks in warranty (very likely, from my experience), put it in the bin, as their warranty is worth nothing at all: they send them off to Belgium, and if you are lucky you will see your laptop again in around 3 months. Toshiba are not too bad on the warranty front, but unfortunately you end up having to return them every 3 weeks because something else has broken; at least, that was my experience of them.
I feel that Dell are about as bad as other laptops (i.e. likely to break often), but at least it seems all new Dell laptops come with a 3-year onsite warranty.
I also agree that Thinkpads are the bees' knees of x86 laptops. Most people say "it's 40% more than the equivalent *" but don't realise the benefits you get from them; stocking parts for years after they finish production is the first that comes to mind.
The main thing to remember if you do buy a Thinkpad is not to install lm-sensors or run sensors-detect until you are certain it will not kill the laptop instantly.
Adam
On Wednesday, August 6, 2003, at 09:29 AM, abower@thebowery.co.uk wrote:
Urgh! Sony suck. If your laptop breaks in warranty (very likely, from my experience), put it in the bin, as their warranty is worth nothing at all: they send them off to Belgium, and if you are lucky you will see your laptop again in around 3 months. Toshiba are not too bad on the warranty front, but unfortunately you end up having to return them every 3 weeks because something else has broken; at least, that was my experience of them.
My housemate owns a Sony laptop and guess what? It broke a day after the warranty finished. Now he's got a broken laptop, Sony won't help, and worst of all, he'd only just paid the laptop off.
Right now.. it's sunny. Let's go outside ;)
C
From: Adam on Wednesday, August 06, 2003 9:30 AM
The main thing to remember if you do buy a Thinkpad is not to install lm-sensors or run sensors-detect until you are certain it will not kill the laptop instantly.
presumably lm-sensors/sensors-detect are some sort of software package? What are they designed to do and why/how do they kill a Thinkpad?
Regards,
Keith ____________ Has anyone attained wisdom by pondering the experience of others? Not since the world began, they must pass through the fire. Norman Douglas
On Wed, Aug 06, 2003 at 09:56:36AM +0100, Keith Watson wrote:
From: Adam on Wednesday, August 06, 2003 9:30 AM
The main thing to remember if you do buy a Thinkpad is not to install lm-sensors or run sensors-detect until you are certain it will not kill the laptop instantly.
presumably lm-sensors/sensors-detect are some sort of software package? What are they designed to do and why/how do they kill a Thinkpad?
lm-sensors is a package that contains kernel modules and some utilities for monitoring fan speed/status and temperature using the sensors on your motherboard. Almost essential in this weather if you own an Athlon machine.
It kills a Thinkpad by probing an address on the i2c bus that somehow confuses a sensor (apparently in violation of the specs), which will break the laptop. I was just reading that lm-sensors isn't useful on Thinkpads anyway, because IBM has broken things in a "special" way so it won't work.
More at http://www.linux-thinkpad.org/ and http://secure.netroedge.com/~lm78/ - oh, and before Thinkpad owners flame me with the exact details: I know I haven't explained the full situation, but then I don't own a Thinkpad, and I hope people would rather be safe than sorry.
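If all you want is a temperature readout on a machine like this, the ACPI procfs interface is a risk-free alternative, since it never touches the i2c bus. The snippet fakes the /proc layout so it can run anywhere; on a real 2.4/2.6-era laptop the directory would be something like /proc/acpi/thermal_zone/THM0:

```shell
# Fake the ACPI file layout so the example is self-contained; on a real
# laptop you would read /proc/acpi/thermal_zone/<zone>/temperature.
zone=/tmp/thermal_zone_demo
mkdir -p "$zone"
printf 'temperature:             47 C\n' > "$zone/temperature"

# Reading the file is all it takes -- no hardware probing involved
awk '{print $2, "C"}' "$zone/temperature"
```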
Adam
On Wednesday, Aug 6, 2003, at 09:29 Europe/London, abower@thebowery.co.uk wrote:
Urgh! Sony suck. If your laptop breaks in warranty (very likely, from my experience), put it in the bin, as their warranty is worth nothing at all: they send them off to Belgium, and if you are lucky you will see your laptop again in around 3 months.
My experience with Sony is also not good. I bought what was apparently the last replacement battery they have in Europe for a 2.5-year-old 266 MHz picturebook and it cost about £150. Far too expensive.
Toshiba are not too bad on the warranty front, but unfortunately you end up having to return them every 3 weeks because something else has broken; at least, that was my experience of them.
My only bad experience with Toshiba was that they wouldn't service my Libretto 60, which I'd milled the bottom out of to take a larger disk. ;-) But that was mostly because I'd imported the machine from Japan. On the plus side, their service centre was (is?) in Chelmsford.
I feel that Dell are about as bad as other laptops (i.e. likely to break often), but at least it seems all new Dell laptops come with a 3-year onsite warranty.
My Dell *desktop* machine is still running, 7 years on. But their laptops don't impress me -- though two of my colleagues have been using Dells for a year or two without any problems, even with their batteries.
I also agree that Thinkpads are the bees' knees of x86 laptops. Most people say "it's 40% more than the equivalent *" but don't realise the benefits you get from them; stocking parts for years after they finish production is the first that comes to mind.
I concur. My 486-based 750C still runs fine about 10 years on, though the NiCd battery is pretty well shot now. (This machine is of historical significance, as it was used in the early days of porting linux to the TP.) My one experience with IBM's support (Welwyn Garden City) was also positive.
On the same note, I might say that my Fujitsu-Siemens laptop's Li battery was totally shot after about 9 months - which is a pity, as it was otherwise an excellent little machine. So it seems there are different qualities of Li battery, as my usage pattern (mostly plugged in) doesn't vary much.
My Apple powerbook looks to be doing well, with ~3.5 hours battery life; but we'll see how well it does after 9 more months' abuse from me.
..Adrian
On Wed, Aug 06, 2003 at 10:33:50AM +0100, Adrian F. Clark wrote:
On Wednesday, Aug 6, 2003, at 09:29 Europe/London, abower@thebowery.co.uk wrote:
Urgh! Sony suck, if your laptop breaks in warranty (very likely from my experience) put it in the bin as their warranty is worth nothing at all they send them off to Belgium and if you are lucky you will see your laptop in around ~3 months.
My experience with Sony is also not good. I bought what was apparently the last replacement battery they have in Europe for a 2.5-year-old 266 MHz picturebook and it cost about £150. Far too expensive.
Ouch! I needed a replacement battery for my picturebook (the typical dead-cells problem; it happens to picturebooks more than anything else in the Sony range, it seems) and got one from www.psaparts.co.uk (I think - will have to check when I get home). It isn't an official Sony battery and is twice the capacity of the standard Sony one. It also cost me about half the price of just the standard "official" battery, never mind the "official" dual-capacity one. I don't think I would have actually paid for an "official" battery considering how many problems people have with them; it being 3rd party was a bonus.
Oh, the other thing that died on my picturebook was the hard disk. Many people with the C1-VE had this problem, and when they sent them off to Sony for replacement they replaced the disks with a different manufacturer's. Of course, my disk must have died about a month out of warranty :(
I just wish there were more manufacturers of tiny laptops around, although my picturebook has been serving me since late 2000 now and I don't see any reason to replace it for another couple of years yet. (although it could do with more RAM)
Adam
** abower@thebowery.co.uk abower@thebowery.co.uk [2003-08-06 10:49]:
On Wed, Aug 06, 2003 at 10:33:50AM +0100, Adrian F. Clark wrote:
My experience with Sony is also not good. I bought what was apparently the last replacement battery they have in Europe for a 2.5-year-old 266 MHz picturebook and it cost about £150. Far too expensive.
Ouch! I needed a replacement battery for my picturebook (the typical dead-cells problem; it happens to picturebooks more than anything else in the Sony range, it seems) and got one from www.psaparts.co.uk (I think - will have to check when I get home). It isn't an official Sony battery and is twice the capacity of the standard Sony one. It also cost me about half the price of just the standard "official" battery, never mind the "official" dual-capacity one. I don't think I would have actually paid for an "official" battery considering how many problems people have with them; it being 3rd party was a bonus.
They obviously have a better margin to work with on Sony parts. When I looked some months back they were actually more expensive for my Dell battery, and now they are all of 2 quid cheaper. I may try to contact them about my PSU issue though, as they seem to do a generic one with adapters. It's the same price as a genuine Dell unit, but they may be able to sell me some adapters on their own if I'm lucky, as that's all that's broken (the connector). Currently I've soldered a pair of wires between the two broken halves of the connector and all is working fine - it just doesn't look too good!
** end quote [abower@thebowery.co.uk]
BTW, from your other reply, I'm not sure that I'd consider an IBM at 40% less than standard prices. It's obviously prejudice by now, as the technicalities must have changed, but I keep seeing them on deals from Morgan and another second-user/refurbished laptop supplier, and have never been tempted enough to even find out whether the price is good or not.
It's odd really, as my first personally-owned PC compatible was an IBM laptop which is still going strong (apart from the batteries, but I can forgive that!). A 386sx20-based L40 - nice keyboard, but the tiniest greyscale screen ever! When I first followed Linux it was with a view to installing it on that machine back in the early 90s, via an IBM internal forum dedicated to Linux on L40s - there's even a patch still in the kernel for the floppy drive on them, iirc! Now if I'd actually managed to get some install media and tried it I'd probably be a guru by now, but I stuck with OS/2 betas and didn't take the Linux plunge until around '96 or '97, iirc, with Caldera OpenLinux (spit, spit, spit - excuse me while I choke remembering how I preferred it to Red Hat at the time!).
Since everyone else is getting their 2 cents-worth, can I say that my three-year-old Dell Inspiron 5000 has served me well so far. The battery gave out earlier this year, and yes, it cost a fortune to replace, but only then did I read in the manual about disconnecting the mains when not in use to preserve the battery. Well, who reads manuals anyway?
-- GT
** Graham Trott gt@pobox.com [2003-08-06 14:41]:
Since everyone else is getting their 2 cents-worth, can I say that my three-year-old Dell Inspiron 5000 has served me well so far. The battery gave out earlier this year, and yes, it cost a fortune to replace, but only then did I read in the manual about disconnecting the mains when not in use to preserve the battery. Well, who reads manuals anyway?
** end quote [Graham Trott]
Interesting - I read through the manual with mine carefully for battery care, but it didn't mention anything about it. Two manuals, with 10 pages of English in one and 15 in the other - all on how to set it up and use the hibernate functions, etc. I have tended to charge the battery and forget to unplug it when finished - hmm.
** Wayne Stallwood wayne.stallwood@btinternet.com [2003-08-06 09:18]:
On Tuesday 05 August 2003 22:14, Paul Tansom wrote:
is 300-400 cycles typical for a laptop battery? I've been told this by Dell now that my two batteries have given up, and was quite disgusted. Traditional lithium-ion batteries are quoted at around 1000 cycles, iirc. That's not particularly good value for money when a battery costs around 77 quid plus VAT and delivery for my Dell (or the 249 quid I've seen quoted for one). It's not cheap to keep my laptop going if I have to buy a new battery each year - on current evidence. That, coupled with the fragility of the power connector for the charger, has put me right off Dell kit, which I always used to think highly of (oh, and an increasingly dodgy connection to the screen).
Seems pretty typical in my experience.
The 1000-cycle figure assumes ideal circumstances. Things like the current drawn and the operating temperature of the battery can affect the expected life of any battery technology, as does the design of the charger.
For example, in my youth I used to race scale model radio-controlled cars. Your batteries could win or lose you a race, so serious competitors bought the very best and matched the cells carefully. If I got 30 races out of a £50 pack I thought I was doing well. But then the batteries were being used under harsh conditions (heat and charge/discharge rate), and at the time we were only allowed to use NiCd technology.
I used to race radio-controlled yachts myself, which didn't have the same battery drain! Nothing 'scale' about them though - with the sail areas, keel depth, etc. scaled up, they'd make the extremes of the America's Cup yachts look rather normal! Racing machines through and through. I haven't raced for years now though, and things have got even more serious, with Kevlar sails and carbon-fibre hulls for maximum performance - which has sort of priced me out of the game even if I had the time!
As to Dell laptops, I too am rarely impressed by them. The price/performance ratio isn't too bad, but IMO the build quality of the machines leaves a lot to be desired. In my previous job we had a pool of 10 Inspiron A400s; after 18 months the only one working properly was the one I completely rebuilt after a Coke spill. It was not unusual for me to have to go around every 6 months tightening case screws that were nearly falling out and replacing floppy screen hinges, dead batteries and broken keyboards.
I went round their factory in Ireland many years back and was much impressed by the processes they used - far better than the now defunct AST factory I also saw.
IMO Sonys aren't too bad, Toshibas are good, Compaq/HP have improved a great deal and are currently my favourite, and Thinkpads (apart from the Acer-built ones) are king (at least for x86 laptops, anyway).
Having worked for IBM with many Thinkpads, I thought the best way to use them was without software of any kind ;-) I hate them with a vengeance, having spent some years evaluating modems, network cards and the like to find out whether they actually worked with them - and having a 'Thinkpad certified' logo was no guarantee either. The best OS to use with them was OS/2 in most cases - Windows 3.1 and 95 (shows how long ago it was - 486 and early Pentium machines) would have screen driver problems, and there was a definite art to getting any PCMCIA card working with them. If you could find the right incantation you'd get it to work; otherwise, send it back to the supplier. The next unit of the same model with the same card would have a whole new set of problems, even with the same clean install process. Trouble is, I got a reputation for being one of only two people on the site who had much success installing them, so I spent far too much time with them, even to the extent of dual, triple and quad boots with Win 3.1, Win95, WinNT 3.5 and OS/2.
Mail me off list if you want as I can get Laptop spares and I may be able to source you a new screen connector (depending on the model it's reasonably easy to fit)
I may do. I'll need to work out what is causing the problem first - probably the connector, as a sharp thump in the right place usually works. The main way I notice it is that the writing in my PuTTY session becomes unreadable - the rest of the Windows (spit) screen is fine though. It's probably Windows 2000 realising I'm spending most of my time in a shell to a Linux box and objecting!
** end quote [Wayne Stallwood]
On Wednesday 06 August 2003 22:35, Paul Tansom wrote:
far better than the now defunct AST factory I also saw.
My 2nd laptop was an AST, and I loved that little 486. It was tiny (Apple PowerBook 12" G4 size) and had a trackball for cursor movement. No CD-ROM, and a passive-matrix screen. But it never let me down, and I only got rid of it about 3 years ago!
I guess it's all down to your last experience of any make, I've had bad ones with Dell and others, but my Compaq 1015v is fantastic (and cheap).
Also remember that the base machines are built all over the pace (regardless of the badge on the front) The name brand manufacturers jump between carcass manufactures all the time so quality varies between different models of the same make.
** Wayne Stallwood wayne.stallwood@btinternet.com [2003-08-07 00:52]:
On Wednesday 06 August 2003 22:35, Paul Tansom wrote:
far better than the now defunct AST factory I also saw.
My 2nd laptop was an AST and I loved that little 486. It was tiny (Apple Powerbook 12" G4 size) and had a trackball for cursor movements. No CD-ROM and a passive matrix screen. But it never let me down, and I only got rid of it about 3 years ago!
I guess it's all down to your last experience of any make, I've had bad ones with Dell and others, but my Compaq 1015v is fantastic (and cheap).
Also remember that the base machines are built all over the place (regardless of the badge on the front). The name-brand manufacturers jump between carcass manufacturers all the time, so quality varies between different models of the same make.
** end quote [Wayne Stallwood]
I'm not actually going on experience of owning an AST machine, just looking around their factory. What I saw there was not professional: Coke cans sitting on the shelf above the PC build area, no sign of anti-static precautions being used, rubbish lying around on the benches (sweet wrappers, etc.), and staff who didn't seem to really care. I wouldn't have been impressed at a local computer shop, but at a manufacturing plant it was worrying.
On the other hand, the Dell factory had custom trays for moving machines around (shaped with anti-static foam), full anti-static precautions, build papers with each machine, a no-eating policy on the build floor, and test areas for the build process (I don't remember a section for testing at AST, but it was a good few years ago now).
I've not seen round any other factories, so I can't comment on those. On machine usage I've found Compaq pretty good (although expensive, and only on server experience); HP OK, but quirky on the desktop (although that may have been NT4!); and Toshiba desktops look nice, but aren't good for the office environment (a nice little flap means memory can be removed in seconds, which is not good in an open office!). Dell have been solid apart from my laptop and a Quantum HD in my parents' machine, which probably shows a drop in quality from the PII or PIII era onwards, as everything else I have dates back to Pentiums and is still going strong. I'm just about to take my last Dell P90 out of service due to upgrading the hardware on my firewall; I currently have 6 Pentiums and 3 486s still fully functional (one P200 still in everyday use as a desktop until my in-laws upgrade).