On 11/1/05, Ashley T. Howes, Ph.D. <lists@ashleyhowes.com> wrote:
> I'm looking for some advice and hints on how to go about installing and managing a large group of desktop machines, with shared file stores and resources such as applications, printers, etc. By large, I mean more than 500 machines. Think of a large university lab or internet cafe, where people can login to any machine, do their 'work', access their files independent of terminal used, etc.
For such a big job, the consultant must be getting quite a large fee ...
> Security. I would assume standard unix security would stop one user fiddling with another's files, but how does that work over a network file store?
Just the same as locally.
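To illustrate (a minimal sketch; the directory and user names are made up, and it assumes UIDs are kept consistent across machines, e.g. via NIS or LDAP): a mode-700 home directory denies other users exactly as it would on a local disk, because the server enforces ownership and mode bits on every access.

```shell
# Standard unix permissions protect one user's files from another,
# whether the filesystem is local or NFS-mounted from a server.
# mktemp stands in for the exported home root so this runs unprivileged.
HOMEROOT=$(mktemp -d)            # stand-in for the server's /home export
mkdir "$HOMEROOT/alice"
chmod 700 "$HOMEROOT/alice"      # owner-only: other users get "Permission denied"
ls -ld "$HOMEROOT/alice"         # shows drwx------ for the owner
```

In practice you would also set a restrictive default `umask` (e.g. 077) in the users' login profiles so newly created files start out private.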
> Am I correct to assume that it is best to have applications installed locally on the machines given the number of machines?
Do you want to reduce network traffic, or do you want absolute central control of the applications available?
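The trade-off can be sketched as two layouts (illustrative only; the server name, export path, and mount options are assumptions to adapt for your site):

```shell
# (a) Applications installed locally on each machine, pushed out by a
#     config-management tool: less network traffic, but 500 copies to
#     keep in sync.
# (b) Applications served read-only from the file server: one copy to
#     maintain, at the cost of network load.  An /etc/fstab line for (b)
#     might look like:
#
#   fileserver:/export/apps  /usr/local  nfs  ro,hard,intr  0  0
#
# Mounting read-only also stops users tampering with the shared binaries.
```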
> Is there a way to sandbox the user, so they don't enter parts of the system they aren't supposed to? I know that there is the normal root access restriction, but what about forcing them to use their network file store, rather than leaving stuff in /tmp.
Mount /tmp on a separate partition from /, and wipe it when users log out. Too many applications depend on a writable temporary directory for you to lock /tmp down entirely.
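A sketch of the logout cleanup (the hook location is an assumption; wire it into /etc/bash.bash_logout or a PAM session-close script depending on your distro):

```shell
# Remove the departing user's leftovers from /tmp at logout.
# TMP_PATH is parameterised so this can be tested safely against a
# scratch directory instead of the real /tmp.
TMP_PATH=${TMP_PATH:-/tmp}
ME=${USER:-$(id -un)}
# Delete only top-level entries owned by the user logging out;
# other users' files and sticky-bit semantics are left alone.
find "$TMP_PATH" -mindepth 1 -maxdepth 1 -user "$ME" -exec rm -rf {} +
```

Keeping /tmp on its own partition additionally means a user filling it with junk cannot exhaust the root filesystem.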
> [Norfolk Library Services] seems to run a reduced version of Windows that limits what the user can and can't do. Does anyone know how that actually works?
Read up on Windows Policies: http://www.google.co.uk/search?hl=en&q=windows+policies
Good luck!
Tim.