Best Web server program for a lot of static files?
Dear Nerds:
We have a new computer to serve the photo.net photo database. We thought we might do something intelligent, but we ended up instead buying a machine that will carry us forward another 6 months without us having to think. Here are the specs on the new server, which arrived yesterday from Silicon Mechanics:
- 2 Opteron 2212 2 GHz CPUs (each dual-core, so I think that means four threads can run simultaneously)
- 4 GB RAM
- hardware RAID of eight 750 GB Seagate SATA drives
- CentOS 64-bit operating system
We need to serve a continuous stream of photos from this machine. The data are static and sit in the local file system. The current load is 2.5 million JPEGs per day, of which 1.2 million are small thumbnail images. We don't need to query the relational database management system or do anything fancy, just serve the files via HTTP. So maybe, after 12 years, it is time to look beyond AOLserver! Should we consider lighttpd? Apache 2.0? Squid? Should we run just one process of any of these and let threads handle the multiple clients? Or run multiple copies of the Web server program and tell our load balancer that two sources of these files are available?
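For context, the stated 2.5 million JPEGs per day works out to a fairly modest average request rate, which any of the candidate servers should handle; here is the arithmetic (the 4x peak multiplier is my assumption, not a figure from the post):

```python
# Back-of-the-envelope request rate from the figures in the post.
jpegs_per_day = 2_500_000
seconds_per_day = 24 * 60 * 60  # 86,400

avg_rps = jpegs_per_day / seconds_per_day
print(f"average: {avg_rps:.1f} requests/second")  # roughly 29/s

# Web traffic is bursty, so provisioning for a multiple of the
# average is prudent; 4x here is purely an illustrative assumption.
peak_rps = avg_rps * 4
print(f"assumed 4x peak: {peak_rps:.0f} requests/second")
```

At ~29 requests/second on average, the choice between one threaded process and multiple load-balanced copies is more about fault tolerance and operational convenience than raw throughput.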
[Oh yes, and what about the file system block size? The thumbnails are around 10 KB each.]
Assistance via email or comments here would be appreciated!
Thanks,
Philip