According to Carlos Perez, eBay serves ~400M page views per day on a database of 60 million auctions, with 30k lines of code changes per week and 99.92% uptime. According to eBay’s 10-K, it has 41.2M active users and is growing at 50% per year! The 10-K is ambiguous, but it looks like eBay depreciated at least $21M of equipment last year. I think eBay is paying WAYYY too much for server ops.
A modern CPU can handle ~10k HTTP requests per second (see these benchmarks). I don’t know how much memory eBay’s database consumes, but we can make some estimates. Let’s assume each auction and each user requires 1 KB of memory. 60M auctions plus 40M users implies 100 GB of memory. Now let’s double that for indexing and other overhead, and we are at 200 GB.
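As a sanity check, here is the arithmetic spelled out. The 1 KB-per-record figure is my assumption from above, not a measured number, and the request rate is a daily average (peak load would be several times higher):

```python
# Back-of-envelope check of the load and memory estimates above.
PAGE_VIEWS_PER_DAY = 400_000_000
REQS_PER_CPU_PER_SEC = 10_000

avg_reqs_per_sec = PAGE_VIEWS_PER_DAY / 86_400          # average over a day
cpus_for_avg_load = avg_reqs_per_sec / REQS_PER_CPU_PER_SEC

auctions, users = 60_000_000, 40_000_000
bytes_per_record = 1_000                                 # assumed 1 KB each
raw_gb = (auctions + users) * bytes_per_record / 1e9
with_overhead_gb = raw_gb * 2                            # double for indexes etc.

print(f"{avg_reqs_per_sec:.0f} req/s avg, {cpus_for_avg_load:.2f} CPUs, "
      f"{with_overhead_gb:.0f} GB with overhead")
```

The striking part is the CPU column: at the benchmark rate, the *average* load is a fraction of one CPU, so the machine is really being sized for memory and peak traffic, not average throughput.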
Today you can buy a Sun E25K for $3.6M with 72 processors and up to 500GB of memory. That seems like enough to handle eBay’s load. Potential problems:
- Need to save to disk: Use a Prevayler-style architecture (note there are major problems with Prevayler itself, but I am describing a style).
- Memory access bottleneck: Someone with more hardware knowledge, please enlighten me.
- Failover: OK, buy two E25Ks. Have the second one always recovering.
- Code changes: This one strikes me as more difficult. Getting adequate web server performance requires compilation, so you need a development infrastructure that allows dynamic modification of compiled code.
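The Prevayler-style persistence in the first bullet can be sketched in a few lines: keep the authoritative state in RAM, and make every mutation a serializable command that is appended to a log before it is applied, so the state can be rebuilt by replay after a crash. This is an illustrative sketch (Prevayler itself is a Java library with a different API; the bid-tracking domain here is invented for the example):

```python
import json
import os

class PrevalentSystem:
    """In-memory state; every write is logged to disk before it is applied."""

    def __init__(self, log_path):
        self.bids = {}          # auction_id -> highest bid seen
        self.log_path = log_path
        self._replay()          # rebuild state from the log on startup

    def _replay(self):
        if not os.path.exists(self.log_path):
            return
        with open(self.log_path) as log:
            for line in log:
                self._apply(json.loads(line))

    def _apply(self, cmd):
        if cmd["op"] == "bid":
            current = self.bids.get(cmd["auction"], 0)
            self.bids[cmd["auction"]] = max(current, cmd["amount"])

    def execute(self, cmd):
        # Durability first: append and fsync the log before mutating RAM.
        with open(self.log_path, "a") as log:
            log.write(json.dumps(cmd) + "\n")
            log.flush()
            os.fsync(log.fileno())
        self._apply(cmd)
```

A real deployment would also snapshot the in-memory state periodically and truncate the log so recovery stays fast. That same replay mechanism is one way to read the failover bullet: the second machine stays warm by continuously replaying the primary’s log.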