Shipping Wikipedia in 3 seconds
Last Friday, September 16 – right ahead of the weekend – our servers had a good reason to celebrate. They were delivering a record-high 4.5 GB of avast! updates to our users EACH SECOND. AVAST has over 130 million active users, but of course not everybody turns on their computer every day. On average, about half of all users switch on their machine on any given day and receive an update. Friday was still out of the ordinary, though: the 4.5 GB-per-second traffic fury lasted for a full hour before subsiding to normal levels.
I was thinking about how to convert the rather abstract gigabytes figure into something I or others could actually grasp. Someone suggested I compare the traffic to the bandwidth of a whole country. And sure, there are countries with less total bandwidth than we pushed last Friday. But while I was searching Wikipedia for the right example, it struck me that the best example was Wikipedia itself.
Guess what. According to its own statistics, the whole content of Wikipedia today is 14 GB of data – the equivalent of 1,577 books of 1,000 pages each, with over 1 million words. And we were delivering that volume every 3 seconds. Not bad for a Friday.
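For the curious, the "every 3 seconds" claim is just the two figures above divided into each other. A quick sanity check in Python (the variable names are mine, and GB is treated loosely as a decimal unit, matching the post's own usage):

```python
# Back-of-envelope check of the numbers in the post.
wikipedia_size_gb = 14      # total Wikipedia content, per its own statistics
peak_rate_gb_per_s = 4.5    # our peak update traffic last Friday

seconds_to_ship = wikipedia_size_gb / peak_rate_gb_per_s
print(f"Time to deliver one full Wikipedia: {seconds_to_ship:.1f} s")

# And over the one-hour peak:
wikipedias_per_hour = 3600 / seconds_to_ship
print(f"Wikipedias shipped during the peak hour: {wikipedias_per_hour:.0f}")
```

Which works out to roughly 3 seconds per Wikipedia, or well over a thousand Wikipedias during the peak hour.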
Btw, for those interested: we use our own infrastructure of 300 servers in 10 datacenters across 4 countries.