There are a number of issues to address when trying to determine hardware and bandwidth requirements. The following sections attempt to address most, if not all, of them.

How Many Players per Physical Server?

This will depend on the game's design: how the design affects the size of the world, and where critical in-game facilities such as banks, training masters, shops, and other player gathering places are located. The tradeoff is that the more players in one place, the better the socialization, up to the point where so many congregate in one location that server-side lag becomes a problem. This points back to the design and doing your best to anticipate population problems before coding begins.

How Many Servers per World Iteration?

The tradeoff: The more physical machines per world iteration, the more people you can simultaneously host per iteration. More machines, however, mean higher hardware costs. You also have to weigh how many players each physical server hosts against the size and popularity of the "world" terrain that machine serves. If each physical machine is designed to host 500 simultaneous players, but the region of the world it hosts is of such interest that more than 500 players regularly congregate there, what is that going to do to performance?

Multi-Processor PCs or Suns?

In most cases, the cheaper alternative is multi-processor PCs, if you have a firm handle on how many machines you need per world iteration. The value of dual-processor PC motherboards is in faster traffic handling at the server end. Both Sun and Intel announced in December 2001 that they are working on chips with multiple processors on one die, tied together using simultaneous multithreading to allow each processor to handle two or more application threads simultaneously. [1] This should eventually decrease the cost of server farms, as one physical machine will probably be able to handle more traffic and players.
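The first two sizing questions above reduce to simple arithmetic. A minimal back-of-the-envelope sketch, where every figure (players per machine, machines per iteration, number of world iterations) is a hypothetical placeholder rather than a recommendation:

```python
# Back-of-the-envelope server capacity planning.
# All figures below are hypothetical placeholders, not recommendations.
PLAYERS_PER_MACHINE = 500     # design target for one physical server
MACHINES_PER_ITERATION = 6    # servers making up one copy of the world
WORLD_ITERATIONS = 4          # parallel copies of the world being run

players_per_iteration = PLAYERS_PER_MACHINE * MACHINES_PER_ITERATION
total_concurrent = players_per_iteration * WORLD_ITERATIONS

print(f"Peak players per world iteration: {players_per_iteration}")  # 3000
print(f"Peak concurrent players overall:  {total_concurrent}")       # 12000
```

Even this crude math forces the useful question: if one iteration's popular region draws more than its machine's 500-player budget, the plan breaks regardless of how many machines the iteration has in total.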
Just clumping together commodity, single-processor PCs isn't advised for an application as intensive as a multiplayer game; they aren't designed for this type of job. The higher cost of multi-processor PCs is generally offset by better performance and less downtime, which result in higher player satisfaction.

How Much RAM?

The easy answer here is "as much as you can shove into the machine." You should have no less than 1GB of system RAM for any type and style of online game, and as much as you can reasonably afford is best; there is no such thing as too much.

Will That Be One Hard Drive or Five?

The view on this among the developers we've talked to is split. Some prefer one large hard drive with a backup in a fault-tolerant configuration; some prefer two or more relatively smaller drives with one fault-tolerant backup drive in place. Purely for redundancy's sake, it makes sense to split the load across more than one drive if your technical design allows for it. Regular, multiple backups are mandatory in any case; there hasn't been an online game in existence that hasn't had a catastrophic failure that required backup game data to be loaded onto live production servers.

To Linux or Not to Linux

This one has become almost a religious argument in the community. Windows NT/2000/XP isn't considered as stable as Linux, can't host as many people per server before server-side lag starts setting in, and costs money to license versus the free use of Linux. On the other hand, NT/2000/XP is generally easier for most engineers to work with, and Microsoft and its affiliated training partners have pretty good training programs and materials available. In general, most shops are going with Linux because it is free, open-source, more flexible, and generally more scalable than NT/2000/XP.

Bandwidth: How Big a Pipe Do You Need?

This is one of the key questions in this whole process.
Bandwidth is one of the few variable costs you'll have to contend with, and it is also one of your biggest expenses because you pay a peak usage rate: the higher your peak usage, the higher your rate. This isn't like a personal account with your ISP, nor is it like an account with a phone, water, or electricity provider. It's more like a highway: it's going to be as wide as you build it, and you need to build it for your busiest rush-hour traffic, even when nearly everyone is home sleeping. Going over a peak usage cap is expensive. (Think of rush-hour traffic so congested that you need a helicopter to evacuate a seriously injured motorist; compare the cost of a helicopter with the cost of an ambulance.)

Controlling bandwidth usage is critical to the profit margins of the game, yet it is rarely a consideration in the design stage of a project. Designers don't want to be shackled to a bit-rate target; they want to shove as much data down the line as they can, because that gives them leeway to add more features to the game. They are rarely challenged on this during the design process because, due to inexperience with the process, executive producers and other leaders rarely even think about it. It is generally during the latter stages of Beta testing, when the data-transfer figures are run to determine how much bandwidth needs to be ordered for launch, that executives see the potential cost, realize that bandwidth is going to eat up a bunch of margin points, and turn white with fear.

Thus, one of the first ground rules an executive producer has to lay down is a target for how much data will be transferred back and forth between the player and the game servers. The bit rate isn't a sexy thing to work on, but there simply must be some common-sense goal to shoot for that won't break the maintenance budget in bandwidth costs after launch.
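A per-player bit-rate target turns into a provisioning and cost estimate with simple arithmetic. A minimal sketch, assuming a hypothetical 6kbps-per-player target, 2,500 peak concurrent players, and a made-up per-Mbps peak-usage rate; none of these numbers come from a real contract:

```python
# From a per-player bit-rate target to a bandwidth provisioning estimate.
# All numbers are illustrative assumptions, not measurements or quotes.
TARGET_KBPS_PER_PLAYER = 6        # design ground rule, per player
PEAK_CONCURRENT_PLAYERS = 2500    # busiest expected simultaneous load
COST_PER_MBPS_PER_MONTH = 300     # hypothetical peak-usage rate, USD

peak_kbps = TARGET_KBPS_PER_PLAYER * PEAK_CONCURRENT_PLAYERS
peak_mbps = peak_kbps / 1000
monthly_cost = peak_mbps * COST_PER_MBPS_PER_MONTH

print(f"Peak bandwidth to provision: {peak_mbps:.1f} Mbps")  # 15.0 Mbps
print(f"Monthly bandwidth cost:      ${monthly_cost:,.0f}")  # $4,500
```

The point of running this early is that every extra kilobit per second in the design multiplies across the whole concurrent population, every month, for the life of the game.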
This target can be refined as development proceeds into the testing phase, but without such a target it is very likely the game's bandwidth costs will be out of sync with the rest of the budget. In the US and European markets, a good goal to shoot for is 4–6 kilobits per second (kbps) per player or less. You'll find it is difficult to live in that range; several of today's more popular games live in the 8–10kbps range. If you can get the bit rate down to 2kbps, you're "golden." It's hard to see how that can happen, however, without putting dangerous amounts of data directly into the client, which is just asking for trouble from talented cheaters and hackers. The problem with "golden" is that it's the part of the flame between red and blue. After you have made your code as elegant, streamlined, and compact as possible, the remaining technique for reducing the bit rate is to have some parts of the code reside on the client side (each player's computer) instead of on the server side. As code is shifted from server to client, players gain access to more critical functions. Most players just want to have fun with your game, but some players would just love to "have fun" with your code. Their definition of "fun" will cost you money and time.

Asian markets generally have more tolerance in bandwidth because the governments there tend to lay a lot of fiber-optic cable and offer price supports to keep bandwidth inexpensive. South Korea is the best example of this, and it shows in how Korean PWs use bandwidth for games; the average seems to be a 30Mbps connection to support a server that can hold 1,000–3,000 simultaneous players. This is also one reason why few Korean games will appear in the US market until massive recoding and optimization are done.

Then there is the consideration of the player's connection to the Internet. Hard-core gamers tend to upgrade hardware much faster than moderate or mass-market gamers.
In general, they are seeing better Internet performance right now, especially in the US, where less than 8% of households had broadband access as of February 2002. There are plenty of myths abounding about that access, however.

Bandwidth Will Not Save You

Bandwidth: It's a clarion call, a wizard's chant to create the spell of no-lag. All we need is more of that super-big, mystical stuff, "They" say, and all will be well. More bandwidth, "They" say, translates to more speed for data. You know the line: big pipes, no waiting, and an end to the nefarious lag monster. Imagine 50–80-millisecond latency rates for everyone! We could play all those Internet action games and flight simulators, and the frame rate might actually match the data transmission rate. And cable modems and DSL lines, those deity-blessed saviors known collectively as "broadband," will give us that access, "They" say. Why, as soon as everyone is on a cable modem or DSL line, we'll all experience low ping times, and playing a session of Quake III or UO will be a lagless exercise worldwide. Broadband, the experts trumpet, shall save us all.

Understand something up-front: What you hear about broadband these days is marketing fluff, and it's about as honest as marketing fluff ever is. That is to say, it is riddled with misdirection, incomplete information, and lies by omission. All the marketers want you to see is the perfect case; the reality of the situation can wait until after you've plunked your money down on the table. What "They" want you to see and believe is that broadband in the form of cable and DSL will remarkably improve your Internet performance; what "They" don't want you to see is that bandwidth is only one part of the puzzle, and that all parts have to be fixed for broadband to have any lasting effect on lag.
If you believe we're saying that certain cable companies, cable access providers, DSL providers, and content providers (the ubiquitous "They") are fudging the truth about the efficacy of broadband access for their own purposes, score yourself a 10. Let's have a little reality check.
So before you plop down $40 or more a month for broadband access in the expectation of superior gaming, understand this: Broadband will not save you; not for a long while anyway, and not until a lot of routers and servers are replaced with newer equipment, pressure is relieved at the chokepoints of the Internet, and more programmers learn how to code games for more economical data transfer. Yes, you probably will see a performance increase, but it will not be the Nirvana-like experience promised, and it will get worse over time as more people subscribe to broadband outlets.

What does all this mean? It means that programming a game (or web site) to appeal to broadband users is actually going to cost you more in bandwidth. If you are willing to suck up this cost and can afford to pay for it, that's one thing. Just make sure you go in with your eyes open; more data transfer might make for a better online game, but it also might drain your pocketbook faster than you expected.

Co-Locate Your Servers or Set Up a Big Farm at Home?

Where to place your servers is a question you need to consider because it will have an impact on how much physical space you need and how many operations employees you need to hire before launch. If you're planning on being published/hosted by a third-party publisher, you'll want to know how they do it, because it will have an impact on their bottom-line expenses. To "co-locate" simply means to place your hardware at someone else's network operations center (NOC). The big Internet backbone providers, like Sprint and Exodus, have NOCs all over the country and either own NOCs internationally or have cut deals with firms overseas to provide that capacity. This one is a toss-up and may depend greatly on whether you're going after an international market right away or sticking with home territory for a while.
You can see examples of both methods in the US: EA's UO co-locates servers in each US time zone and in the international territories it services, while Sony Online Entertainment's EQ uses one large farm in San Diego for US players. Another major tradeoff here is the potential for player-side lag versus close-to-home control of the hardware. By co-locating hardware at the NOCs of a major backbone provider, you give players the chance to reduce the overall number of Internet data-transfer hops between home and a game server, which generally reduces lag time.

The big issue is cost. Setting up your own NOC can be expensive, and not just in hardware, software, building a clean room to house the servers, and leasing bandwidth to connect it all to the Internet. You also have to have operations people monitoring the NOC 24/7/365 to fix problems as they arise. At a bare minimum, you need at least six people to cover the 21 eight-hour shifts in a standard week, and that doesn't take into account vacations, sick time, or emergencies like having to take the dog to the vet or picking up an ill child from school. Most companies try to slide by this by having the servers page an on-call operator at home if they go down, but this works about as well as you'd expect; in the truly critical incidents, when thousands of customers can't connect to the servers, the Law of the General Perversity of the Universe dictates that the server-paging software will fail. This is especially true on holidays, patch days, and during highly anticipated scheduled events in a game.

Co-locating at a backbone NOC can solve many of these problems. It isn't a fail-safe solution, but at least you don't have to build a NOC; you can spend that money on operators to watch the servers and correct problems.
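The staffing arithmetic behind that six-person minimum can be sketched quickly; the 1.4 overhead factor covering vacations, sick time, and emergencies is an assumption, not an industry standard:

```python
import math

# Why 24/7/365 NOC coverage needs roughly six operators (illustrative math).
HOURS_PER_WEEK = 24 * 7            # 168 hours of coverage needed every week
SHIFT_LENGTH = 8                   # hours per shift
SHIFTS_PER_OPERATOR_PER_WEEK = 5   # a standard five-day work week
COVERAGE_OVERHEAD = 1.4            # assumed slack for vacations, sick days, emergencies

shifts_per_week = HOURS_PER_WEEK // SHIFT_LENGTH                 # 21 shifts
bare_minimum = shifts_per_week / SHIFTS_PER_OPERATOR_PER_WEEK    # 4.2 operators
staff_needed = math.ceil(bare_minimum * COVERAGE_OVERHEAD)

print(f"Shifts to cover per week: {shifts_per_week}")  # 21
print(f"Operators needed:         {staff_needed}")     # 6
```

Note that 4.2 operators would cover the raw schedule only if nobody ever took a day off; the overhead multiplier is what turns the theoretical minimum into a roster that survives real life.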