We could go on and on about the many ways that vendors have taken advantage of the deconstruction of centralized computing. While they can wrap themselves in the flag and claim that open computing broke the domination of all computing by a single vendor (IBM), vendors must also concede that it ushered in an anarchical era in storage technology. The world of contemporary open systems storage is one in which vendors manifest the Hobbesian ideal of self-interested opportunism, and life for storage consumers is nasty and brutish, if not altogether short.

At the time that the first Holy Grail book was being developed, in the late 1990s, consumers didn't pay much attention to burgeoning issues like storage management. For as long as the economy was robust and storage devices kept dropping in price, consumers were content to simply buy more, with little or no attention paid to the ultimate price that unmanaged storage would exact. Today, we are at last feeling the "two towers" of storage pain: the need for cost-effective capacity provisioning in a "do more with less" world using inadequate tools, and the need for a data protection strategy that will endure in the face of an unmanaged infrastructure and threats that seem to be growing daily.

From a technical perspective, most of the problems of provisioning could be effectively addressed today by a combination of true, standards-based, cross-platform LUN carving-and-splicing technology (true virtualization), a robust global namespace for file storage, and a more effective management strategy based upon an open, standards-based scheme of self-describing data and access frequency-based data migration. In terms of finding space to store all of that exploding data, there are, on the horizon, a number of additional technologies that promise to expand the areal density limits of magnetic storage dramatically. These include
The point of this brief survey is that new technology that scales well beyond the limits of magnetic disk is only a few years away. While it might sometimes seem appealing to do as little as possible to solve our current storage problems in anticipation of limitless storage space on a sugar cube, practical necessity dictates otherwise. Many organizations are already at a crisis point when it comes to unmanaged storage costs, and they need solutions now; hence their willingness to try half-baked technologies like FC SANs. However, the promise of new technologies should not become an excuse for a whole new generation of IT professionals to unlearn the hard lessons about storage management that are being foisted upon organizations today.

While higher density, faster access storage might forestall some of the issues around data provisioning, this technology does not address the second tower of storage pain: data protection. Today, an obscene amount of mission-critical data remains at risk. Despite numerous events that have pressed data vulnerability into the forefront of business and IT thought, very little has actually been done to rectify the situation. Case in point: within the past year, I had the opportunity to chat with a storage manager for a U.S. federal government agency responsible for printing all of the checks for civilian agencies and departments. The fellow noted that not one of his hundreds of servers had ever been successfully backed up. This was particularly disconcerting because of the close proximity of his data center to the Pentagon, which was targeted by terrorists in the infamous September 11th attacks. Said the manager, had the plane diverted its course only a few degrees and flown a few more miles to where his data center was located, the U.S. government's ability to pay employees, service providers, and others would have ceased to exist. [6]

For its own part, the federal government has produced only a weak mandate in the area of data protection (as opposed to data security and long-term retention, as discussed below). In response to the attacks on the World Trade Center, several financial agencies did convene a panel to look into the efficacy of mirroring arrangements as a disaster protection measure. They discovered that those organizations that had established storage mirrors across the Hudson River certainly weathered the attacks better than those that didn't, but they were rightfully concerned that the location of mirrored data centers, within a 30-mile radius of a "target-rich" environment like New York City, left them susceptible to the geographic reach of other types of attack scenarios. Just when it appeared that the Office of the Comptroller of the Currency, the Securities and Exchange Commission, and other agencies involved in the panel were going to impose some significant distance requirements on backup facilities, thereby placing a burden on storage vendors to develop some real data protection strategies, they backed off the issue completely, stating that they did not have the authority to prescribe distance requirements. [7]

While the legal mandate for data protection in the context of disaster recovery continues to be weak, this is not the case with data protection from the standpoint of security and privacy. Regulations and laws, born of financial scandals and concerns about healthcare patient privacy, are today exacting a toll on organizations from the standpoint of storage security provisioning.
Dealing with storage security, as discussed in this book, will require the adaptation of medieval security techniques to an entirely new threat paradigm: one in which the bad guys may not be interested in the contents of the castle, but only in the pleasure of vandalizing the castle itself. In the new millennium, you no longer need a motive to do bad things. For many computer criminals, the answer to the question of their motivation is simply, "Why not?"

Among the latest developments in the fast-moving world of network security, developments that may find their way into the realm of storage security as well, are technologies like Invicta Networks' Variable Cyber Coordinates (VCC). [8] This technology, patented by a former KGB major and cryptography expert who defected to the United States in 1980, is simple in concept. Basically, it provides security for a network connection by making it "invisible" to would-be eavesdroppers. This is done by rapidly changing the logical network addresses of the communicating end stations. The core of the technology is an algorithm (which is also claimed by BBN Technologies) deployed at each of the communicating endpoints that changes addressing information many times per second; a simplified sketch of the idea appears at the end of this section. Currently, the algorithm is implemented as a proprietary system that includes a secure network card that must be installed in each communicating system, a secure gateway that must be installed in each LAN, and a security control unit that is used to implement and manage the algorithm-based protection itself. The approach sidesteps notions such as secure operating systems, firewalls, and payload encryption: techniques that have seen billions of dollars in research and development investment but produced little meaningful return in light of increasing incidents of computer crime. Even skeptics seem to be warming to the idea because of its simplicity and the fact that it eliminates the difficulties associated with firewall customization and key management.

It remains to be seen whether innovations such as VCC will make a difference in how we secure the data assets of our organizations going forward. For now, the key issues confronting storage security are less about technology than about training, and about the cultivation of data management as its own profession.
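To make the address-hopping idea concrete, the sketch below simulates how two endpoints might derive a synchronized, pseudo-random sequence of logical addresses from a shared secret and the current time slot. This is only a conceptual illustration, not the actual VCC algorithm, which is proprietary; the shared secret, hop rate, and address pool size used here are invented for the example.

```python
# Toy illustration of address hopping in the spirit of Variable Cyber
# Coordinates: both endpoints derive the same pseudo-random sequence of
# logical addresses from a shared secret and the current time slot, so each
# peer always knows where to reach the other, while an eavesdropper without
# the secret sees only a rapidly changing, apparently random pair of
# addresses. All parameters below are hypothetical.

import hmac
import hashlib
import time

SHARED_SECRET = b"example-shared-secret"  # provisioned out of band (assumed)
HOPS_PER_SECOND = 10                      # "many times per second" (assumed)
ADDRESS_POOL_SIZE = 2 ** 16               # size of the logical address space (assumed)


def current_slot(now=None):
    """Return the index of the current hop interval."""
    if now is None:
        now = time.time()
    return int(now * HOPS_PER_SECOND)


def address_for_slot(endpoint_id, slot):
    """Derive the logical address an endpoint uses during a given slot."""
    msg = "{}:{}".format(endpoint_id, slot).encode()
    digest = hmac.new(SHARED_SECRET, msg, hashlib.sha256).digest()
    return int.from_bytes(digest[:4], "big") % ADDRESS_POOL_SIZE


if __name__ == "__main__":
    slot = current_slot()
    # Both peers compute the same mapping from the same secret and clock.
    print("slot", slot,
          "host-A address:", address_for_slot("host-A", slot),
          "host-B address:", address_for_slot("host-B", slot))
```

Because both peers compute the same mapping from the same secret and synchronized clocks, no per-session key exchange is needed at communication time, which is roughly why proponents claim the approach sidesteps conventional key management.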