Although I wish this weren't true, the "security problem" may never be completely solved until governments worldwide enact civil and possibly criminal penalties for two groups that are largely ignored by today's cybercrime laws: the vendors who ship insecure software and the users who deploy systems insecurely. I will never forget the famous quote by technopundit Robert X. Cringely:
If automobiles had followed the same development cycle as the computer, a Rolls Royce would today cost $100, get a million miles per gallon, and explode once a year, killing everyone inside.
Today, organizations are deploying systems with known flaws and insecurities. Although the law is gray on this matter, most locales have not imposed meaningful penalties on either the purveyors of insecure software or those who knowingly deploy software in insecure ways. When you accept a software license today, you are "agreeing" that the software vendor disclaims all warranties that the product will work and will not cause harm to your system. You would never accept such provisions if they applied to something like an automobile, yet you routinely do so with software. Software vendors have lobbied governments hard to preserve these license agreements under the banner of innovation: if vendors are liable for flaws, they say, innovation will suffer.
Although I think this argument is meant to instill fear more than anything else, I suspect that, if asked, most organizations would gladly settle for not upgrading their web browsers and e-mail clients for a couple of years while software vendors figure out their security.
Likewise, if you deploy a network in a completely insecure way, you should have some liability when that network is used to cause harm to others.
All of these proposals come back to deterrence. The reason violent crime isn't rampant in most parts of the world (setting aside theories about the inherent goodness of human nature) is that if you commit such a crime, there is a high likelihood that you will be caught and severely punished. Even though the windows of my house could easily be broken and a burglar could steal from me, I don't spend much time worrying about it because my town has good police protection and my neighbors keep an eye out for one another. In addition, I have insurance that protects most of my physical property.
Computer security does not enjoy the same benefits today. This is why organizations spend so much time trying to keep every attacker out: they know that a successful attacker would be very difficult to catch. Laws that target the producers of insecure software and the implementers of insecure configurations could somewhat mitigate this problem.
Tax incentives should also be considered, rewarding good security rather than only penalizing bad security.
I don't like this any more than you probably do. The idea of lawyers and lawmakers fixing computer security does not excite me, primarily because my experience lies in using technical controls to mitigate the need for regulation. Although there are counterarguments to these points, the subject has not yet received the attention it needs in public debate. Any such approach is rife with issues, ranging from the difficulty of writing technically accurate laws to the impact these laws might have on the open source community. The industry must have meaningful debate on these issues rather than dismissing the idea because it isn't in a particular company's best short-term interests. A news article related to this subject is available at the following URL: http://www.eweek.com/article2/0,4149,1498436,00.asp.