


why does the internet suck?



of course you know that the internet sucks, but do you know why it sucks? well, let me tell you...



uninformed sysadmins

this shouldn't be an issue, but many sysadmins (and programmers too) were never properly trained on what they should and should not do with their machines. unfortunately, there are no training requirements for running a server (or a computer, for that matter). the result is a lot of improperly configured computers and software out there, just waiting for an attacker to exploit them.
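
to make that concrete, here's a minimal sketch (in c, with a made-up port number) of one of the most common misconfigurations: binding an internal-only service to every network interface instead of just the loopback address. one wrong constant and a private admin tool is suddenly reachable from the whole internet.

```c
/* a minimal sketch of a classic misconfiguration: binding an
 * admin-only service to every network interface instead of
 * loopback. the port number and service are hypothetical. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(8500);            /* hypothetical admin port */

    /* BAD: INADDR_ANY exposes the service to the whole internet */
    addr.sin_addr.s_addr = htonl(INADDR_ANY);

    /* GOOD: loopback only -- reachable just from this machine */
    /* addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK); */

    if (bind(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        close(fd);
        return 1;
    }
    listen(fd, 16);
    printf("listening on port 8500\n");
    close(fd);
    return 0;
}
```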



lack of security

a lot of the problems we have on the internet -- spam, hackers, viruses -- can mostly be attributed to a lack of security. designing a product to be more secure often makes it slightly harder to use, and given the choice, many people will trade away tighter security for ease of use. people don't care about security until someone has stolen their account passwords, their credit card information, or maybe their identity.



old / poorly-designed software

there are many software programs currently in use with serious security holes, and attackers will often chain holes in several programs together to pull off a successful attack. there are hundreds of ways to anonymously send an email, launch an email virus, launch a denial-of-service attack, or break into a server. in most cases the software under attack was simply never designed to handle every possible input it might receive. feed it bad data and you can trigger buffer overflows (writing more data than a buffer can hold), race conditions (timing two processes just right to create a conflict), boundary errors (values just outside the range the code expects), and so on. flaws like these can let an attacker inject and run code of his own -- code that launches attacks, opens backdoors, sends spam, etc.
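
here's a minimal sketch of the classic buffer overflow mentioned above. the function names are made up for illustration; the unsafe version copies attacker-supplied input into a fixed-size buffer without ever checking its length.

```c
/* a sketch of a classic stack buffer overflow.
 * the handler names are hypothetical. */
#include <stdio.h>
#include <string.h>

/* UNSAFE: strcpy() keeps copying until it hits a NUL byte,
 * no matter how small the destination is. input longer than
 * 15 chars overflows buf and corrupts the stack -- the raw
 * material of a code-injection attack. */
void handle_request_unsafe(const char *input) {
    char buf[16];
    strcpy(buf, input);           /* no length check! */
    printf("got: %s\n", buf);
}

/* SAFER: truncate to the buffer size and terminate explicitly. */
void handle_request_safe(const char *input) {
    char buf[16];
    strncpy(buf, input, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';  /* strncpy may not terminate */
    printf("got: %s\n", buf);
}

int main(void) {
    const char *evil = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA";
    handle_request_safe(evil);    /* prints a truncated string */
    handle_request_unsafe(evil);  /* undefined behavior: overflow */
    return 0;
}
```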



backwards compatibility

the basic problem with widespread change (on the internet and in the rest of the tech world) is the need for backwards compatibility with existing systems. many of the protocols in use on the internet today were designed 20+ years ago and have long-standing issues that need to be fixed. unfortunately, really fixing them would mean replacing them with a completely new set of protocols designed with security in mind from the start. of course, we can't just shut down the entire internet and fix everything over a weekend, so anything new we implement still has to talk to the older systems until everything has been upgraded (which would take "forever" in internet time).
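
to see why "still has to talk to the older systems" is a trap, here's a minimal sketch of the usual compatibility trick for a hypothetical protocol (the version numbers and the "secure" cutoff are made up): negotiate down to the highest version both sides support. a new, secure version silently degrades to the old, insecure one whenever the peer is out of date -- and the insecure code path has to stay alive as long as old peers exist.

```c
/* a sketch of version negotiation for a hypothetical protocol.
 * version numbers and the security cutoff are made up. */
#include <stdio.h>

#define SECURE_VERSION  3   /* first version with real security */
#define LATEST_VERSION  4

/* pick the highest version both peers understand */
int negotiate(int ours, int theirs) {
    return ours < theirs ? ours : theirs;
}

int main(void) {
    int peers[] = { 1, 2, 4 };  /* hypothetical peer versions */
    for (int i = 0; i < 3; i++) {
        int v = negotiate(LATEST_VERSION, peers[i]);
        printf("peer speaks v%d -> using v%d (%s)\n",
               peers[i], v,
               v >= SECURE_VERSION ? "secure" : "INSECURE fallback");
    }
    /* as long as v1/v2 peers exist, every server must keep the
     * insecure code path alive -- the backwards compatibility trap */
    return 0;
}
```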



massive installed base

even if we came up with perfect solutions to all of our computer problems, the sheer number of computers in production would severely limit how fast new systems could be rolled out. millions of machines, used by both end-users (clients) and administrators (servers), would need their software (and possibly hardware) upgraded before they could speak more secure protocols. that there are computers around the world still running Windows 98 (almost a decade old) says it all.



poor maintenance

some older software can't be updated or upgraded to a newer version because the company that built it is either a) no longer maintaining that product, or b) no longer in business. and if that software is crucial to the operation of your business, you're not likely to rip everything out and start from scratch. replacing that much software (and hardware) at once can introduce just as many security issues (and other bugs) as leaving the existing system in place. businesses hate doing it because it creates a lot of downtime, and downtime == lost revenue.



vendor lock-in

if you run a large operation with many software products spread across many machines, life is easier when all of those machines run similar software -- integrating software and data across them is simpler, which means more productivity for both your users and your sysadmins. however, if you need a feature that your software/hardware vendor doesn't support, you're forced to either wait for the vendor to add it or implement it yourself (possibly putting you in conflict with your vendor's support contract).



multiple attack vectors

many systems have multiple points of entry, and every one of them has to be defended. for example, a single server on the internet may be running services for email, web, dns, database, remote access, and various others. a flaw in one (or more) of those services, combined with any flaw in the operating system itself, can add up to a full breach. this is extremely common when you consider that many web sites are designed and programmed by developers who aren't security conscious and are usually forced to ship their product while being driven by timetables and deadlines.
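
one standard way to keep a flaw in one service from turning into a full breach is privilege separation: do the one thing that needs root (like binding a low port), then permanently drop to an unprivileged account before touching any untrusted input. a minimal sketch below -- the uid/gid values are made up, and real code would look them up by name with getpwnam().

```c
/* a sketch of privilege dropping for a network daemon.
 * the uid/gid (33) is hypothetical; real code would look it
 * up by name (e.g. getpwnam("www")). */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(80);              /* low port: needs root */
    addr.sin_addr.s_addr = htonl(INADDR_ANY);

    if (bind(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }

    /* root was only needed for bind(); drop it for good before
     * parsing any untrusted input. order matters: gid first,
     * then uid, and check both return values. */
    if (setgid(33) != 0 || setuid(33) != 0) {
        perror("drop privileges");
        return 1;
    }

    listen(fd, 16);
    /* ... accept() and handle requests as the unprivileged user;
     * a compromise here no longer hands the attacker root ... */
    close(fd);
    return 0;
}
```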



this is just a small list of the issues that surround the internet. i'm sure these problems will eventually be resolved, but i have no idea how long it will be before spam and hackers are considered a thing of the past.