Here is some fun reading for you
http://cyber.com/details.php?id=22&section=detailpapers
http://www.claymania.com/unix-viruses.html
http://www.cybersoft.com/whitepapers/paper...orks_print.html
http://www.kernelthread.com/publications/security/vunix.html
(Oh, and by the way, Ubuntu is the most insecure of all the popular Linux distros, and I am very sure the Apache contributors will love to hear they are now part of Ubuntu.)
You can run Windows perfectly well without a virus scanner and not have to worry, if you use your common sense - just like with Linux. It's just better to have protection up for those cases in which something does happen. I remember catching one of those Windows remote-exploit viruses a day before it was reported in the wild and patched; since you couldn't do much against its spreading, I decided from then on that patching your system and keeping some security up isn't too bad.
Which proves you know nothing about what I claimed except what you want to think.
I DID NOT SAY that there are no Linux viruses. There are loads of them; the very site I referenced, which you apparently did not read, lists a lot of them. What I claimed was that, by running a reasonably current Linux system, one does not get viruses unless one is stupid enough to run an untrusted application without making sure (i.e., through a virus scanner) that it is 'clean'. Most talk about viruses on Linux being as dangerous as on Windows is FUD, a lot of it generated by McAfee, Symantec and other 'computer security' companies. Two of the papers you quoted were written by professionals of one such company, so there's an obvious conflict of interest here; the other two are agreeable and don't say anything that justifies your point that 'Linux is safe because nobody uses it'. There is malware on Linux, but it spreads through user stupidity. Almost all of it will wreck your system only if you're dumb enough to actually run it as root. Virus scanners are useful if you're downloading random crap off untrusted sources and running it. But the notion that Linux systems are in danger merely for being connected to the Internet is pure, grade-A FUD. Not even Windows systems are in significant danger merely by being connected to the Internet after SP2 (Outlook, Office and IE being the biggest attack gates; not using them will prevent most attacks on Windows, but not all). And again: name one Linux worm that survives in the wild without needing user interaction.
I agree that the nonexistence of spyware for Linux (spyware is written specifically for a commercial home-user audience) is due to Linux having few desktop users, but there is a big untapped market for spyware among Firefox users, who still seem to be immune to spyware - simply because Firefox does not, by default, enable anything that allows software to infect the system and be installed on it.
Windows can reach the security qualifications we expect of Unix systems, but this requires a lengthy configuration process that is beyond most users, and some Windows applications are reported not to run very well without superuser privileges. Windows NT was notorious for being able to reach C2 security certification according to the NSA (which is what most secure Unix systems have) - once it's configured, and as long as it's not connected to a network.
I've run Windows XP without antivirus software for years, merely by not using Internet Explorer or Outlook. Most users are absolutely clueless about computer security, however, which is why the vast majority of viruses spread through social engineering.
If you do not download untrusted stuff you will never be attacked? Google is not trusted, so you do not surf to Google, etc. - because every website you go to is downloading and running something. You do not download unchecked emails, etc. Really, your statement that Linux is safe if the user does not download and run untrusted stuff also makes Windows safe. Have you ever run Windows servers in high-security mode? Have you adjusted all of Windows Professional's settings to make the administrator account inaccessible and give each program a different user id with only a limited set of permissions?
You even admit to needing a virus scanner on Linux; Windows needs a virus scanner. To think, however, that the practice of having a 'root' or 'administrator' account protects you is idiocy: if a human can crack root, a virus can be designed to do it in half the time (and this has been done).
Want a recent example of a wild virus attacking Unix-based servers? A multi-OS virus infected the Belgian military's HTTP server (Unix) and the Belgian military intranet server (a tailor-made Microsoft server), and gave administrator/root access to someone who then utterly destroyed those two servers. After the error was examined, it was corrected and shared with NATO allies, who found out several of their servers were infected too.
These servers are of course not highly secured like our tactical servers, but they are well secured compared to those of private companies. The reason they got targeted: military servers are just really attractive, and attacking them causes fear.
I do not say Linux is safe because nobody uses it. I know how many attacks a military server gets a day, and I know how many hotfixes the techs apply to fix them (and those fixes are not even given back to the Linux community). Windows viruses are simply noticed more.
Have you read the second edition of 'The hackers guide to linux'? It is quite detailed on securing Linux from viruses and attacks, and most of it consists of using third-party software, free or not. I bet if all open-source projects were to order distros to remove their programs from their CDs, you would be surprised how much of what you think of as the OS is actually third-party software delivered with it - something Microsoft is not allowed to do due to monopoly laws.
And I never commented on the security differences between IE and Firefox. I thought IE6 was crap and used Firefox for quite some time; I also think IE7 is quite good, and in some senses better, as it supports more than Firefox.
Once I finish my tour of duty I will, however, be sure to ask my professor for the names of wild data-mining worms that affect Linux. I still have a whole semester of 'securing unix' to go when I return home.
And I included those two biased papers to balance your biased paper.
And note that Office and Outlook are not an OS; they do not belong to Windows, so their security issues are not those of Microsoft Windows. To argue about their issues is the same as arguing that a weapon maker is responsible for a faulty bullet ruining your gun.
Anyway, you have already conceded that Windows is not so bad. I suspect this is all you will be willing to concede for now.
You quite clearly know nothing of how the Web works. A significant minority of websites run code locally; in a secure environment, that code has no access to local files unless the user explicitly grants it. This is not the case in certain versions of IE, but that is generally dependent on the user allowing ActiveX controls to run. By 'download untrusted stuff' I mean actually downloading a binary file, unpacking it, changing its permissions so it's executable, switching to the root user (or being stupid enough to be running the root account anyway) and running it. By those standards,
sudo cat >bin <<EOF
sudo rm -rf /
EOF
...is almost a Linux virus (I'm not going to figure out how to write a quine in Bash for your benefit, sorry.)
> You do not download unchecked emails, etc. Really, your statement that Linux is safe if the user does not download and run untrusted stuff also makes Windows safe. Have you ever run Windows servers in high-security mode? Have you adjusted all of Windows Professional's settings to make the administrator account inaccessible and give each program a different user id with only a limited set of permissions?
> You even admit to needing a virus scanner on Linux; Windows needs a virus scanner. To think, however, that the practice of having a 'root' or 'administrator' account protects you is idiocy: if a human can crack root, a virus can be designed to do it in half the time (and this has been done).
You know nothing of the Unix security model. Obtaining privilege escalation to wipe out a system without user interaction is difficult on most modern systems, although the fact that sudo allows further sudo-ing for five minutes (or so) after successful input of a password is a possible liability; this attack window, however, is difficult to exploit, particularly as remote code execution vulnerabilities are rare to nonexistent, and usually rely on user-space software reading specially crafted files that cause a buffer overflow. Network software for Linux is generally very robust due to its maturity, with notable exceptions, however. Caveat sysadmin, as usual.
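As an aside, that grace window is a standard, configurable sudoers setting and can be switched off entirely. A minimal sketch (standard sudoers syntax; the five-minute figure is the traditional default, but defaults vary by distribution):

```
# /etc/sudoers (always edit via visudo):
# re-prompt for a password on every sudo invocation
Defaults timestamp_timeout=0
```

Alternatively, running `sudo -k` drops the cached credentials immediately, which is a reasonable habit after finishing a privileged session.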
> Want a recent example of a wild virus attacking Unix-based servers? A multi-OS virus infected the Belgian military's HTTP server (Unix) and the Belgian military intranet server (a tailor-made Microsoft server), and gave administrator/root access to someone who then utterly destroyed those two servers. After the error was examined, it was corrected and shared with NATO allies, who found out several of their servers were infected too.
> These servers are of course not highly secured like our tactical servers, but they are well secured compared to those of private companies. The reason they got targeted: military servers are just really attractive, and attacking them causes fear.
1) Was that autonomous malware, or a directed assault?
2) What software package was affected? Saying 'linux' is really vague.
3) Were only NATO systems affected?
4) DoS and DDoS attacks are not always caused by viruses or related to them in any way, and no system is wholly immune to this kind of thing.
As always, never attribute to malice that which can be satisfactorily explained by stupidity.
> I do not say Linux is safe because nobody uses it. I know how many attacks a military server gets a day, and I know how many hotfixes the techs apply to fix them (and those fixes are not even given back to the Linux community). Windows viruses are simply noticed more.
You do realise that not releasing said fixes is immoral, and possibly in violation of the GPL, depending on certain details?
Also, a successful worm that brought down 5% of Linux server boxes could cause thousands to millions of dollars in damage and quite possibly wipe out important sections of the internet. I bet that'd be noticed; however, there hasn't been a single instance of that happening, while we have heard of several highly damaging pieces of Windows malware spreading and causing damage.
*Bzzzt!* WRONG, but thank you for playing.
Microsoft is not allowed to distribute its own software along with the operating system. Distributing third-party software (say, Acrobat Reader, Flash Player or what have you) is (caveat: IANAL) fine, and routinely done by OEMs.
Also, Linux is a massive collection of disparate software packages, and nearly all distributions are 99% 'third-party'.
> And I never commented on the security differences between IE and Firefox. I thought IE6 was crap and used Firefox for quite some time; I also think IE7 is quite good, and in some senses better, as it supports more than Firefox.
> Once I finish my tour of duty I will, however, be sure to ask my professor for the names of wild data-mining worms that affect Linux. I still have a whole semester of 'securing unix' to go when I return home.
You keep talking about those worms, but I still have no reason to believe they're anything but fictional. Vague information on how horribly dangerous they are doesn't count, either; there are lots of worms out there that attacked vulnerabilities that had been known and patched for months at the time of the worm's deployment. Yes, attacks on specific servers do happen, but you don't get what happens with Windows - Massive damage caused by zero-day unpatched attacks, for example. Of course some types of Windows malware (Like spyware and botnet software) don't spread into Linux because there is a commercial interest behind them targeting a mass of home users specifically, but Linux has more servers than Windows and seems to experience a fraction of the malware.
> And I included those two biased papers to balance your biased paper.
> And note that Office and Outlook are not an OS; they do not belong to Windows, so their security issues are not those of Microsoft Windows. To argue about their issues is the same as arguing that a weapon maker is responsible for a faulty bullet ruining your gun.
If the weapon maker made the ammo, the argument is obviously valid. Outlook (Express) and Internet Explorer come bundled with Windows; IE is effectively part of the OS shell; and Office is the standard productivity suite in most Windows deployments. I include them in Windows security because a majority of Windows deployments run those packages as well, just as one compares Apache and IIS when comparing Linux and Windows server security. Bare operating systems running no applications are secure. That didn't stop the Blaster worm, however (which I remember getting... oh, the pain). There are no comparable worms for Unix systems.
Windows is bad for many, many reasons and security is merely one of them - It CAN and SHOULD be secured against MOST infections (Even though things like the aforementioned Blaster worm would probably still happen to many, many supposedly 'secure' systems) but ISN'T, due to very, very bad factory settings (A problem which SP2 mitigated).
I was thinking more of not opening spam mail or visiting obscure sites. True, though: people you first put in front of a computer have an uncanny ability to believe everything; then this nice popup that says "you have a computer problem, fix it with our free software" comes by, and they click it. Hell, I admit that when I first used the internet I fell for that - luckily it didn't work on Netscape back then.
Maybe another reason why Windows is more "unsafe" is that there are a lot more people using it who aren't really computer savvy? And does Microsoft do enough to inform people of all the dangers of the internet, or isn't that their job? I know my ISP lately includes whole info packages when you first get a connection, plus the option to get computer lessons if you'd like, which seem to be rather popular. Maybe it's more the job of the ISPs to inform people? *attempt to derail an already derailed thread into further discussion*
No, it's the task of malware writers. If you keep losing all your files and still haven't learned not to be gullible, then the loss of time and energy is pushing you out of the gene pool, which is what is supposed to happen to dumb people.

ETA: That was a general 'you', before any forum mods get trigger happy with the personal attack stick.
You know how I stayed safe? Same way a Unix user stays safe. Don't be an idiot.
I loved Firefox's tabbed browsing. It's why I used it. I found it incredibly annoying how non-functional it is. It's 100% intolerant of common HTML errors. Yahoo does not function on it. My school's website does not function on it. .wmv compatibility is buggy. Real Player tends to crash the thing. Shockwave opted not to retool their software for Firefox. The memory leak is atrocious on older computers. Quite simply, Firefox was an inferior piece of software with a very convenient feature.
Microsoft vulnerability is a myth.
I'm not sure I would call it a myth; however, Microsoft does have more users who don't update and don't have generally safe surfing habits. The same goes for Sym(something)-based smartphones, which have tons of viruses for them as well (75% worldwide market share, 13% US market share).

It was supposed to be funny.
Intolerance of errors is a good thing, you know


It is not. Intolerance of errors is what makes otherwise decent information inaccessible. Intolerance of errors is seen as a flaw in every piece of software I know. Extending the same principle to HTML because people who forget to add a closing tag onto their code are inferior programmers is just silly.
And seriously, have you looked at who sits on the W3C? Go look. You have two people with competing proprietary interests, one person who represents a company with a real interest in seeing Internet standards evolve beneficially, and... a whole bunch of people who represent large IT companies with minimal stakes in the issue. Basically, 10% of that board has a real interest in making the project work. It makes more sense for phone companies to get together and work out the .mobi protocol than it does for that poor soul from Nokia to try to talk to people from Boeing and IBM. It makes more sense for major browser programmers to listen directly to webmasters, then turn around and work with OS people.
The reason we have standards and standards committees is so that all browser developers can be on the same page about things. It's meant to make information accessible; if you have 5 different browsers, each supporting a different set of features, then writing websites is five times harder. Almost all pages developed with CSS need special, ad-hoc, essentially broken code to work with IE6, and many pages that worked in IE6 are now broken in IE7 in creative and interesting ways. The reason is that IE disrespects the standard in ways that make IE more complex internally, and break its CSS support.
Firefox isn't intolerant of errors - it won't refuse to display information - but it obeys the standard. This is a good thing for the 99% of webpages which are coded correctly, or that at least work according to the standard - forgetting to close body and html doesn't break the webpage, for example.
Standards are a boring, annoying, but viciously necessary evil. If you want an example of what happens when there is no standardization, look no further than your mobile phone - Namely, its charger. Different makers of cell phones use totally different chargers, and different models use different chargers as well as different voltages and pin settings - A mess. This means that vendors can force you to buy an excessively priced charger (For your car, or so you can have a second one to leave at work to charge it, or whatever) from them only, or risk frying your phone with poorly-made third-party replacements. It also means that you can't use someone else's charger if your mobile phone is dying, and that public charging machines are difficult to build, difficult to maintain, probably unavailable and likely not to work with your brand-spanking-new mobile phone.
XHTML is a markup language. It is made to display content in a way that is consistent across a variety of display devices, in accordance with the capabilities of those devices, with a focus on displaying the same content while presentation adapts to different devices. Webpage developers should be able to reasonably expect that, regardless of how they evaluate and display the page, all the different browsers and display engines on one class of device will correctly display the content, degrading gracefully when they can't display a class of content (as is the case with text browsers, for example, which can't display images) according to predefined, universal rules. When browser developers break this compatibility, that expectation fails, and web developers need to do extra work to compensate. This leads to incompatibility and to large amounts of content being inaccessible to numerous users... in short, something not entirely unlike the early days of the web, when Netscape and Internet Explorer were still on a relatively level playing field and pages were made for one or the other.
Besides, extensions to HTML created by browser developers are almost never a good idea -

> It was supposed to be funny.
We know how to turn things into a vicious cycle of hate.

Why?
Because the buttons are prettier.
And for me, it's easier to navigate.
I don't know about the Firefox leaks. I do know every Microsoft product has a small glitch: running a server or leaving a computer constantly running with a Microhell OS, or with any MS program open on the desktop, is a bad idea. Microsoft products have a tendency toward decaying code... after about three days to a week, you'll have to reboot a server running on Windows or the like, to avoid locking up and crashing.
Every time I've run IE, I've run into an incompatibility issue with different pages, repeated broken code, and I'm constantly having to run virus/spyware scanners.
Just personal experience and Cisco class training, don't know if it helps.
But Firefox ftw, because of the pretty buttons.

Another reason why I think standardisation is good, and one people often don't think about, is devices like readers for people who are blind. I did a course on Linux a couple of years back and there was a blind person in my class. He was great with computers despite the fact he couldn't see, since he had a special laptop setup with a braille keyboard and a reader program. He often complained that he couldn't access sites because he used Firefox on Linux and badly coded sites messed up - sadly, even sites he needed for assistance with his handicap. That really got me thinking.

And Veri: In a site with a table that doesn't have explicitly closed rows (most don't), forgetting some combination of body and html closing tags (I forget if it's missing both, or one or the other) does indeed prevent Firefox from displaying the page. Firefox will even be so kind as to tell you this when it refuses to display the page. Is it sloppy coding on the part of the programmer? You bet. Does that make it any less of a shortcoming of Firefox that it's less fault-tolerant than most of its competition? You tell me.

The one that the authority declares as the standard. Otherwise, you have as many standards as there are browsers. A program that fails to properly conform to the standard is either misdesigned or buggy. If it was designed from the ground up with no intention of conforming to the standard, then it's not what it purports to be - software that displays HTML but not according to the W3C standard is not a web browser.
> And Veri: In a site with a table that doesn't have explicitly closed rows (most don't), forgetting some combination of body and html closing tags (I forget if it's missing both, or one or the other) does indeed prevent Firefox from displaying the page.
> Firefox will even be so kind as to tell you this when it refuses to display the page. Is it sloppy coding on the part of the programmer? You bet. Does that make it any less of a shortcoming of Firefox that it's less fault-tolerant than most of its competition? You tell me.
When writing an XML handler (Which is what HTML display engines are at their core), it makes the code viciously simpler (And therefore faster, cleaner, easier to modify and maintain) if you don't write additional code to handle viciously broken XML. Is the additional cost of complexity really worth it to cater to the minute audience of people writing really, profoundly broken XML? Generally, if you can look at the source and understand what it's supposed to be, then Firefox will display it 'correctly' - If you can't, then you can't hope that Firefox will have a better guess than you.
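To illustrate the difference, here is a hypothetical sketch using only Python's standard library (nothing Firefox actually runs): a strict XML parser gives up on broken markup immediately, while a lenient HTML tokenizer makes a best-effort pass over the same input.

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

# Broken markup: two <p> tags, nothing ever closed.
broken = "<html><body><p>Sample text.<p>Never closed."

# A strict XML parser rejects the document outright.
try:
    ET.fromstring(broken)
    strict_parsed = True
except ET.ParseError:
    strict_parsed = False

# A lenient HTML tokenizer walks through it and still recovers the tags.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(broken)

print(strict_parsed)   # False: strict parsing failed
print(collector.tags)  # ['html', 'body', 'p', 'p']
```

The lenient pass recovers a guess at the structure, but it is only a guess - exactly the point about not expecting the browser to out-guess the author.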
Writing simple, functional code instead of trying to compensate for the incompetence of other people is not sloppy coding - Nobody has ever written an interface that does what you mean, rather than what you tell it.
For the record, this snippet of HTML is among the ones that I tried:
    <html>
    <body>
    <p>Sample text.
    <p>Another paragraph of sample text; note how the previous paragraph was never closed. Horror, shock and horror.
    <table>
    <tr>
    <td>2
    <td>+
    <td>2
    <td>=
    <td>5

This HTML is wrong - wrong, wrong, horrifyingly, scarily wrong on many, many levels. But it 'works' on Firefox. Of course, as the complexity of pages laid out with tables (a horrifyingly bad idea in these enlightened days of CSS) increases, the chances that what the web designer intended and what he actually told Firefox to display will diverge increase. You can't expect poorly written code to be interpreted correctly at all, and should be thankful when it is. All sorts of problems can be avoided simply by running your HTML through a validator.
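In the same spirit, the obvious mistakes are cheap to catch mechanically. A toy unclosed-tag checker, sketched on top of Python's standard html.parser (a real validator checks far more than this, and the VOID set here is deliberately abbreviated):

```python
from html.parser import HTMLParser

# Void elements legitimately have no closing tag (abbreviated list).
VOID = {"br", "img", "hr", "meta", "link", "input"}

class UnclosedTagChecker(HTMLParser):
    """Toy validator: records tags that were opened but never closed."""

    def __init__(self):
        super().__init__()
        self.stack = []      # currently open tags
        self.unclosed = []   # tags implicitly closed by an outer end tag

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Everything opened after `tag` was never explicitly closed.
            while self.stack[-1] != tag:
                self.unclosed.append(self.stack.pop())
            self.stack.pop()

    def report(self):
        # Tags still open at end-of-input count as unclosed too.
        return self.unclosed + list(reversed(self.stack))

checker = UnclosedTagChecker()
checker.feed("<html><body><p>one<p>two</body></html>")
print(checker.report())   # ['p', 'p']
```

Feeding it well-formed markup yields an empty report; the two dangling paragraph tags above are flagged immediately.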