That's a fallacy that I'm frankly tired of hearing. Unix was designed from the start for multi-user environments, whereas Windows grew out of DOS, which was initially built for single-user, non-networked environments. The difference in their initial goals led to wildly different security models.
The average Unix user runs without superuser privileges most of the time. Typically, the less experienced a user is, the fewer privileges he has in a Unix environment [1].
At least as recently as Windows XP [2], the average Windows user ran his computer with an administrator account on a daily basis, which opens the user to much greater damage from malware. When I used Windows on my own computers, I always set up a non-administrator account for daily use, but I'm experienced enough to know (1) how to do that and (2) that it's a good idea. This suggests that Windows has an inversion of privileges compared to Unix. That is, the most experienced users grant their daily account the fewest privileges, whereas the least experienced users operate with administrator privileges.
[2] Windows XP is the latest version of Windows that I've had enough experience with to say what the average user's setup is like. I hear that the situation has improved a bit with Windows 7, but when I was an intern at Microsoft, everyone seemed to run as an administrator on their Windows 7 machines, so I'm not convinced that it's any better.
The idea that viruses need "superuser" to perpetuate themselves is itself a fallacy. Why do I want superuser if I can grab all your browser cookies, dump or exploit your address book, persist in ways no normal user can detect, and gain full access to the network you're connected to?
I'm not a Windows user. Since age 13, I have spent a total of one (1) year in Windows, in 2000, when I ran a Solaris to WinAPI ACE_wrappers port for my startup. I cut my teeth on 386bsd, installed from approximately 900,000 3.5 inch floppy disks.
What I am is a security person, and these arguments about Windows being a petri dish for viruses strike this security person as BS. Computers are a petri dish for viruses, and the smug Unix weenie attitude of "we solved that with su" drives me nuts even before we get to analyzing how long any Unix operating system has ever gone without a well-known privilege escalation flaw.
How will you infect an executable without superuser privileges? My executables in /bin and /usr/bin are r-xr-xr-x. If you're not infecting files on the filesystem, then what you have is not a virus [1]. Without a virus, you're left to exploit bugs in userspace software. If you have a way to exploit Chrome to read my cookies, how is that a virus and what does that have to do with the OS? I would expect that exploit to work on any platform that runs Chrome.
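The permission point can be made concrete with a short sketch (a temp file stands in for a system binary; everything here is illustrative):

```shell
#!/bin/sh
# Minimal sketch: a binary with mode r-xr-xr-x (octal 555), as in
# /bin and /usr/bin, carries no write bit for anyone, so an
# unprivileged process cannot append or overwrite code in it.
# (Root bypasses file modes entirely, which is the whole point
# of keeping malware out of the superuser account.)
bin=$(mktemp)
printf '#!/bin/sh\necho hello\n' > "$bin"
chmod 555 "$bin"                # r-xr-xr-x
mode=$(stat -c '%a' "$bin")
echo "mode=$mode"               # prints "mode=555"
rm -f "$bin"
```

(`stat -c` is the GNU coreutils form; BSD/macOS `stat` uses `-f` instead.)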
In regard to your first post about popularity, do you think that all of those Unix web servers out there are not a juicy target? How valuable do you think it would be to a virus writer to be able to infect Google's datacenter?
The rest of your comment is name-calling and self-congratulatory back-patting, which does nothing to present a cogent argument.
[1] Executables aren't the only files that can be infected. You could infect a user's PDF, JPEG, or other files that are then interpreted by a vulnerable executable.
(a) You don't need to infect executables. .profile works nicely.
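As a purely illustrative sketch of that point (run against a throwaway temp file, not a real ~/.profile, and with a made-up payload path): user-level persistence needs nothing more than append access.

```shell
#!/bin/sh
# Sketch: any process running *as the user* can append a line to the
# user's login script, re-launching attacker code at every login.
# No superuser privilege is involved anywhere. A temp file stands in
# for ~/.profile so this demo touches nothing real.
profile=$(mktemp)                              # stand-in for ~/.profile
echo 'export PATH="$HOME/bin:$PATH"' > "$profile"
echo '"$HOME/.cache/.helper" &' >> "$profile"  # hypothetical payload line
grep -c '\.helper' "$profile"                  # prints "1"
rm -f "$profile"
```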
(b) How valuable do you think it would be to a virus writer to infect Mastercard's data center? It isn't riddled with viruses.
(c) If you have a population that accounts for 80% of the market which is only 20% saturated and another that accounts for 5% of the market, why would you ever, ever, ever write for the 5% market? We haven't hit "peak oil" for malware yet.
(d) Your footnote makes my point. Thanks.
(ps) the congratulatory back-patting is to head off the inevitable Linux advocacy "you're a shill for Microsoft" BS that comes bundled with these discussions.
To be fair, I don't think the points we're arguing are mutually exclusive. You seem to be arguing that Unix can be infected with viruses. I'm not refuting that claim. I have no illusions of 100% security. I'm claiming that Unix is more secure by design. What I'm refuting is this claim:
> WinAPI is no more hospitable to viruses than Linux is.
I think that implies that all operating systems are created equal (at least as far as security is concerned for this discussion), or that Linux is more hospitable to viruses than Windows. I think the idea that all operating systems are created equal is laughably false. The second idea---that Linux is more hospitable to viruses than Windows---is a much more complex issue. Proving that there is at least one way to infect Linux with a virus does not prove that point. All that proves is that Linux's security is less than 100%, which I agree with (hence the footnote in my previous comment).
In other words, your argument:
S(Linux) < 100%
and my argument:
S(Linux) > S(Win32)
can coexist:
S(Win32) < S(Linux) < 100%
Your economic argument about OS market share is more relevant to your other claim:
> What it is is popular enough to be worth targeting.
Your economic argument proves that claim. I agree. However, being less popular doesn't preclude Linux from being less hospitable to viruses.
> (ps) the congratulatory back-patting is to head off the inevitable Linux advocacy "you're a shill for Microsoft" BS that comes bundled with these discussions.
Fair enough. Those "you're a shill for Microsoft" type comments do have a tendency to show up in discussions like this. I like a high signal-to-noise ratio in conversations, which is why I called you out on that, but now I see you were trying to keep the content-less comments out as well.
You just restated the previous threads and added some notation, but provided no new evidence to support the argument that Linux is more secure than WinAPI by design. What do you want me to do with that, restate all my arguments again? That seems like a waste of time.
We're talking about the security of single-user machines --- of which most servers are a special case. The perceived significant difference between the two platforms simply isn't there.
>(a) You don't need to infect executables. .profile works nicely.
Yep, low privileges only isolate viruses. A virus running as superuser can infect the entire system. A virus running as a low-privilege user can only infect what the user has access to. And all this applies equally to Windows as to Linux.
>(b) How valuable do you think it would be to a virus writer to infect Mastercard's data center? It isn't riddled with viruses.
Have you heard of Stuxnet and how it infected the control systems of a uranium enrichment facility?
Superuser used to matter for viruses that needed to escape detection (e.g. install themselves in the MBR, boot sector, kernel, and/or "embedding area" as grub calls it). Modern viruses are more likely to be targeting the data of users not experienced enough to know what a boot sector is, or why that fluffy_bunnies.doc is dangerous. Correct me if I'm wrong, but I believe a modern "virus" would've been traditionally referred to as a worm, as was the Sasser worm, since they're usually not infecting existing executable code.
No idea why this comment has so many upvotes. This is an awful comment because literally every single point in it is factually incorrect.
> Unix was designed from the start to run multi-user environments, whereas Windows grew out of DOS, which was initially built for single-user, non-networked environments. The difference in their initial goals led to wildly different security models.
No. [1, Section 2.2]
> The average Unix user runs without superuser privileges most of the time. Typically, the less experienced a user is, the fewer privileges he has in a Unix environment.
No. The most popular Linux distribution lets you run any command as any user by default. [2, Default Sudoers File]
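For reference, the stock /etc/sudoers on Ubuntu contains roughly these lines (reproduced from memory; exact comments vary by release):

```
# User privilege specification
root    ALL=(ALL:ALL) ALL

# Allow members of group sudo to execute any command
%sudo   ALL=(ALL:ALL) ALL
```

The first account created at install time is added to the sudo group, which is what makes "run any command as any user" the out-of-the-box behavior.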
> Windows has an inversion of privileges compared to Unix. That is, the most experienced users grant their daily account the fewest privileges, whereas the least experienced users operate with administrator privileges.
No. This isn't the case with the Windows that is shipping today.
--
One final thing to consider. Who cares about separation of privileges if your OS is full of privilege escalation exploits? Hint: one of these operating systems spent billions of dollars hardening their OS and the other is full of holes.
I too was a Microsoft intern and you're completely wrong. They ran as user level accounts with administrator escalation privileges. This is equivalent to running as a normal user account with sudo access in Linux.
I do in theory agree that a user (wrt Windows) shouldn't be running their machine with admin privileges; it's certainly how I do my day-to-day work in Unix environments. I tried running Windows with a watered-down user account, but I found that all it did was cripple my capabilities while hardly affecting those of the malware that would infect my machine.
About Windows and administrator privileges: it wasn't even the user's fault! I tried to run Windows XP as an ordinary user and it was really annoying: things would fail without error messages (let alone asking for admin privileges) all the time. Windows 7 is fine, though; it feels like Ubuntu to me. I can run it as an ordinary user and it prompts for an admin password when necessary (and the occasions when it's needed make more sense: in the XP era I often wasn't convinced programs really needed the privileges they asked for).
It's not just popularity. "Admin by default" and "easy-to-use over everything" is what doomed Windows. In *nix you always had to exploit bugs; in Windows you didn't have to. Nowadays Microsoft has built layer of abstraction over layer of abstraction to fix these previous decisions, but I think that such complexity has just made exploitable bugs more likely. Moreover, according to Secunia, in mainstream Linux distros every security bug gets fixed eventually. No such hope comforts Windows users.
See above. Why, besides vanity, does superuser matter to a virus?
The idea that Windows is harder to update than Linux will come as a surprise to enterprises who have been getting autoupdated fixes for almost a decade now.
Given that my hobby used to be exploiting various overflow exploits in Linux machines I agree with you, but do you think there are some things that Windows lagged on that hurt it? For example, there are a couple things like ASLR, NX/W^X bit, and stack canaries that I think they should have rolled out sooner. Do you think that made a difference or were SQL injection et al so easy by then that there was no point in bothering with overflow attacks if your goal was to make money and get information?
Edit: Ah, and I forgot: Windows ACEs are pretty much as good as NFSv4 ACLs, but Linux still doesn't support anything other than basic POSIX.1e ACLs out of the box.
My perception is that at WinXPSP2, where Microsoft finally got serious about runtime protection, the state of the art in mainstream Unix deployments was not that much better. How resilient was Solaris to overflows in 2003?
I think you're right; most UNIX installations weren't that much better (Linux especially). I think Solaris may have been one of the best simply because they were running on SPARC procs and the SPARCs have had optional NX support since '98 or so. That still relied on the admin enabling it though (so basically no one had it enabled).
AFAIK, NX support was added to 32-bit Linux and Windows XP SP2 around the same time. Unfortunately, it required PAE. XP's bootloader could autodetect and automatically load the PAE kernel; Linux's bootloaders couldn't, and so most people still ran with the non-PAE kernel and were thus not taking advantage of NX. This was made worse by the fact that Intel's early Pentium M processors did not have PAE. Finally, years later, some Linux distributions added auto-detection to their installers.
While I agree on all OSes being exploitable and therefore on despicable "smug Unix weenie attitude", I still think that while *nix were just exploitable, Windows "welcomed" viruses (think about ActiveX, autorun, administrative privileges by default and such).
There is a misunderstanding about updates. I was not saying that Windows is less updated, I was saying its security holes remain exploitable for a longer time, if they get fixed at all. Compare Windows XP (read section "Most Critical Unpatched"):
The other thing the Windows ecosystem has that makes it more hospitable to executable file viruses is a culture of user-to-user sharing of binary executables. In the UNIX world, sharing source is the usual vector for copying programs user-to-user.
This was particularly true back when executable file viruses were at their most prolific - back in those days, if you copied a game from your friend at high school, that binary was quite likely to be several tens of generations removed from the original source. Each generation was an opportunity for a virus to climb aboard. With internet distribution of illicit wares, you're much closer to the original source.
> and if it would've had a plural, it would've been 'viri'
That's not immediately clear. Depending on whether you think it's second or fourth declension, and masculine or neuter, the various possibilities are virua, vira, virūs, and viri. My best guess is that the "correct" usage was one of the more exotic variants (virua or vira), but the word was so rare that many people didn't learn the nuances, and instead adapted it to the more common patterns. Similar to how "begs the question" is often used incorrectly, and that ends up becoming an accepted usage.