I think this explains some of the disconnect: users on this site don't understand why people would actively choose iOS even with the lock-in. Most people do not want to use a product that is constantly updating and adding features that are going to be taken out shortly, for something as basic nowadays as web browsing. While a technology is new and cool, the average person likes rapid innovation. Once your tech is considered a commodity, the average person wants the equivalent of a toaster. Most people do not want to fiddle with their toasters or find out each day what new changes their toaster has. They want to put in bread and get out toast. For browsers, most people just want to be able to navigate to a handful of websites. The restrictions are a feature and not a bug.
You are making one mistake here: people actually do want a different toaster if it's significantly better than the current one that burns the edges too many times.
Chrome was able to outcompete everything else on Windows, despite Microsoft trying to prevent it. People actively installed it simply because it was better, and even the so-called stupid users were able to install it.
If this were possible on Apple devices, who knows? Maybe it would have been the end of Safari.
Users just want toast, but they do prefer great toast to mediocre toast and will spend some effort on getting it.
I agree broadly with your point but the subtleties have to be kept in mind.
The main difference between the Chrome/Windows situation and the current Apple one is quite significant, imo. Even though Microsoft tried their best to keep user share with IE, like sneakily changing the defaults, downloading and installing Chrome was always an option and, more importantly, didn't require Microsoft's approval. That's not the case with iOS/iPadOS right now. A Chrome-like app couldn't be downloaded, because Apple's not going to approve the app to be put on the App Store.
Also, from a user perspective, Safari is a good browser which manages to keep up with both Firefox and Chrome. This combination means that users don't really have that much of an incentive to switch.
To use your toast analogy, IE used to burn my toast to a sooty, carbonized block. Chrome used to do it perfectly. That's a pretty big differential in terms of performance. Now Safari isn't as bad as IE was in that era. So if it burns the edges every now and then, it's not really that big a deal simply because the user experience is more or less comparable to Chrome.
Users just want toast, but the difference isn't between great and mediocre anymore like it was with Chrome and IE. It's between very good and very good-ish.
This is precisely why I'm okay with Apple not allowing side-loading even though I would prefer greater choice. Apple's ecosystem isn't fundamentally as rotten as the Windows platform was in terms of performance. In fact, it's actually pretty good. Could things change? Sure. But as long as the user experience isn't shitty, I'm okay with the status quo.
No, they just focus on power consumption more than the other browsers do (they also have the advantage of not needing to work cross-platform, which makes this easier). This is often why Safari is behind other browsers on features: a focus on quality of implementation over quantity of features.
I also would want a source for that. At least in the mobile dev space, it has always been common practice for them to leverage private APIs to do things regular app devs can't.
> At least in the mobile dev space, it has always been common practice for them to leverage private APIs to do things regular app devs can't.
On mobile (iOS), yes. But power consumption comparisons tend to be done on desktop (macOS), because the other browsers' engines can't run on iOS at all. And I don't think Apple can enforce that API privacy on macOS, because unlike iOS they allow apps to be installed from outside the App Store.
Or, if you want a more uncharitable take: Google used its position in search, email, and video to keep nagging people to install Chrome in order to get performance benefits.
Is it really a feature to not be allowed to use non-WebKit Chrome on my iPhone, even if it might unlock superior features? Is it a feature that Google Fi can't work optimally on iPhone thanks to their anti-competitive restrictions? Is it a feature that I have to use Siri, which is way inferior to Google Assistant, when communicating in CarPlay?
Is the planned obsolescence a feature too? My 12-year-old Mac laptop is dead because macOS does not allow me to upgrade past a certain version, and the cryptography primitives do not update past that, so no recent browsers support the platform. My only recourse is Linux. In the future even this will not be possible, because mainline Linux doesn't support the new hardware, unless Asahi Linux really manages to take off.
I choose iOS because the hardware is good and _some_ of the software is good. I can't stand their business practices, and they don't actually get a lot of things right. They're just doing what MSFT wanted to do with their OS & software, except that they've monopolized the hardware too and somehow that's what protects them from antitrust.
> My 12-year-old Mac laptop is dead because macOS does not allow me to upgrade past a certain version, and the cryptography primitives do not update past that, so no recent browsers support the platform
The bigger issue for me is increasing requirements. My 2012 MacBook Air went to the dogs when upgrading to Catalina. Terrible user experience (which, I believe, is one reason why Apple sets requirements the way they do).
It was either Snow Leopard or Lion. Chrome and Firefox would install but were unable to visit https websites because of missing crypto primitives they needed. It was surprising to me too.
The average user doesn't really mind whether they're using WebKit or whatever stack is underneath. My family just wants to open Safari and browse, and they don't give a shit about any of Apple's business.
Right. They only care when it starts being noticeably slow/laggy, or causing other problems, or if a button/menu item they often use has changed position or been removed altogether. "I've been clicking this spot for years and it always works; now it's completely unusable!"
Take for example the iMessage feature that lets you see message timestamps. There are no visual affordances to indicate how to do it. My parents couldn't figure it out (pull the messages left!), so it's about the same as not having the feature. Wanna install an alternate SMS app? Good luck, Apple won't let you, because this isn't Android.
So sure, people won’t complain — they just accept that this is life, but life could be better if Apple didn’t have monopolistic behaviors.
That's usually because they have never used, or don't even know of the existence of, something better. The same thing was said 15 years ago about IE. The average user was fine with it until they tried Chrome (or Firefox/Opera before that).
Other than being a professional programmer, which I keep pretty compartmentalized from the rest of my life, I think I'm a pretty average computer and web user.
I certainly know other browsers exist and I have even used them. I still don't care; I just use what comes installed on the machine because, like I said, I don't care.
Hm, even my non-tech classmates in college knew the difference between Chrome and Safari. They care about speed and the features they get used to (e.g. the Chrome password manager follows you to your phone, syncs history across devices). To be an absolute non-user of such features, to the point of not caring, is not what I’d consider typical.
>Is it really a feature to not be allowed to use non-WebKit Chrome on my iPhone, even if it might unlock superior features?
Yes, it is for many people. They do not want to risk the chance of their tool not working for some nebulous chance at a better experience. The vast majority of people are not tinkerers at heart.
The rest of your questions can all be answered with "Yes", as long as the tool works as expected. We are already in the 1% of people just by configuring software in the first place.
The example of Chrome on Windows given above seems to belie this claim. The "nebulous chance at a better experience" you refer to was apparently enough to trickle down to the non-tinkerers. Who's to say the same wouldn't be the case for, e.g., Google Assistant vs Siri?
It’s not that here. I use iOS because I don’t want anyone embedding several bloated, half-baked, ancient, hole-ridden security-nightmare browser engines in their apps, which do everything possible to bypass system-wide network restrictions so they can carry out whatever bad behaviour their business model thinks is acceptable. I want one system-wide browser that respects the security configuration at the OS level.
I want this for myself and the surface area of our 500 or so staff.
It is hard for me to disagree, because I understand that use case. The issue, however, appears to be that there is no 'version' that allows for it for users who accept the risks involved.
It is not completely unlike the conversation with cops. In their perfect world, everyone would sit quietly at home with hands on the table. Life is messier than that, though, for obvious reasons. In other words, I understand the need to have some level of control, but some balance is necessary.
For those cases, there is Android. My youngest one has an android phone, and everyone else in our family has an iPhone. It is not like you are forced to buy an iPhone. Those restrictions are actually part of its market differentiator.
No. All that happy talk about differentiation and market forces only makes sense when there is actual competition. We have an effective oligopoly with no real choices between them. If there is no real choice, as consumers we have to force companies to adopt a more consumer-friendly posture (and yes, that means sideloading and all the dangers that entails).
An oligopoly is not a monopoly. You can argue that there needs to be more differentiation, not that there isn't any.
Some people want to sideload, and they can have an Android. But some other people are pretty much satisfied with not being able to sideload, the uniformity it brings, and the other perceived positive side-effects they see.
What did I say though >> "We have an effective oligopoly"
I did not argue we have a monopoly. I argued we have an oligopoly at best (if you count Pine and similar as viable candidates) and a duopoly at worst, which somehow manages to be worse than a monopoly for one reason and one reason only: monopolies are more tightly regulated. Try talking about regulating the current batch of market leaders and you will only hear 'private enterprise', 'if you don't like it, start your own', which completely manages to ignore the problem to begin with.
<< Some people want to sideload, and they can have an Android.
Hmm. Why does that statement somehow appear axiomatic, and why does each sentence fragment logically follow from the previous one? Why is it not 'some people want to use their purchase as they see fit, so with any device they pick they can do what they please, including "sideload"'?
<< some other people are pretty much satisfied with not being able to sideload, the uniformity it brings,
I would argue with that.
One. Not all users know it is an option.
Two. The existence of various workarounds to allow sideloading suggests otherwise.
<< the uniformity it brings
If there is one thing the world needs now, it is not uniformity.
<< other perceived positive side-effects they see.
The point of open OS is not to force its users to use unvetted apps. It is to give them an option to do so.
The argument that someone might want to use a restricted OS because they want to only use vetted apps is flawed. The reason is that users can use an open OS in exactly the same way, if they choose to do so.
>The point of open OS is not to force its users to use unvetted apps. It is to give them an option to do so.
And the point of a closed OS is that "giving the option" is as good as forcing the users.
Because third-party devs with clout (Facebook, Instagram, etc., and generally anyone whose apps users "must have", even if it's just their bank or the locally dominant rideshare or chat app) will, given the option, force users to install unvetted apps.
> "giving the option" is as good as forcing the users
From what perspective?
> third party devs ... will, given the option, force users to install unvetted apps
It is up to the users to allow e.g. the Facebook application to install additional applications. No one, and especially no app, should be able to force them to do so on an open OS.
If you refer to predatory practices where an application would disable some of its functionality until an additional "unvetted" application is installed, then this is definitely an issue.
But it should be addressed by targeted measures against the offending applications and by educating the users. Not by taking away everyone's freedom.
In my opinion, the freedom of choice regarding which application runs on one's own hardware should remain with the owner to the largest extent that is practically possible.
>If you refer to predatory practices where an application would disable some of its functionality until an additional "unvetted" application is installed, then this is definitely an issue.
Worse, there won't even be a vetted application. There will only be an unvetted application you'll be asked to sideload to get the functionality at all.
And soon: alternative app stores, with their own rules and no central control. From Adobe, Google, and so on.
>But it should be addressed by targeted measures against the offending applications and by educating the users
Has that "educating" and "targeted measures" ever worked?
I do not understand your reasoning. Any app store or in general any source from where an application could be installed implicitly provides a trust model. Whether it is Google's Play Store, a GitHub repository or e.g. John Doe's personal website. The users then have the freedom to choose whom they would like to trust.
> Has that "educating" and "targeted measures" ever worked?
Yes. Users are cautious before running anything as root. They check a browser addon before installing it. They do not open media from unsolicited e-mails. And so on.
It is, however, all beside the point.
Even if there currently were no effective targeted measures that could be used, it still would not mean that we should resort to undermining everyone's freedom, in my opinion. Not in this particular case.
There are legitimate cases where everyone's freedom is restricted in order to make society a better place. For instance, we as a society have chosen not to tolerate stealing, and we put the individuals who do so behind bars. This is broadly accepted because it brings benefits to everyone: we do not need to worry that much about being stolen from.
However, there is no general benefit to the society that I am able to see which would materialize as a result of limiting everyone's freedom to use their own devices. I can only see a lot of negatives.
That would be cool... except that Apple in fact approves intrusive monitoring apps--hell: they even have an Enterprise program that lets companies build apps for their employees that don't require App Store approval!--and then makes it so that not only can they be installed but the device is so locked down that you can't tamper with them, via Mobile Device Management.
Apple still needs to approve the organisation's use of that program and provide certificates, which, as in the Facebook case, can be revoked.
There are also two different tiers of MDM in the Apple enterprise program. If it's a bring-your-own device, the device cannot be locked down in a way that prevents users from removing applications. It will also severely limit the kind of information the MDM solution can get out of the device. These things can only be enabled on corporation-purchased devices.
I am a user of browsers, and though I like Firefox the most, it drives me crazy with the constant update notices. I will never let any program auto-update itself if I have a choice - sorry, I like to know when something is happening to the things I use - but the frequent bugging of Firefox to update, update, update, update, update, update, update, update, update, update, update, update is driving me mad! Why the f*ck update?! The current version works just great!! One or two updates brought me noticeable and interesting (though still marginal in the overall purpose) changes throughout a roughly 5-year window; the rest just bug me and push self-advertising 'what monumental new things we brought to you' pages on next start. It breaks the flow of use frequently; it is user-hostile behaviour. Do security-ONLY updates (or better yet, work slower and bring us a secure version in the first place, please!) and let me look for new features on my own time, ok?!
(This is a very bad trend in the software industry: pushing updates on the user as if the software had the same importance to the user as it has for its developers, as if the user's purpose in life were to keep it maximally updated and satisfy the software package's needs. No, not at all; for users it is a tool, not a center of admiration that they want to nurse and cherish. It is a bad trend, but that is not an excuse to do bad as well; it is a reason to make it better, please.)
Sorry if you've considered this (or if you just wanted to vent) but maybe installing Firefox through a package manager would help you control when it updates.
The primary "new features" of a new browser release are (A) security fixes and (B) support for new web standards, which websites will adopt. If you are not a security expert or a web developer, how are you supposed to judge whether these new features are something you want to upgrade for? You probably won't even know that you're missing these new features-- if you wait long enough to upgrade, you might fall victim to a long patched vulnerability or your web experience will start to break in subtle ways. This isn't a flaw of the web but a feature, we (the web developer community) have fought long and hard for the browser developers to stay up to date with the web's changing feature set so that we have less of a "feature window" to target.
But new standards every 2 weeks? You are kidding, right?! : ))
And security updates as well every other week?! (Or sometimes two in one day, or three days apart: I just checked the update history.) Are the software products of THAT poor quality, really?! Then they should not be released yet! They are THE vulnerability then. It is a scam if this is deliberate - releasing poor products before securing them, just to keep themselves occupied for a long, long time to come with finishing them, making them proper. Maybe they shouldn't include those standards in such haste, making things even more complex, so that it is a hopeless battle against the security leaks that make those products look like a sieve. They shouldn't rush things if it ends up this badly, with frequent, important and urgent patches required.
Also no! Not only security fixes and new standards come; you are mistaken quite a bit there. There are no separate updates for features and security - I checked. New features come bundled in: changes to how things are done, additions, bookmarks or whatnot, welded together with the important stuff. I do not care for them; as I mentioned, the product is good enough in features already and I never missed any, yet I am bugged against my will to get them too (good enough only on the surface, of course, not in security; in security it seems to be hopelessly and constantly below the minimum, since urgent, immediate, install-right-away-or-you're-doomed kinds of messages keep coming). Sometimes they even repackage, redesign, or relocate existing and frequently used things, breaking the user flow again and making usage more difficult for the time being while you get used to it (if you can, not always), just to ruin it yet again later. It is good that it is free for the end user; for that sloppy quality, breaking the usage and requiring frequent fixes, no amount of money should be given. You wouldn't pay if it were a car or a tool for the physical world.
> But new standards every 2 weeks? You are kidding, right?!
There are hundreds of changes to standards in various stages. And bugs happen in those implementations.
In fact, standards-related changes land in browsers' source trees _far_ more often than every two weeks; that's just the iterative development cycle into which they package these changes (_and security changes_).
Excuse me, your conclusion doesn't seem to follow from your premises. I mean, "Please OS vendor I want something simple. That hidden `sideload` option is complicated, please lock down my machine."
Like, seriously? No one's that stupid. Sure, most people want a decent, clear default option, but no one wants that to be the only option. At least not for themselves. Because for some reason I see many people arguing that other folks are too stupid for options and should be locked down instead. For their own… safety? Convenience? And that just reeks of authoritarianism.
They barely even know the difference between Windows and Google.
People who got online when smartphones entered the scene have a difficult time when I try to explain what "folders" and "files" are.
I do a lot of tech support for family and friends.
They NEVER enter settings or preferences for their OS or their browser.
They're afraid they'll "break something".
The disconnect between tech literate people like you, and what they think most users want - or even care about - is mind blowing.
Does anyone here remember the headache you'd get when helping a family member and seeing how many toolbars and viruses they'd managed to install since you last checked their PC two months ago???
The lockdown is a feature - not a bug - in most users' minds.
And it has made my life so much easier.
I use Linux and side load apps.
But I'm so, so, happy that none of my family members are even able to do the same.
Right? I feel like my non-technical social connections have an even greater learned helplessness from interacting with any sort of open-to-customization technology, because they've learned that every time they touch something, the tool stops working in a way they don't understand. Most people are not going to spend days to weeks, and definitely not months to years, learning how to tinker with and expertly maintain their technology.
I feel like technologists in this forum are acting like blacksmiths who would scoff at any of us for having purchased a hammer, rather than smelting the ore, forging the head, and carving the handle so we could have one that fit our needs perfectly.
I dunno... the way it feels on the other side is that y'all think people are too dumb to not hurt themselves with hammers--which is true!!--and so, rather than trust that people who are afraid of hammers will simply avoid using a hammer they should be actively prevented from even owning a hammer, or even letting their friend or a hired carpenter use a hammer to help them, which is kind of overkill.
Well, it's a fact that all that technology is incredibly brittle. Systems lack resilience, error recovery, and accessible debuggability, and when something breaks, there's a high chance it'll have disastrous effects. It's objectively safer to stay within the "works for me" happy paths that authors are likely to be actually testing/using themselves. Even this sometimes fails, sometimes seemingly without reason, only to later (maybe) start working again. It's a nightmare, a constant source of stress, and another thing that people feel they have no control over at all. It's not strange that users flock to authoritarian-style environments, managed by someone who does have the capability to control that chaos to some extent - even if they sell users' PII data to Sunday and back.
There are complex reasons for this, but the end result is simply that IT is not ready for mass adoption. Software is still in its infancy - I suspect that the broader the possible implications of technology, the longer it will take it to be ready to be mass adopted. We gave up all hopes of ever proving program correctness in the 80s, then in the last decade we've given up all pretenses that we know what we're doing... and nobody saw a difference. By all rights, software should be confined to research labs and garages of nerds for quite a few more decades.
The problem is that this technology is too useful. It has too many far-reaching applications in almost all spheres of human activity. When the software (and all the layers below it) actually works, it brings small miracles to its users, enough that they're willing to pay a lot for a product that is obviously unfinished, rushed, and will probably get killed after a few years. They think: yeah, it breaks all the time and I'm afraid to breathe in its direction, but it's ok, I'm strong, I can deal with it if I'm able to do X or Y.
Tl;dr? I dunno. Maybe developers should put more effort into professionalizing the field, but this kind of thing is impossible to rush. Or maybe users should get a grip and accept that it's not developers who force them to use their products. The massive amounts of money involved, along with the life-changing potential of IT products, skew incentives so much that, currently, both developers and users pretend that it's all fine, even though it obviously isn't, and then both complain. Users are stupid, developers are lazy, but neither can live without the other any longer...
I wholeheartedly agree; locking down systems is a feature, not a bug.
I would go even further and say it's not just so in the users' minds, it is also so in the admin's mind, whether that's a business setting where we have to make sure thousands of workers don't accidentally brick their PCs (or worse: cause an infosec issue), or a family setting.
Though I have to say, that lockdown feature comes with a rather heavy price tag attached, because, well, the systems in question do a whole lot more than just make the locking down easy, don't they?
It would be great if commonly used Linux desktop environments allowed for a switchable (with root privileges) "lockdown". I'm aware that this is possible already, but it requires too many steps and is too error-prone. What I want is a simple on/off command offered directly by the desktop suite for me to issue as root.
That would allow people like you and me to easily set up computers for non-technical people to use, with the benefits of both an open system and the stability a locked-down system provides.
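For what it's worth, GNOME's dconf already has most of the plumbing for this (there is even an org.gnome.desktop.lockdown schema with keys like disable-command-line); what's missing is the single switch. Here's a minimal sketch of what that switch could look like, assuming the stock /etc/dconf/db/local.d/ layout; the file name and the key list are illustrative, not a vetted lockdown profile:

```python
#!/usr/bin/env python3
"""Toggle a crude GNOME "lockdown" by writing a dconf lock file.

A sketch only: run as root, and adjust LOCKED_KEYS to taste.
"""
import subprocess
import sys
from pathlib import Path

# Hypothetical file name; any file under locks/ works.
LOCK_FILE = Path("/etc/dconf/db/local.d/locks/00-simple-lockdown")

# Keys to freeze at their system defaults (illustrative selection).
LOCKED_KEYS = [
    "/org/gnome/desktop/background/picture-uri",
    "/org/gnome/desktop/interface/gtk-theme",
    "/org/gnome/shell/favorite-apps",
]

def lockdown(enable: bool) -> None:
    if enable:
        LOCK_FILE.parent.mkdir(parents=True, exist_ok=True)
        LOCK_FILE.write_text("\n".join(LOCKED_KEYS) + "\n")
    elif LOCK_FILE.exists():
        LOCK_FILE.unlink()
    # Rebuild the binary dconf database so the change takes effect.
    subprocess.run(["dconf", "update"], check=True)

if __name__ == "__main__":
    if len(sys.argv) != 2 or sys.argv[1] not in ("on", "off"):
        sys.exit("usage: simple-lockdown on|off")
    lockdown(sys.argv[1] == "on")
```

This only pins settings keys to their system defaults, so it's nowhere near an iOS-grade lockdown, but it's the kind of thing a desktop suite could wrap in exactly the on/off command described above.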
The important thing to notice is that the median user's mind is neutral about features like lockdown, security, side-loading, and everything else, because they don't think about features. They think about concrete interactions, like "playing candy crush", or "talking to grandma/grandkid", or "buying stocks", or "trying that app that my coworker showed me".
And when they can't do it ... "it didn't work for me" ... they, ironically wisely, don't even speculate about why it did not work. The folks you see on forums who recommend "do a factory reset and it'll work" or "clean the cache", etc., are obviously the "Dunning-Kruger poster kids".
Median users are monkey see, monkey do; that's why, if they see "it works on Ted's iPhone", their thought is "I guess I'll get an iPhone". And ... it works. The US is iPhone-land.
...
And interestingly, this hyper-pragmatic (arguably too narrow-minded) approach to technology is also what leads to the interesting cases where enough teenagers want Fortnite on their iThing. And that's when the generalizer machine of society can pick this thing up, and sometimes it spits out useful principles. (Mostly we just get one more bad statute on the books.)
Apple is only about half the US mobile market, and in the rest of the world the trend is clearly Android. So calling the US market the trendsetter seems odd, because if that were the case the rest of the world would be trending strongly iOS, yet this pattern has been stable for years.
>"Please OS vendor I want something simple. That hidden `sideload` option is complicated, please lock down my machine."
I could easily turn that characterization around: "Please OS vendor, I want something that just works; please don't add customization options that I will never use but that, if I accidentally selected, would effectively brick my machine for the technical skill level I have." I might be dating myself here, but do you not remember all the complaints about something as simple as resetting a VCR's clock and parents just living with it blinking after a power outage until their technically inclined children took it upon themselves to fix it?
>No one's that stupid.
They don't have to be stupid, but that doesn't mean the time invested in learning the skills to modify and customize software and hardware is something they want to spend. My original point was that people in this thread and others across this site keep talking like they cannot _fathom_ why someone would choose a locked-down version of a product over an open one, and I pointed out why many people would.
>And that just reeks of authoritarianism.
Moralize elsewhere. No one forced people to buy an iPhone, and yet it's massively popular. More open products exist for cheaper. Tbh, the only ones who seem to be implying that people are dumb, and pushing authoritarianism, are the group who keep pushing to break up these machines that just "work" and that the market is showing a high level of preference for.
You have chosen one property of the iPhone and elevated it to the primary reason why people buy that device, which is entirely disingenuous, as clearly people are forced to make tradeoffs in their purchases; nearly 100% of people might despise having locked-down devices and yet still buy an iPhone if other things are more important to them... and there are definitely a ton of things the iPhone gets right--both in its hardware and its software--despite this one glaring thing it gets wrong.
Almost everyone I know owns an iPhone... and yet, almost all of them wanted a more open device and bought an iPhone anyway, because it has a longer serviceable lifetime (due to software updates for a longer period of time), has a pervasive brick-and-mortar storefront that sells accessories, lets them use AirPods, has one of the best cameras on a mobile phone (and here it is maybe-interesting to note that Samsung devices are also pretty damned well locked down: you can sideload a browser, sure, but you can't get filesystem access or modify any of the stock behaviors)... I could keep going, as Apple is actually an extremely competent company that has built a great product!
And yet, those people, when given the chance, were all very excited to jailbreak their phones to get more features. The people who were not technical had their technical friends do it for them. The people who did not have technical friends willing to deal with that much effort brought their phones to the little shops which do it for you. At its peak, despite being pretty hard to do, difficult to maintain, and complicated to take advantage of, more than 10% of people with an iPhone were jailbreaking! That is an insane number of people to just write off :/.
People, especially when looked at in general, it turns out, don't care about general principles. (And sort of rightfully so, because over the last thousands of years almost literally all questions of importance were not questions of principle. Which noble house should give the new king? None!? Yeah, that's not a good option. And so on.)
Software and these hyperglobal platforms are where principles actually start to matter. (It did not matter even in politics. In principle free speech, sure, but also no generated kiddie porn, because ew. And it did not matter, but suddenly, with the capability to force CSAM detection on the world, we would think it now starts to matter. But no, people in general have never heard about this, have no idea about it, and so on.)
So with that long intro, it turns out that if enough people want Fortnite on their iThing, or want to repair their tractor or car ... that can force some principles, and that's when people will suddenly care about "my device my blablabla".
> do you not remember all the complaints about something as simple as resetting a VCR's clock and parents just living with it blinking after a power outage
Why fix it? What's the drawback in leaving it blinking?
They could fix it, they just didn't bother. Kids just have more free time.
Unless you're very lucky, it's going to be showing the wrong time. I don't think I ever had a power outage precisely on 00:00. Blinking is actually a nice reminder that you should not look at that clock to check the time. In any case: yeah, you have a single source of time info disabled, but there's a wall clock above the tv, so really, why bother setting the time right?
Unless you want to record that show that plays at a time when you have to be out. That's when you call out to the kids to do whatever it takes (setting the clock is one of the things to do, but why bother with the details).
The restriction is against programs that can download and execute code from random places on the internet, and you only need to look at a family of Android malware that Google has been unable to keep out of the Play Store to see why.
>Known as Joker, this family of malicious apps has been attacking Android users since late 2016 and more recently has become one of the most common Android threats.
One of the keys to Joker’s success is its roundabout way of attack. The apps are knockoffs of legitimate apps and, when downloaded from Play or a different market, contain no malicious code other than a “dropper.” After a delay of hours or even days, the dropper, which is heavily obfuscated and contains just a few lines of code, downloads a malicious component and drops it into the app.
Apple requires all executable code to go through the App Store's vetting process. Apps that download code to be executed have never been allowed, which is why you have the WebKit restriction.
WebKit can download and execute code. Your app cannot.
The article's conclusion that users need to be wary of apps downloaded from inside Google's walled garden should be all the warning you need about the danger of allowing random apps to download and execute code.
>With malicious apps infiltrating Play on a regular, often weekly, basis, there’s currently little indication the malicious Android app scourge will be abated. That means it’s up to individual end users to steer clear of apps like Joker. The best advice is to be extremely conservative in the apps that get installed in the first place. A good guiding principle is to choose apps that serve a true purpose and, when possible, choose developers who are known entities. Installed apps that haven’t been used in the past month should be removed unless there’s a good reason to keep them around.
Nothing stops you executing arbitrary code on iOS; you just have to use an interpreter to do so, and Joker is in fact running interpreted code (a dex file).
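To make that concrete, here's a toy sketch (Python purely for illustration, obviously not iOS code; every name in it is made up): a few lines of interpreter are enough to turn downloaded *data* into behavior, and a static review of the shipped binary sees none of it.

```python
# A tiny stack-machine interpreter. The "program" is just a string and
# could have arrived over the network long after app review finished.
OPS = {
    "push":  lambda stack, arg: stack.append(float(arg)),
    "add":   lambda stack, _: stack.append(stack.pop() + stack.pop()),
    "mul":   lambda stack, _: stack.append(stack.pop() * stack.pop()),
    "print": lambda stack, _: print(stack[-1]),
}

def run(program: str) -> None:
    """Interpret newline-separated 'op arg' commands."""
    stack = []
    for line in program.splitlines():
        op, _, arg = line.strip().partition(" ")
        if op:
            OPS[op](stack, arg)

# As far as any static scan of the app is concerned, this is data.
run("push 2\npush 3\nadd\npush 4\nmul\nprint")  # prints 20.0
```

Swap the arithmetic ops for primitives the host app already contains and you have the dropper pattern described above.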
Bear in mind, one reason you may hear less about malware on iOS is simply that security researchers aren't allowed to sell products for it, and they are blocked by the infrastructure from examining apps like anyone else is anyway, so they have neither the incentive nor the ability to figure out what apps are actually doing. On Android you can get APKs from the Play Store more easily, and APKs from third-party stores very easily, and you're allowed to sell security apps into that market, so researchers have both the means and an incentive to go find malware for it. Apple just point-blank refuses to allow their commercial existence, unless it's by selling vulns to Apple itself.
> Nothing stops you executing arbitrary code on iOS
Nothing except the App Store review process?
We already have tech sites warning Android users to beware of apps inside the Play Store, because Google has been unable to block Android apps from downloading malicious code and executing it on a regular basis.
If you're happy with that state of affairs, by all means, buy an Android device.
The point being made by the cited article is that a tiny interpreter that activates days or weeks after an app goes live can't be detected by any app store review process. You have no idea how many such droppers are active in the iOS App Store because only Apple can look for them, and nobody knows if they do or to what extent they do.
That's why both platforms also use a sandbox. The dropper still needs to work within whatever permissions the app has been granted. App Review doesn't involve a full blown security audit of your app's source code and then a deterministic build process on top.
> The point being made by the cited article is that a tiny interpreter that activates days or weeks after an app goes live can't be detected by any app store review process. You have no idea how many such droppers are active in the iOS App Store because only Apple can look for them, and nobody knows if they do or to what extent they do.
The point being that Apple doesn't allow information downloaded from random places to be executed as code in third-party apps at all. This is literally the reason for the WebKit-only policy.
Google does allow it, and they (very predictably) have no way to know if that code will be malicious or not.
Which is why Ars had to warn Android users that they had to be wary of apps downloaded from the Play Store.
I think we're talking at cross-purposes here. The issue is not what Apple allows, it's what they can detect and block. They can't detect arbitrary interpreters and therefore you have no idea if this is happening on the app store. You just have to take Apple's word for it that it's not. We're talking about malware, by definition it doesn't care what the rules are. Android is more open and so third parties can go investigate and find malware that uses interpreters to execute remote code, but Apple simply doesn't allow such explorations so we don't know what's out there.
My parents are not technical and I don't want to have to worry about their phones or computers, so they use iPhones and iPads, and I have a Pi-hole connected to their router that clears away most ads.
They like going to the App Store and knowing there is at least some level of curation of the apps. If they suddenly have to start dealing with competing app stores and sideloading and menus and all that stuff, their experience will undoubtedly be worse for it.
I think they'll visit sites and, instead of being prompted to install an app, they'll be prompted to install an app with a redirection to a different app store.
Apple takes a huge cut of the money spent on the App Store, and everyone will have a huge incentive to move users to different stores if they can. This isn't difficult to understand if you think about it for more than a minute.
That would be Apple's problem then, for making their store so unattractive to developers. What you're saying here is that if developers had a choice they'd tell Apple to go fuck themselves, and you're probably right. "We'll engage in anti-competitive practices so our customers can't escape our shitty offering" is not the flex you think it is.
I just want to ask you something. I'm a software developer, able to build a headless Linux box out of parts and configure it, unlock the bootloader on an Android phone, flash and root it, and use both successfully, and "Please OS vendor I want something simple, ... , please lock down my machine" is literally THE reason why I got an expensive iPhone as my last phone instead of a cheaper Android.
If you have two otherwise-equal choices, one with great default settings that still allows you to change things if you really, really want, and the other with the same great defaults but locked down with no choice allowed, then if you pick the locked-down one, I think you're frankly quite stupid.
What I think is really stupid is how many people seem to think that having the ability to change things means that they absolutely MUST go through all the settings menus and change things. What really galls me is how many so-called technologists even believe this. It comes up all the time in Gnome vs. KDE arguments, and I'm seeing the same mentality here. If you don't want to change things, then don't.
> think that having the ability to change things means that they absolutely MUST go through all the settings menus and change things.
That sure does sound like a straw man argument to me. Are you sure you've really heard "pls don't make me change all the things, I don't like changing things" instead of the much more plausible "pls don't make me use a system where someone had to implement (and has to support) functionality that I personally don't need, and as a (possibly) developer myself I understand that this adds complexity to the system and places burden on BOTH the user and the developer"?
It's not a straw man. People really do say these things. They complain that the existence of lots of configuration menus and options means that they MUST go through all these menus and configure everything themselves. Believe it or not, I've seen it time and time again, for over a decade, every time there's an argument about Gnome vs. KDE.
Not really. The choice will then become: use Chrome and deal with their JIT draining your phone and privacy-invasive ads, or put up with "you can't use this feature" from Google Docs/Slides, YouTube being slow… etc.
Bringing up device performance and hinting at planned obsolescence of devices is probably not the best tactic here when debating Android vs iOS.
Apple has a worse track record when it comes to that (while still leaving Google/Android plenty of room for the same).
> I think this explains some of the disconnect: users on this site don't understand why people would actively choose iOS even with the lock-in. Most people do not want to use a product that is constantly updating and adding features
This is precisely the reason why I have an iPhone and an iPad: they give me 5 years of updates and even more of security fixes.
Firefox feels the need to scream at me every time it updates, which is all the time—far more often than iOS, and far more often than Safari. I think that's what the GP meant.
FF does the equivalent of a major iOS update several times per year, as far as the user experience of the update is concerned—iOS doesn't fill my screen with on-launch "hi! We updated!" notices unless the release is a big one and there's actually new stuff to tell me about, in which case they usually keep it short and to the point. FF opens & foregrounds a new tab to tell me about the exciting update (ugh) and sometimes also pops some call-out bubbles for Pocket or some other crap, way, way more often than any other software I use, and it's really annoying, especially for a piece of software I've been using exactly the same way since it was called Phoenix and before some of the people working on it were born, probably—none of the shit in those announcements has ever, once, been anything I needed or wanted to know about.
This, despite monkeying with UI and force-foregrounding unrequested content being UI poison for people with low computer literacy. I roll my eyes and close the crap; others get confused and are significantly delayed in doing whatever they actually wanted to do with the browser, on top of feeling confused and betrayed that their browser did something different when they opened it this time.
I don't remember the specifics of how to do it, but you can disable the Firefox upgrade welcome tab...
My Firefox install hasn't bothered me by opening any unwanted tabs in a couple years, at least? But it still updates on a system restart or when I tell it to -- it just reloads all the previously opened tabs without any UX burden.
But if you're browsing the web, aren't you by definition going to "use a product that is constantly updating and adding features that are going to be taken out shortly"? I'm always impressed when I realize a website did a redesign that basically moved buttons from the top of the screen to the bottom.
This is why I use Linux, where I have the choice to run MATE for a decade while GNOME, Microsoft/Windows (especially 11), and Apple too are destroying their systems (from my viewpoint at least).
the recommendation for Firefox Nightly aside, no one is forced to accept constantly updating apps, and adding features that get removed after a short time is also an unfair characterization. i don't update Firefox more often than i want to, and even when i update, i rarely notice any changes. big changes come less than once a year.
but i have the choice to update or switch browsers if i want to, and that's the point.
the restrictions only remove choice. they don't add anything.
I mean, I was referring specifically to Firefox Nightly in terms of adding features that can get removed. I confirmed before I posted the comment, and there is still a warning smack dab on their splash page. I use Firefox on a daily basis personally.
I despise most of the things that Apple does and yet I am evaluating switching to Apple for my phone after more than 10 years on Android.
The reason?
I cannot back up my phone data; I have to rely on each single app's implementation, and many of them have nothing for backing up data.
Add that I get roughly 2 years of security updates at most (I don't buy phones as soon as they come out), and the consequence is having an expensive piece of outdated hardware.
I was going to suggest that Google stop implementing and releasing any major new Android features and only release security updates. At least this way we would start getting security patches for longer...