Step 1: Have the iPhone pop up saying "do you want <Pebble watch> to be able to send messages?" and let the user decide which devices can send messages from their phone.
Step 2: Have the iPhone pop up saying "do you want <Apple watch> to be able to send messages?" and don't just assume "yes"
Both steps would improve security, even if they harm Apple's profits.
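To be clear about how little is being asked for, here is a rough Swift sketch of what such a prompt could look like. UIAlertController is the real UIKit API; "MessagingPermissionStore" is a hypothetical stand-in for whatever system service would actually record the user's choice, since Apple exposes nothing like it today.

    import UIKit

    // Sketch only. UIAlertController is real UIKit; the system-level prompt
    // Apple would actually show is not something third parties can trigger.
    // MessagingPermissionStore is a hypothetical stand-in for the service
    // that would record the user's decision.
    protocol MessagingPermissionStore {
        func setAllowed(_ allowed: Bool, forDevice deviceName: String)
    }

    func askUserToAllowMessaging(from deviceName: String,
                                 on viewController: UIViewController,
                                 store: MessagingPermissionStore) {
        let alert = UIAlertController(
            title: "Allow \"\(deviceName)\" to send messages?",
            message: "This device will be able to send messages from your phone.",
            preferredStyle: .alert
        )
        alert.addAction(UIAlertAction(title: "Don't Allow", style: .cancel) { _ in
            store.setAllowed(false, forDevice: deviceName)
        })
        alert.addAction(UIAlertAction(title: "Allow", style: .default) { _ in
            store.setAllowed(true, forDevice: deviceName)
        })
        viewController.present(alert, animated: true)
    }

Defaulting to "Don't Allow" for non-Apple devices, rather than silently allowing Apple's own, is the entire ask.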
Ah, but you see, they need to go to the Apple store and buy an Apple product, then with no clicking at all the app will work.
If they go to a different store, and buy a non-Apple product, that's insecure. What they need to do is return it and go to the Apple store and buy an Apple product. That's secure. Give the money to Apple.
You're being sarcastic, but isn't this all just... correct?
Yes, I do trust the company that developed Secure Enclave more than I trust random BLE firmware in a $49 Alibaba watch.
More importantly -- my great-uncle can trust the same thing, because Apple has spent decades building that trust. Consumers generally should not trust random hardware. Apple is not random hardware.
Google, Samsung, Pebble, Amazon, Microsoft, Sony, etc. have also spent decades building trust and don't build random hardware. But that doesn't matter because Apple locks them all out and insists you remain within their walled garden where it alone profits from you.
If you don't want a future where you have to buy Apple milk to put in your Apple fridge (because the fridge stops refrigerating if you try putting any other brand of milk in it, citing "security issues") -- or worse, where you can't get your Amazon fridge into your Apple house because it cites nebulous reasons and refuses to open the door -- then get out of the reality distortion field and accept that it is in people's interests for one item to work correctly with another, and to call venal vendors on their "oh but it wouldn't work or it wouldn't be secure" bullshit.
That’s not the point, though. Any method by which Apple exposes APIs to Samsung, Google, etc.:
- requires immense development effort and expansion of security surface area
- STILL offloads trust to Samsung, Google, etc
The hyperbole here is a little hysterical. Apple doesn’t totally lock out third parties. In the smartwatch example, it is a very specific set of features that involve passing data (which users expect to be e2e encrypted!) back to Apple. That’s an extremely hostile security environment! Product tradeoffs would absolutely have to be made in order to support arbitrary third parties! I don’t think it’s fair to just demand that Apple make their product worse without at least exploring the balance.
Anywhere Apple trusts itself is a place where they can trust a third party.
Anywhere Apple wouldn't trust a third party is a place it should not trust itself either.
It doesn't even have to be arbitrary third parties, it can be Apple's chosen third parties. But they'll choose nobody, because they love lock-in too much, and they'll tell the rubes that it can't be done or it's too hard. That's just bullshit, and they know it. They do it to lock out competitors, so they alone can juice their existing users.
The only thing that can open up Apple is regulation -- and as we've seen with Apple's spiteful attempts at compliance with EU DMA rulings, it makes up arbitrary criteria calculated to maximally lock out and frustrate business rivals. It's like it's trying to come up with a compliance solution that the EU might accept but that would result in as few competitors as possible able to actually use it, ideally zero.
> Anywhere Apple trusts itself is a place where they can trust a third party.
This quite literally could not be further from the truth, and to suggest that it is true reflects such a comprehensive misunderstanding of both the fundamental nature of computer security and the practical realities of the world in which we live that it's not really possible to continue the conversation productively.
Android handles a couple of permissions it doesn't want people turning on accidentally by requiring that the user open the Settings app and manually pick which apps to allow from a list. I wonder if that reduces the rate of people enabling things unwisely.
>We have decades of experience that users will blindly click whatever prompts they need to make the app work.
Really, how is Apple protecting you from clicking Allow in a web browser when it asks for webcam and microphone permissions? I am asking since I do not have a Mac and really want to know how well Apple users are protected compared to Linux users when it comes to webcam and microphone access in browsers.
That's exactly how it works. Apple does the same thing everyone else does. But when Apple does it, it's "secure", and when everyone else does it, it's "insecure". Hope that helps.
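To be concrete about what that looks like on a Mac, as far as I understand it the protection is prompt-based at two layers: the browser shows its own per-site permission dialog, and the OS separately gates camera/mic access for the browser process as a whole. The OS-level gate is the real AVFoundation call below; the per-site dialog is just the browser's own UI.

    import AVFoundation

    // The OS-level gate on macOS: any app that touches the camera or mic,
    // browsers included, has to get past this one-time system prompt.
    // After the first answer, the OS simply returns the stored decision.
    func requestCameraAndMicrophone(completion: @escaping (Bool) -> Void) {
        AVCaptureDevice.requestAccess(for: .video) { cameraGranted in
            AVCaptureDevice.requestAccess(for: .audio) { micGranted in
                completion(cameraGranted && micGranted)
            }
        }
    }

In other words: a dialog the user can click Allow on, same as everywhere else.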
Where do you draw the line between allowing functioning adults to make their own choices (even if they are mistakes) and tech paternalism?
Currently we seem stuck in a positive feedback loop: tech becomes more and more paternalistic, which creates more and more tech-illiterate users, which in turn is used to justify even more tech paternalism.
It is convenient that this tech paternalism also happens to align with the profit incentive: Easy to trap people in closed ecosystems this way.
You're getting dumped on here but you're absolutely right. Anyone who has been in software for any amount of time knows this, too. HN is full of software developers--downvoters should know better.
You can put a button in your app that says "Tapping this will drain your bank account and give you cancer" but if it also enables functionality that the user wants, they will tap it.
Sounds like a "make better warning messages" issue.
Most users are not able to root their device due to the number of steps needed and will give up on an app that needs root access. Make it so that you have to do something other than just clicking a warning message to enable your Pebble, then.
Warning messages can be made idiot proof with some thought.
If Apple had their way, they would LOVE to sell you a $2000 aluminum brick with no screen, speakers, microphone, etc., that still required a proprietary cable to charge.
A set of four castors. Like the bottom of a shopping trolley. Yours for $699.
Tell your friends! "Each castor costs one hundred and seventy five dollars. It costs four hundred thousand dollars to run this computer... for twelve seconds. Ah ha ha ha ha ha ha!" (https://youtu.be/jHgZh4GV9G0?t=19)
Don't forget that the founder used to work for Apple and used that fact to demonstrate their expertise. Are we sure Humane Pin wasn't Apple testing if this was possible? (I'm joking. Apple already knows the answer)
I absolutely hate these sorts of nagging popups and I’m happy that a vendor I already have to trust doesn’t pop them up when I acquire a new product and sign it into my Apple account.
Imo, if this were to happen, it should happen by allowing devices like the pebble watch to sign into an Apple account and acquire permissions through that process rather than nagging on my phone on pairing.
I think I was pretty clear. You set up your Pebble watch via OpenID Connect/OAuth like any other API client. No nag popups, manual Bluetooth pairing, etc.
Only if you consider connecting to a single phone. The advantage of what I’m proposing is the watch would automatically work with whichever other Apple devices are on your account.
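The standard pattern for this kind of sign-in on a screen-constrained device is the OAuth 2.0 Device Authorization Grant (RFC 8628). A rough Swift sketch of the first leg of that flow, with placeholder endpoint, client_id, and scope, since Apple publishes no such endpoint for third-party wearables:

    import Foundation

    // Sketch of the OAuth 2.0 Device Authorization Grant (RFC 8628), the
    // usual flow for signing an input-constrained device into an account.
    // The URL, client_id, and scope are placeholders, not real Apple endpoints.
    struct DeviceAuthorization: Decodable {
        let device_code: String
        let user_code: String
        let verification_uri: String
        let interval: Int
    }

    func beginDeviceSignIn() async throws -> DeviceAuthorization {
        var request = URLRequest(url: URL(string: "https://auth.example.com/device_authorization")!)
        request.httpMethod = "POST"
        request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
        request.httpBody = "client_id=example-watch&scope=messages.send".data(using: .utf8)

        let (data, _) = try await URLSession.shared.data(for: request)
        let auth = try JSONDecoder().decode(DeviceAuthorization.self, from: data)

        // The watch displays auth.user_code; the user enters it at
        // auth.verification_uri on another device and approves the scopes
        // there. The watch then polls the token endpoint (not shown) every
        // `interval` seconds until the sign-in is approved or denied.
        return auth
    }

The approval happens once, in the account, and then any device signed into that account can honor it, which is the whole point of doing it this way instead of per-phone pairing prompts.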