I wondered if you could sneak in some unicode digit but it seems to reject those too:
$ go run z.go
# command-line-arguments
./z.go:6:2: identifier cannot begin with digit U+0661 '١'
./z.go:7:27: identifier cannot begin with digit U+0661 '١'
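For context, this is roughly the z.go I was testing with (my reconstruction, not the exact file). U+0661 is the Arabic-Indic digit one, which Unicode classifies as a decimal digit (Nd) rather than a letter, so Go's scanner rejects it at the start of an identifier:

    package main

    import "fmt"

    func main() {
        ١count := 42                       // rejected: identifier cannot begin with digit U+0661 '١'
        fmt.Println("paved area:", ١count) // same error reported at the use site
    }

That matches the spec: an identifier must start with a Unicode letter or underscore, and Unicode digits are only allowed after the first character, so something like count١ compiles fine.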
> The other one was the time I was speaking to my brother-in-law, who had just paved his driveway. He said "I could have used airport grade tar, but thought it was too much", and we were in front of his Nest security cam, which is the only thing I can think of. But the very next morning I'm scrolling through Facebook and, sure enough, someone local is advertising airport grade tar. Why? I didn't google this, I only heard it from them.
Option A: The Nest camera not only listened to the conversation and picked out "Airport Grade Tar" and decided it needed to show adverts about it to people, but the camera also identified you to the point it could isolate your FB account in order to serve you those adverts.
(I'm making some assumptions but...)
Option B: Your brother-in-law had done various searches for airport grade tar from his home (in order to know how expensive it was). You, whilst visiting his home, were on his WiFi and therefore shared the same external IP address. Your phone did enough activity whilst at his house (the FB app checked in to their servers in the background, or you used Messenger, etc.) for the "thinking of buying airport grade tar" signal attached to his external IP address to get associated with your FB account, which was temporarily on that IP.
I had a friend who was convinced that some device in his house was listening in on his conversations with his wife as he kept on getting adverts for things they'd been talking about buying the day before but he hadn't searched for. (But she was searching for it from their home wifi, which is why it appeared in his adverts afterwards.)
Option C: no cameras or crude WiFi tracing needed; they know who you talk to / associate with based on location data and the full profiles of both sides, and can estimate things like 'will have mentioned X', then act on it via a heuristic like 'show ads for X to people adjacent on the social graph of someone who was recently interested in X'.
That is, BiL was marked as 'spreader for airport grade tar' based on recent activity, marked as having been in contact with the spreadee, and then the spreadee was marked as having received the spreading. P(conversion) is high, so the ad is shown.
It's just contact tracing, it works well and is really easy even without literally watching what goes on in interactions.
Some international flights arrive into domestic US terminals. These are from a limited set of countries where passengers have cleared US immigration in the departure country.
Canada, Ireland and the UAE are the major three, plus Aruba, Barbados and Bermuda.
I would expect that most nations are performing some kind of surveillance like this.
Finding people who serve on carriers shouldn't be difficult. That kind of information is plastered all over FB and similar. Many of their friends will also be active in similar roles.
Then find associated Strava accounts. Find more friends that way.
The information you can gather is useful on many fronts. Someone does a few runs a week on shore and then suddenly stops? Could be injury, could be that carrier has sailed. Have many of their "friends" who also serve there also stopped logging things on dry land? Do any of them accidentally log a run out in the open ocean? This kind of patchy unreliable information is the mainstay for old-school style espionage.
Strava Labs' beta "Flybys" site used to be a great source for stalkers. You could upload a GPS track (which can easily be faked in terms of both location and timestamps) and see who was running/riding/etc nearby around that same time. The outcry was enough that it was switched to being opt-in (in 2020 I think), but for a while all of the data was laid bare for people to trawl and misuse.
Sure, but many people want to use Strava for more than one purpose.
a) Analysis and tracking of your own personal goals. (Some of the tools are better than the stuff available on the device itself.)
b) Sharing and socialising some other activities.
You can be careful and only allow certain activities to be public but you'll make mistakes and eventually many people will just think "whatever, I'll just default to public and remember to hide the ones I don't want to be public" and then it's even easier to make mistakes.
Defaulting to "opt-in" is all well and good until a human makes a mistake.
imho with unusually sensitive things like precise location data it could just not let you opt in to making it all public, and make it much easier to share with specific named friends than to share on a public directory
I really don't understand these criticisms of Strava, it has excellent privacy controls so you can share as little or as much as you want. You can already choose to share your activities with only your friends (followers). Or keep your activities private or hide the location data.
It does but my point is that your settings are applied to all activities.
Here are a few examples that might help demonstrate my point:
I used to do parkrun regularly. I had no problem sharing my Strava activities for parkrun because me doing it wasn't a secret, nor was the location secret, nor was my time secret. All of these things could be found from the parkrun website once the results had come up. John Doe was at this location at 9am and ran this route with 400 others in a time of 26 minutes or whatever.
I was also part of a cycling club that did a regular "club run" on a Sunday. 5-15 of us all doing the same route. It was good for club morale for us all to upload our rides to help show how popular it was and encourage other club members to come along. They could see that we weren't going at a silly pace and that we stopped regularly to regroup as we had riders of all abilities and speeds riding with us.
But then I also helped out with my kids' running club at school, taking a bunch of 7-11 year olds on a 20 minute jog/run (depending on how quick they were) around the local area. This absolutely should not appear on Strava (public or not). The running club wasn't a secret (everyone at the school knew since they had the option of letting their kid do it) but that's a whole world of difference from having it public on Strava showing the usual start time, the various routes we used to take, where we stopped, etc. Privacy zones can help hide the start/end but that wouldn't help hide everything.
We just made sure that all of the parents who helped out knew not to even record it with their smartwatches. I just used to create a manual entry of "Morning run" with approximate distance and time. That was good enough for my training stats.
There's no one privacy setting that handles all of this. Whatever setting you use relies on me to manually adjust the activities that don't fit that setting. The problem is that humans are fallible, so remembering to make it private or hide the location data isn't entirely reliable. You're also at the mercy of Strava (or whatever) not doing something stupid and accidentally making private data visible due to some bug, glitch or leak.
Right, requiring human intervention to share a run (other than maybe with eg a specific small circle of mutual friends) seems like it solves all those problems, other than perhaps being annoyed that you forgot to manually share a run.
But at least that's a failure you can fix once you notice, as opposed to making something public that shouldn't have been. Letting people opt in to automatically sharing runs to the public just seems like something designed to get people to share stuff without thinking about it.
I'm saying something a bit different: that even letting people opt in to sharing every run they track publicly is just asking for trouble. It's setting people up for their information to be made public when they forget to turn it off or that they turned it on in the first place.
Maybe "automatically share everything to the globe" should just not be an option for sensitive data like this.
I think this is more about it coming from a higher authority than the school itself.
Many schools have similar bans but they don't get support from many of the pupils or their parents, as both groups have members who just believe the school is choosing to overstep its authority.
Now that it is a diktat from above, the school's job of enforcing it is much easier. They can just point to the relevant legislation/diktat and say that their hands are tied: if you disagree, here are the places you can go to voice your opinion; meanwhile we (as a school) have no choice but to apply the rules, etc.
Offloading part of the punishment onto the parents (having to get them to come in to the school to collect the phone) is going to help get the policy reinforced at home in most cases.
My kid's school had a similar policy. I didn't mind having to go out of my way to collect the phone and didn't pass any of that on to my kid; they were annoyed enough about having it confiscated that it only took a few times before they modified their behaviour accordingly.
Heh, my son goes to school in the next town over and I don't drive, which means it's either a 90 minute round trip by bus or £40 of taxi fares. I've made it abundantly clear to him that if I have to go into the school to get his phone back he's picking up the tab for my taxis.
Granted it won’t work for 100% of people but I’m sure it would work for lots of people.
Something as simple as a button you have to press to disable it is often enough of a barrier to prevent people from doing that as it makes the context switch from work to non-work more obvious than simply alt-tabbing to a different browser window.