
As a matter of law, the EC may be right, but as a matter of user experience, the idea that the Android market suffers from not having enough OEM-installed software on devices is a bit of a stretch. In all the Android phones/tablets I have seen, there are a few shitty OEM apps that I have never found to be even close in quality to the google versions.

The EC claims that google have "denied European consumers the benefits of effective competition in the important mobile sphere", but I fear this will mean that cheap Android phones will have a shitty OEM-branded web browser and some random search engine link.


Yes, some phones will be junk, but it is up to consumers to choose, not up to monopolists.


It's not clear to me whether google is allowed to demand that phones without google apps not be called Android, or if that too would be considered anti-competitive.

If they must allow Android-branded phones that do not come with google apps, then the EC is making it harder for consumers to choose, by disallowing the normal use of a brand to signal what it is that you are buying. People may buy a cheap Android phone thinking they are getting what an Android phone currently is, and end up with a shittier product.

In a normal market, these OEM-branded phones would be sold as "ShitPhone OS 12.0" and it'd be clear that you have the option of paying slightly more for Android, a lot more for iOS, or going with ShitPhone OS 12.0.


But he did not purchase health insurance, he purchased travel insurance.

It seems to me that the whole thing is "I bought travel insurance and I thought I would be getting global health insurance, but it turns out it's really travel insurance."

I guess because the NHS is paid for through your taxes, it may not have been so blindingly obvious to someone in the UK that anything similar to what he is asking for would have cost at least "£300 per month" (and that is assuming he is young and healthy) instead of "£300 for 18 months".


> In reality, most claimants are going to spend much less than 1% of that £5 million limit before they get flown home. It only really applies to people who are at death's door and end up with a long stay in intensive care, or need a fully-staffed medical flight to get them home.

Exactly, but you're saying it like it's a bad thing instead of "this is just what travel insurance is and why it's so cheap."


I’m saying it’s both of those things. I’m grateful to the author that I found out this way.

I still think it’s marketed in a deliberately misleading way.


From his telling, the company was not very communicative when he contacted them, but, fundamentally, saying he should fly back to the UK for treatment and they'll cover the flight does not seem unreasonable.

He complains that "unless you are injured so horrifically that you cannot survive another moment without life saving surgery, odds are they’ll just tell you to fly home and have the surgery for free." But, to me, this is exactly what travel insurance is (which is why it's so cheap). Expecting fancy Singaporean hospitals unless it is strictly necessary seems totally unreasonable.

While they were not very communicative, he also basically decided not to get back to them on the vital "is he fit to fly?" question for a few days (he had been OK to fly to Singapore, after all), until after the surgery. There were a few days to go until the surgery, so that was enough time to get him back to the UK. Also, why hadn't he gotten in touch before? He texts pictures to friends, books flights, &c, but does not call the insurer or even ask somebody else to call on his behalf?

For what it's worth, I think he made the right decision in getting treatment in Singapore at his own expense, but that was still his decision.


His doctor said that he was not fit to fly back to the UK!


The incentive to say so is pretty clear: if he flies away, not only does the doctor lose a 15,000 customer, he can also be liable if anything goes wrong.

Maybe he was indeed in no shape to fly, but he should have argued it properly in those days. Even now, it seems more like "it would have been awful to fly economy" and not "it would have been dangerous".


You may argue the doctor gave the opinion in bad faith, but that's pure speculation.

As a patient what would you do? You have been given a medical opinion that it is unsafe to fly. I definitely don't think anything the guy did was unreasonable.


I never claimed it was in bad faith and I don't think it was. Considering incentives and conflicts of interest is perfectly normal and does not mean that someone is operating in bad faith.

I just claim that it's reasonable for the company not to take it at face value as the doctor is not impartial and may have been considering a different standard for "not fit to fly." I certainly do not take it at face value. Could he have been put on a plane with extra medical assistance, for example? That may still have been cheaper than surgery.

I think what the guy did was the best choice if he can afford it, but I also think it's not unreasonable for the company to say that this was not what travel insurance covers.


As an insurance company what would you do?

On the one hand, you have published medical standards saying that he is fit to fly. You also have your own doctors review the X-rays and find no reason he can't fly.

On the other hand, you have literally one sentence from an intern saying he isn't fit to fly.

It's a no brainer that you find him fit to fly and deny his claim.

>As a patient what would you do? You have been given a medical opinion that it is unsafe to fly. I definitely don't think anything the guy did was unreasonable

Personally, I go to my doctor and have him write a letter giving the specific reasons why I am unable to fly.


I only really comment on the economy seat because my cover included £2,000 of curtailment cover, but they weren't even giving me that. They were offering me a cheap seat out of goodwill. If I had accepted that, I was accepting that I had no claim.


>this is exactly what travel insurance is (which is why it's so cheap)

And that's one of the huge issues! Very often people don't read their policies, so they have misunderstandings about their coverage. This is a big problem!

I always read all of my policies, and I suggest everyone else do the same. I want to be very clear about what losses are covered and what my deductible will be before any losses occur. In fact, for legal documents, insurance policies are pretty easy to understand.

I really wish he included the text of his policy.

While I have sympathy for him, I think he got himself into a pickle when he flew to Singapore, then claimed he was unable to fly. Yes, Singapore is a million times closer to Indonesia than the UK, but he should have known that if he left Indonesia the insurance company would have expected him to return to the UK.


To be sure, travel insurance is very often a very good idea.

But it's travel insurance, not "whole-world, any-hospital health insurance" (which I am sure you can buy, but not at the same price).


I knew it might cause issues with my insurer, but the surgeon in Bali was on holiday for 1 or maybe 2 days and my finger was hanging off... Singapore was the closest place I could think of that would give me the best chance of saving it. And I'd do it again, in a heartbeat.


His flying from Bali to Singapore is totally understandable. He was in pain/shock and not advised by doctors to avoid flight.


The reply on the Uber blog is less "angry tweet" than "good peer review": it points to a very specific methodological weakness that has bearing on the final conclusions.


For some things, we do. But databases are not magical and setting up a good table/index system &c is also work and there is overhead.

Thus, if we are talking about (for example) a webservice where queries have a form that is known a priori, then a database is a good solution. If you have output data from your processing that you will be slicing and dicing in ways you cannot predict ahead of time, then they are not appropriate.

(Loading terabytes of data into a database takes a while, too.)
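To illustrate the trade-off: a database pays off precisely when the query shape is known up front, because you can build an index for it. A toy sketch with sqlite3 (table and column names are made up for the example):

```python
import sqlite3

# In-memory database with a hypothetical results table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (sample TEXT, gene TEXT, reads INTEGER)")
conn.executemany(
    "INSERT INTO results VALUES (?, ?, ?)",
    [("s1", "g1", 10), ("s1", "g2", 5), ("s2", "g1", 7)])

# The query shape is known a priori, so indexing the lookup column is
# worth the setup cost; for ad hoc slicing you'd pay this over and over
conn.execute("CREATE INDEX idx_gene ON results (gene)")
rows = conn.execute(
    "SELECT sample, reads FROM results WHERE gene = ?", ("g1",)).fetchall()
```

If instead every analysis slices the data along a different axis, the index-building (and bulk-loading) overhead keeps recurring, which is the case where flat files plus in-memory processing tend to win.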


You are commenting on the variant of the code that is fast enough that it doesn't matter.


While that may be true, my point is that it is almost certainly possible to make your code go faster than it is already, and also become more readable in the process.

And so saying that python is either slow or ugly and unreadable is perhaps an unfair characterization. I may be wrong here. I haven't benchmarked the code in question, but I think that even for the algorithm you're trying to do, with the special casing, that function could be significantly simplified.

Edit: I'd be curious to see example data that is passed into this function.
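For what it's worth, the kind of simultaneous speed-and-readability win I have in mind usually looks like replacing an explicit Python loop with an array operation. A toy sketch (made-up computation and data, nothing to do with the actual function in question):

```python
import numpy as np

def pairwise_diff_loop(xs):
    # Direct O(n^2) pure-Python loop: easy to write, slow on large inputs
    n = len(xs)
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += abs(xs[i] - xs[j])
    return total

def pairwise_diff_numpy(xs):
    # Same computation via broadcasting: shorter and much faster
    a = np.asarray(xs, dtype=float)
    return float(np.abs(a[:, None] - a[None, :]).sum())
```

Both compute the same value; the vectorized form pushes the loops into C while arguably also being easier to read.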


That may be the case. However, my point is that we started with a rather direct implementation of a formula in a paper. This was very easy to write but took hours on a test set (which we could extrapolate to taking weeks on real data!).

Then, I spent a few hours and ended up with that ugly code that now takes a few seconds (and is dominated by the whole analysis taking several minutes, so it would not be worth it even if you could potentially make this function take zero time).

Maybe with a few more hours, I could get both readability and speed, but that is not worth it (at this moment, at least).

*

The comment about the benchmark data being large is exactly my point: as datasets are growing faster than CPU speed, low-level performance matters more than it did a few years ago (at least if you are working, as I am, with these large data).


Right, and my point is that you could probably

1. Have gotten similar performance boosts elsewhere, meaning that you wouldn't have needed to refactor this function in the first place (though the implied 10,000x speedup suggests that may not be true; I can absolutely see the potential for 100x speedups in this code, depending on exactly what the input data is)

2. It's likely that there are more natural, idiomatic ways to implement the function you have in pandas. These would be both clearer and likely equally fast, possibly faster. (Heck, there are even ways to refactor the code you have to make it look a lot like the direct-from-the-paper implementation.)

In other words, this isn't (necessarily) a case of python having weak performance, it's a case of unidiomatic python having weak performance. This is true in any language, though. You can write unidiomatic code in any language, and more often than not it will be slower than the equivalent idiomatic approach (e.g., repeatedly applying `foldl` in Haskell). I'm not enough of an expert in pandas multi-level indexes to say this for certain, but I'd bet there are more efficient ways to do what you're doing from within pandas that look a lot less ugly and run similarly fast.

Granted, there's an argument to be made that the idiomatic way should be more obvious. But "uncommon pandas indexing tools should be more discoverable" is not the same as "python is unworkably slow".
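Without seeing the actual function I can only gesture at what I mean, but the usual idiomatic tool for duplicated index labels is a groupby on the index level, rather than an explicit loop over label groups. A toy sketch with made-up data (the real computation may not map onto this):

```python
import pandas as pd

# A Series whose index contains duplicated labels
s = pd.Series([1.0, 2.0, 3.0, 4.0], index=["a", "a", "b", "b"])

# Loop version: iterate over the unique labels by hand
by_hand = {k: s[s.index == k].sum() for k in s.index.unique()}

# Idiomatic version: let pandas group the duplicated labels
idiomatic = s.groupby(level=0).sum()
```

The groupby form stays inside pandas' fast paths and reads like a description of the operation rather than its mechanics.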


1. No, that function was the bottleneck, by far, and I can tell you that >10,000x was what we got between the initial version and the final one.

2. I don't care about faster at this point. The function is fast enough. Maybe there is some magic incantation of pandas that will be readable and compute the same values, but I will believe it when I see it. What I thought was more idiomatic was much slower.

I think this is more of a case of "the problem does not fit numpy/pandas' structure (because of how the duplicated indices need to be handled), so you end up with ugly code."


1. You don't get 10,000x speedups by changing languages. It's likely that this optimization would be necessary in any case.

2. You don't care about improving the code, but you did care enough to write an article saying that the language didn't fit your needs without actually doing the due diligence to check and see if the language fit your needs. That's the part that gets me.


"spending an hour figuring out what arcane incantation I need to pass to np.einsum to get the operation I want"

Yes, I have also had this experience, and I hate how, in the end, the code is very hard to read, while the for loop would probably have been trivial.


Every time I thought I needed einsum or similar arcane ops, I found that a Numba-optimized for loop did the job.


Numba is nice, but it's another large dependency to pull in. If I can avoid it I will.


I think this is part of my argument: as datasets grow faster than single-core CPU speed, performance matters more and more.


Yep, my Haskell usage is "conduit all the way down".

I even wrote up a few utilities to make use of multiple threads while working at a high level: https://hackage.haskell.org/package/conduit-algorithms-0.0.7...

