I imagine the Hep C drugs are some of the more monetizable drugs, though, and will need to rake in more money to subsidize the development of niche drugs.
Those numbers are a sign of incredible waste and wrong incentives.
The incentives in the drug industry are towards developing patentable molecules and techniques, not towards discovering new ways to cure things. There is too much overlapping development, not actual drug discovery that provides a net benefit to society in exchange for the patent monopoly. When someone discovers a new chemical entity (NCE) and gets it approved, competitors spend massive amounts of money developing new molecular entities (NMEs) that do the same thing with a slightly different molecule.
Then those drug companies use evergreening and pay-for-delay tactics that add several billion dollars in costs annually.
Governments should look very critically at drug patents and start paying only for actual discoveries.
But sometimes new chemicals do cure things. Or at the very least treat them well enough to make life worth living. I'd much rather exist in a world with AIDS medicine than a world where you just die, even if I'd rather have a cure.
I take different medicines that improve my quality of life. Without one of them I would never be able to realize one of my most valuable potentials. I could sit here and complain that we don't yet have the medical knowledge to indefinitely prolong life (which I do), but at the same time I have the humility to appreciate that had I been born even a few decades earlier in the long history of humanity, I would have had a much lesser life.
Don't get me wrong, I'd love for pharma to take a more efficient and satisfactory path, but I definitely recognize how life is better thanks to it.
What a terrible analysis. Pretty sure that number is just the "Selling, marketing and administrative expenses" line from the 10-K, which lumps together all expenses outside of COGS, R&D, and a few other line items.
That's not just marketing as people know it. It's all the other costs of running a company too.
Except drugs are different from other products. If a drug is actually a good idea for patients, with no profit motive behind it, it will sell itself. Doctors should be aware of the drugs the big drug manufacturers are producing, and prescribe accordingly based on need. If you need to hawk your drug by plastering TV ads all over the place, or by having salespeople push it to hospitals / etc., it's arguably not a necessary drug.
> If a drug is actually a good idea for patients, with no profit motive behind it, it will sell itself.
This is completely false.
There's a once-daily pill that prevents contracting HIV, yet most of the people who have the greatest risk of contracting HIV (Black gay men) aren't taking it. Why? According to mountains of peer-reviewed research, the predominant factor is that they don't know it exists. That is literally a marketing problem.
For patients who are aware of the drug, the most common reported reason for not taking it is that they can't find doctors who know it exists (or how the treatment regimen works). Again, this is literally a marketing problem.
In the case of this drug, there are additional barriers for many of these groups beyond just knowing of its existence, but that's the first and biggest one. It is 100% false to say that a drug that is a good idea for patients will "sell itself".
I can only think of one or two examples of drugs that "sold themselves". The new HCV meds are a good example of a lot of pent-up demand even prior to approval. That said, there were still physicians out there who were not aware of the approval, so marketing still had work to do.
> I can only think of one or two examples of drugs that "sold themselves". The new HCV meds are a good example of a lot of pent-up demand even prior to approval. That said, there were still physicians out there who were not aware of the approval, so marketing still had work to do.
Exactly - incredible demand for them, and even still Sovaldi didn't sell itself.
> If a drug is actually a good idea for patients, with no profit motive behind it, it will sell itself.
This is entirely untrue.
The one thing I learned in the biotech space is that doctors range from those doing research, who know all about the cutting-edge drugs, all the way to doctors who are surprised to learn about a new drug that was approved a year ago. And there are way more of the second kind.
Changes to how conditions are treated actually happen really slowly. What often hinders adoption of new drugs is lack of information, and that's a key goal of marketing - actually letting doctors know something new is available.
And that's not to slam them. They often treat a ton of different diseases, and a new drug launch isn't the highest priority for them. They'll eventually learn about it, so marketing is an attempt to accelerate that.
Doctors don't just become magically aware of new drugs, it's not like every MD gets a newsletter a la "here's what shows are coming to Netflix". Treatments compete over the condition/disease just like any competitive market, even if a drug has a patent.
I assume they meant a non-promotional newsletter that isn’t from the manufacturer.
Those do exist, but not all doctors read them. However, those same doctors might spare 10 min to have a manufacturer rep stop by to answer questions.
I’ll also add that doctors aren’t dumb. They know anything from manufacturers is biased - they want to sell drugs. They usually rely on other physicians to give them info on real world experience with a drug.
They might mean best-case vs average-case vs worst-case. A worst-case hash table insertion might be O(n) (rehashing the table, or walking a full collision chain), and in the malicious case a sequence of n insertions might even cost O(n^2) (all table entries collide into a single linked list).
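As a hedged sketch of that malicious case: if every key hashes to the same value, each insert has to be compared against every entry already in the table, so n inserts cost O(n^2) in total. Simulated below with a hypothetical `BadKey` type whose `__hash__` is constant:

```python
# Hypothetical key type where every instance collides (constant hash).
class BadKey:
    def __init__(self, v):
        self.v = v

    def __hash__(self):
        return 0                    # all keys land in the same bucket

    def __eq__(self, other):
        return isinstance(other, BadKey) and self.v == other.v

table = {}
for i in range(100):
    table[BadKey(i)] = i            # each insert probes past all prior colliding keys

assert len(table) == 100            # distinct by __eq__, so all 100 survive
assert table[BadKey(5)] == 5        # lookup also walks the collision chain
```

This is exactly the degradation that hash-flooding attacks exploit, and why languages randomize their string hash seeds.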
mabbo probably means that the average insert time is O(1), but s/he's wrong and couldn't possibly be right. An O(1) average insertion time into a binary search (i.e., ordered) tree would mean O(n) to build the tree from n elements, which would give an O(n) comparison sort via an in-order traversal ... no can do. And the paper clearly says that the O(1) restructuring is in addition to the O(lg n) time to find the insertion point, which anyone familiar with BST algorithms would expect.
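A minimal sketch of why the search step dominates: even with no rebalancing at all, a BST insert must descend from the root to find the insertion point, doing one comparison per level - O(depth), i.e. O(lg n) when balanced. (This is generic unbalanced-BST code for illustration, not the paper's algorithm.)

```python
# Plain unbalanced BST insert: the cost is the descent, not the final link-in.
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    if root is None:
        return Node(key)
    node = root
    while True:                         # one comparison per level: O(depth)
        if key < node.key:
            if node.left is None:
                node.left = Node(key)   # attaching the node is O(1)
                return root
            node = node.left
        else:
            if node.right is None:
                node.right = Node(key)
                return root
            node = node.right

root = None
for k in [5, 3, 8, 1, 4]:
    root = insert(root, k)

def inorder(n):
    return [] if n is None else inorder(n.left) + [n.key] + inorder(n.right)

assert inorder(root) == [1, 3, 4, 5, 8]   # in-order traversal yields sorted order
```

The final line is the point of the comparison-sort argument: if inserts were O(1), this loop plus the traversal would sort n keys in O(n) comparisons, contradicting the Omega(n lg n) lower bound.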
For me, it's easier to remember that "aliases" acts like a dictionary, so I can treat it like any other dictionary. With "unalias" I have to remember the command. Since I rarely use unalias (a few times over a period of years), I typically have to search the web for how to remove an alias.
And if it really bothers you, I'm pretty sure it's trivial to define an unalias function in Xonsh that will act as you expect (without needing parentheses).
It's part of the beauty of Xonsh - many such commands you're used to can be brought into it by writing fairly simple Python code.
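As a hedged sketch of that point: xonsh exposes "aliases" as a dict-like mapping, so removing an alias is ordinary dict deletion, and an unalias helper is a one-liner. A plain Python dict stands in for xonsh's aliases object below; the real object is assumed to support the same mapping operations.

```python
# Plain dict standing in for xonsh's dict-like `aliases` mapping (assumption:
# the real object supports the same standard mapping operations).
aliases = {}

aliases['gs'] = 'git status'    # define an alias
del aliases['gs']               # remove it -- ordinary dict deletion, no unalias needed

# A hypothetical unalias helper is just a thin wrapper:
def unalias(name):
    aliases.pop(name, None)     # quietly ignore unknown names

aliases['ll'] = 'ls -l'
unalias('ll')
assert 'll' not in aliases
```

In an actual xonsh session you would skip the first line and operate on the built-in aliases object directly.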
Grad school was one of my best decisions. I went to a great school that was what I call the Goldilocks size: big enough to have a great faculty, equipment, and decent funding, but small enough that collaboration was the norm and the crazy horror stories of maniacal hours and/or cutthroat competition were normally self-induced. My PI was an incredibly good guy and is still a close friend. I met my business partner and co-founder while working with him in the lab, and we're now building a company that expands on the work we did in grad school.
That being said, I saw plenty of people not having the experience I did. This was almost always because i) they didn't really like research and didn't know it until they were there, or ii) they picked a PI (PI = professor/boss) that was a really bad match for their work style and personality. I would say finding a lab & PI that matches your personal expectations about the PhD is more important than the research focus. Don't choose something you'll hate learning about, but ultimately the PhD can be more about learning how to teach yourself than about the specific skills you pick up during research.
What would interest me much more is whether y'all liked studying or university life before that. For me, I originally started studying CS because I wanted to learn stuff (I was already working as a programmer) and hadn't ruled out pursuing a PhD afterwards per se. But the longer I was at university (German Diploma, 13 semesters for me, 9 minimum), the more I couldn't wait to leave - so even the thought of staying went away quite quickly, although I wasn't "in academia" per se. I'm still not sure if I'm just more on the practical and pragmatic side of problem-solving and less into research.
I didn’t mind being an undergrad. But that was largely because of my friends and being an adult away from home for the first time. My friends and I built robots in our dorm room with our own money because the university didn’t have a program and wasn’t willing to accommodate us in any way. I enjoyed learning in class but it was always so rushed and stressful - you were always working up to an exam, then making it past and preparing for the next one. There was never any time to breathe.
Grad school was far, far better. Completely different league.
This article is kind of whack... Yes, modern superconducting materials are working at warmer and warmer temperatures. No, we're not at room temperature yet. And no, there's not a clear path to simulating band gaps for extended-structure materials with quantum computing.
Taking frontend framework advice from a website that is arguably uglier and more poorly laid out than the bulk of GeoCities pages is a tough pill to swallow.
Aside from the Jakob Nielsen-y vibe to the color scheme (intentional), it's not that much different than any other developer blog out there -- but if that's the angle you want to take to discredit what I am saying, fine.
Well, to be frank, I honestly don't know what to think. I've had people telling me it looks great, people telling me it looks awful -- it does look amazing to me: http://imgur.com/a/yMTz8
Having said that, I'll definitely work on an update that's readable to a wider audience.