kayaeb's comments

I'm not positive that's true. I think the most challenging thing for MOND is that basically no one will touch it unless they already have tenure because it's a death sentence.

I saw Pavel Kroupa (a big name in Milgromian gravity) present in Heidelberg (a big concentration of astro); at the time Volker Springel (author of the widely used LCDM simulation code "GADGET") was there, and the Illustris simulation suite (a major LCDM project) had just been rolled out. And Pavel basically got heckled (in a very erudite and respectful way, but with constant interruptions from the LCDM-majority audience).

But Pavel had one slide, which I can't find now, listing something like 72 different problems that LCDM had not solved (OK, 72 is an exaggeration; you can trawl his website for mentions of a lot of them: https://astro.uni-bonn.de/~pavel/kroupa_SciLogs.html).

And like, Pavel's a big boy, he has some of the most cited papers in all of astronomy, he's got tenure, and he's set, so he can take it and not care. But grad students / postdocs I imagine would constantly have their work politely ignored and get shunted into underfunded groups.

I'm just trying to say that LCDM has things it can't explain, and MOND has things it can't explain, but the resources behind each theory are seriously lopsided, so LCDM can frequently "tweak" itself to solve problems in a way MOND just doesn't have the time or resources to match. For example, disk formation in LCDM models used to be impossible until groups had the supercomputing resources for the required resolution, and they found that the feedback coefficient could (a) not promote disk growth, (b) promote disk growth, or (c) destroy disk growth, depending on how much they cranked it up. That's NOT a triumph of LCDM making an amazing replication of the observation; that's some grad student in a lab with enough CPU to tweak a meta-parameter until it looks good.

Also the CMB is extremely tightly constrained... and yet the big tightly-constraining measurements disagree: the Hubble constant inferred from the WMAP/Planck CMB fits sits more than 3 sigma away from the local distance-ladder value anchored by Gaia parallaxes, so... perhaps too tightly constrained.

Edit: Found it, here is the great astronomical bloodbath of 2014: LCDM (Springel and Rix) vs MOND (Kroupa). Great watch. https://www.youtube.com/watch?v=UPVGDXNSBZM


My experience from MOND talks is that a proponent of MOND gets up, explains their theory, and then starts explaining how their theory explains the rotation curve of galaxies.

They're then confronted with a bunch of really basic questions from the audience: how does your theory explain the spatial spectrum of anisotropies in the Cosmic Microwave Background? Is your theory consistent with Big Bang Nucleosynthesis? Can your theory explain Weak Lensing measurements? How does your theory deal with the Bullet Cluster? The answer is then generally, "I'm not sure, but I'm working on it." That causes all the astrophysicists in the room to lose interest. Standard cosmology explains all of these basic observations with a minimal set of assumptions. If your theory can't or doesn't explain the most basic set of observations, and there's another theory that does, why should I care about your theory?


This reminds me of the old joke: when an academic says "That's a good question!", you should hear "I don't know the answer". Likewise, "I'm working on it" means "no".


Imagine what proponents of heliocentrism such as Galileo had to face explaining their theory and how it explains simple elliptical orbits for the planets rather than the strange curly orbits used by the geocentrists.

They'd be confronted with basic questions from the audience: Why would things fall down if the Earth were not the center? (That was before Newton.) Wouldn't we see parallax in the stars? (Stars are much further away than was believed at the time.) Wouldn't we feel it if the Earth turned so fast? "Maybe that explains the tides," said Galileo (but it doesn't).

The previous model also explained the observations quite well at the time. Why should we care about another theory? (God works in mysterious ways.)

I'm not saying that MOND is correct ("they also laughed at Bozo the Clown"). Just that the fact that there are some unexplained missing pieces does not mean one should reject it so quickly.


Comparing oneself to Galileo is generally viewed as a sign of crackpottery, especially if it's used as a response to legitimate criticism.

> Just that the fact that there are some unexplained missing pieces does not mean one should reject it so quickly.

It's not just one missing piece. It's a whole series of basic properties of the observed universe. Most MOND theories are tailored to match one particular observation, but fail to match everything else. Until there's a MOND theory that matches a basic set of observations (like CMB anisotropies and the large-scale structure of the Universe, the ratios of abundances of the light elements, weak lensing measurements, etc.), MOND is simply uninteresting to most astrophysicists.


> Imagine what proponents of heliocentrism such as Galileo had to face explaining their theory and how it explains simple elliptical orbits for the planets rather than the strange curly orbits used by the geocentrists.

They did. In fact, the audience was far more brutal. And it not only delayed the progress of physics, it also destroyed the local research, leading to the entire community being rebuilt in England.

But, well, as you said, we always have to remember that they also laughed at Bozo the Clown.


It's also worth pointing out that LCDM is a cosmological framework. MOND (typically) has no cosmology.

And even if MOND is correct, you likely still need something like DM to explain clusters (particularly the Bullet Cluster). It has also been included in the paper posted by OP as a scalar field at early times, which then washes out at late times.

It is true though, MOND is hard to touch as an early career researcher!


I'm not entirely sure which one of the two, Dark Matter or MOND, is conceptually the bigger disaster. Dark matter pretty much sounds like the invisible ether that was suspected to carry EM waves. A physical theory that depends on the scale at which you look at space seems similarly awkward to me.


You could have said the same about the neutrino after Pauli introduced it in 1930 to explain the missing energy in beta decay. The alternative could have been giving up on conservation of energy-momentum, but if you have a continuous energy spectrum in an apparent two-body decay, it makes sense to think there's a third body there that makes it all work as expected, even if you're unable to detect it at the time.

It took 26 years to confirm, and that's thanks to having man-made high-flux sources. Detection of solar neutrinos had to wait until the 60s. Funnily, there was a puzzle with those too, as around two thirds of the ones you'd expect seemed not to be there (again). That mismatch took another 40 years or so to resolve, and now we know that neutrinos are indeed there and that they show flavour oscillation; that's why, if your experiment is looking for one particular leptonic flavour, you're missing the other two.

So this is not the first time that "there must be something there I can't see, yet can say something about, so that it all fits together" has done the job. Hopefully it won't be the last time.


Very nice breakdown of the history of the neutrino! I fully agree. Whether a theory is a conceptual disaster or not is completely unrelated to its success. And I totally get that the hope of physicists is that such theories continue to be successful, since that means new physics to be discovered. But still, if there is a theory that manages to explain the observations without postulating an otherwise unmeasurable quantity, Ockham's razor tells you it would be reasonable to prefer it.


The main difference is that the aether was only ever a theoretical construct, whereas we have lots of indirect observational evidence that "dark matter" is a phenomenon that really exists. We don't know what it consists of, but it certainly seems like there is more of it in some places than others, and it has mass and momentum (see e.g. the Bullet Cluster that was already mentioned in this thread), so "matter" seems as good a name for it as any.


OK, I would be curious about evidence that dark matter has momentum, because my favorite theory at the moment is that spacetime itself has a certain topology on large scales, one that isn't tied to any masses. If someone could show that dark matter has momentum, I would reconsider, but I don't think that's what the Bullet Cluster shows.

Edit: even if it turns out that dark matter is the best theory, it would still be a conceptual disaster. But so is QM. Nature seems not to care what we find conceptually appealing.


Quantum mechanics is incredibly elegant and appealing as a theory. It's just counter-intuitive.


Is this why there are a bazillion theories of the foundations of QM? I get that the mathematics is incredibly elegant, but the concepts behind the mathematics are not really "understandable", to quote Feynman.


The Many-Worlds Interpretation is the simplest one, because it makes no assumptions beyond, "The basic rules of Quantum Mechanics are correct." It's also very counter-intuitive, but very elegant.


Is it though?

Renewables that are popular now, like solar and wind, are weather-dependent, which means the grid needs to handle variable input, which it's not designed to do and would need major fixing. It also means you need to store the energy for when it's not abundant; Germany does this by pumping water into mountain lakes to produce hydro on demand, but they've run out of lakes, so you need batteries. Grid-scale batteries need carbon-expensive materials and have their own plethora of problems.

These two problems are only feasibly conquerable on a reasonable timescale by a few countries; for the rest they are prohibitively expensive, far beyond the small subsidies proposed in things like the Paris Accord. Now, sure, the USA and Europe could implement these, but it's not going to fix the problem if Asia and Africa don't, and it's going to add significant economic and geopolitical pressure between the nation-blocs that are hamstringing themselves and those that aren't.

Hydro has serious geopolitical issues as fresh-water supplies grow more strategically contested; consider Egypt and Ethiopia right now with the Nile, or China's snow-seeding cannons on the Tibetan Plateau to capture water before it reaches India.

Tidal has extremely short lifespans; any moving system in salt water is difficult to keep going for more than a few years.

Geothermal is good, but it's also one of the most expensive options and a bit of a "slow burner"; you're not going to power anything serious with it without sinking billions into the plant.

The only real solution I see is nuclear, but that's a naughty word for some reason.


As near-future sci-fi options go, from 100 ft up I don't see anything too problematic with a hydrogen economy, except for transport infrastructure.

Or we stick with carbon-based fuels, except they're synthesized from atmospheric carbon in regions with plenty of solar energy.

These approaches surely have challenges, but I'd like to know: what are the biggest ones?


Ammonia is more practical than hydrogen. Just not as sexy. Should be rebranded as the "hydrogen-nitrogen economy."


Ammonia has issues like being toxic and highly volatile.

I think DME has some potential as an alternative fuel because it can be produced through CO2-to-DME hydrogenation or from syngas/manure, and it burns in cleaner diesel engines (no particulate emissions).
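
To spell out the chemistry (from memory, so treat the details as a sketch): the usual CO2 route goes through methanol, CO2 + 3 H2 -> CH3OH + H2O, followed by dehydration, 2 CH3OH -> CH3OCH3 + H2O, for an overall 2 CO2 + 6 H2 -> CH3OCH3 + 3 H2O. DME has no carbon-carbon bonds, which is why it burns with essentially no soot.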


> Renewables that are popular now, like solar and wind, are weather-dependent, which means the grid needs to handle variable input, which it's not designed to do and would need major fixing.

Non-renewable plants have variable output even if it is not dependent on weather, and the grid in fact deals with variations in total and per-plant inputs whether or not solar or wind is attached. If there are problems with that, the need to address them is independent of the use of solar and wind.


Not on the scale of "we need to move the bulk of the energy from Texas to New York for 3 hours, then when the sun sets shift state-level power supply routing from the cornbelt to cover the drop."

Handling a plant going offline is totally different from the logistics of shuttling variable regions of production across the entire country.


> Not on the scale of "we need to move the bulk of the energy from Texas to New York for 3 hours, then when the sun sets shift state-level power supply routing from the cornbelt to cover the drop."

Sure, but that's why you favor a regionally balanced mix of renewables, with regional peaker plants (which may not be clean/renewable, but are better than relying on dirty sources for your core needs), not geographic concentration of single sources.


> The only real solution I see is nuclear, but that's a naughty word for some reason.

I am sure that if you put your mind to it you can think of some reasons people might feel uneasy about nuclear power. It’s not like incidents involving it are that far in the past.


If you look up deaths by energy production method, nuclear is far and away the safest, in every single country, by orders of magnitude; even wind is deadlier.


The claims that solar and wind are deadlier use cooked statistics (very old numbers for wind, and dodgy assumptions about rooftop solar that don't apply to the utility-scale fields that are now dominating).


I'm not sure if "accidents" count as "cooked" statistics... modern nuclear plants are much safer than 20th-century ones as well, so if we're updating numbers I think nuclear is still an extremely safe contender (and if you split nuclear deaths by country, the USA is an order of magnitude safer again than nuclear overall). Maybe I misinterpret what you mean by cooked, though; I'd be interested in reading more about it if you have a good piece. Wind also relies on rare-earth permanent magnets; I'm not sure what their carbon footprint is compared to nuclear fuel, though it would probably be lower in the future if, say, CA/OZ stepped into the rare-earth supply chain.

And new types, like thorium salt reactors, don't have the classical issues like uranium waste products, and have a much, much smaller possibility of runaway reactions. And France has been "5 years away" from fusion reactors for like 30 years on a shoestring budget; with real money that could be a possibility in maybe another 30 years, and it also dodges all the horror-story problems.

Overall, I agree: modern solar, especially reflector-steam plants, shouldn't be lumped in with silicon panels, especially when talking about total carbon footprint. But I do think nuclear carries a lot of unfair baggage that has kept us on oil for way longer than we should have been.


Right now, yes. But we're creating waste that remains harmful for thousands of years and storing it in facilities with a 100-year service life.

It's the same kind of "leaving problems to the next generation" of current fossil-fuel use except this one is a lot more generations away.


This isn't the case with thorium salt reactors or, if we had enough funding to finally crack it, fusion.


Renewables are crony capitalism. Without consistent government subsidies, incentives, grants, etc., they are not profitable.


No, renewables are public goods as they have positive net externalities, that is, a substantial share of the benefits of their use accrues to people outside the transaction. Well, relative to fossil fuels, at least; it's more accurate to say that fossil fuels are public harms with a substantial net negative externality.

Government action in the form of subsidizing the former and/or taxing the latter is necessary to internalize the externalities, so that the relative net cost (benefit) is borne by (accrues to) the participants in the transaction.

https://en.m.wikipedia.org/wiki/Pigovian_tax


Fossil fuels are subsidized because the users don't pay the true cost.


Yeah they are also directly subsidized.


OCD isn't double-checking your line-spacing in a document, it's being unable to leave the house in under 45 minutes because you check the stove twenty times, then the door lock twenty, then turn around 5 minutes down the street to check again, then make a deal with yourself that you'll check the stove 3 times in a row and then not allow yourself to do it again, only you do it again anyway and finally take a picture of it so you can discreetly check the photo on your phone when you're out on the date you showed up half an hour late to.


This really rubbed me the wrong way; the closing sentence sums it up: "What if a doctoral program's prestige arose, in part, from the way that it treated its students? We should dare to dream of such a thing."

There is a huge problem in academia where the number of PhDs is growing exponentially, and the number of positions is staying basically flat. I strongly disagree with this "participation award" mentality of "you deserve it." It's... leading a lot of people to make really stupid career decisions and I think a lot of people are doing it just to get this gilded club to bludgeon people who don't have PhDs -- that's the only reason I can think of for a person to hope for an "easy" PhD experience.

My advisor being rough at the right times (and in good spirit) was absolutely essential to the defense, which is essentially gladiatorial combat against your father (well, mother in my case), when both you and your advisor hope you can defeat the master in your specific realm. I would not have been able to do it if my advisor pranced about making everything pleasant and easy and convenient.


Protip: if your prof is tenured and still first author on more than 2 papers a year, they're either siphoning their students' work or setting you up for pipeline positions (which are not horrible, but be aware that's what's up).

I've never had a prof insert my work into their own papers; they've hooked me up to contribute to other teams, but with us it was always "my" project.

If you get a chance to see them present at a conference (lots of these are on video online nowadays), check if they specifically mention their students in the presentation; that's a green flag.


Good observation.

Writing as second author was a very positive factor very early on. I was an assistant to a research fellow from my second year at uni, mostly because of coding skills. Pretty soon they encouraged me to write a paragraph or two about the implementation details, and I got added to the author list. I felt that was appropriate and a great kickstart to a research career (I got my first MIT Press journal credit before my bachelor's).

I agree, however, that professors growth-hacking their publication lists with student labor is indeed problematic, and looking at first/second-author credits is a good way to flag such behavior.


Really depends on your mentor. I've seen some gross abuses where the union was absolutely essential.

Also the union hooked us up with BBQs and whatever, so that's nice.


What bargaining power do you bring in this situation?

Give us what we want or... we'll quit our degrees and throw away years of our lives?


>Give us what we want or... we'll quit our degrees and throw away years of our lives?

Give us what we want or you have no teachers and your research output drops to near-zero. Most of the actual work of scientific research is done by what we politely euphemize as "early career researchers" or "junior authors": grad-students and postdocs. Tenured and tenure-track professors don't do the bulk of the bench-work, even when they really enjoy it.


But if they don't do any research they will kill their own careers as well.


Kinda like how when you're a worker, not doing any work kills your career. Every strike is a risk.


I disagree a bit about freedom and flexibility in switching faculty, and I disagree strongly about the phrasing of "willing" to work with x faculty on y problem.

I don't care how smart a student is, they are in no way prepared to select a project that will lead to interesting findings, which is a must if you plan to continue in any capacity. A good mentor will have a project with proprietary data that has a 95% chance of successful publication. That way the student won't get scooped; not through any fault of their own, it's just that a PhD student, even a good one, can't compete speed-wise with someone like me who already has a code library built up to do complex analyses. This is why proprietary data is essential (that, or an extremely good prof who has an idea they are confident no one else is on).

My projects were a mix: I had one with proprietary data and one that was a legit eureka finding. But as you say, most profs just aren't good enough to pull off these eurekas (not an insult, 90% of science is gruntwork), so they need the cloistered data playground that the child student can crash around in until they manage to build their horrible little sandcastle. That data depends on funding, which depends on a proposal, which depends on a promise to do project X on data Y, and the prof is going to get reamed next time they apply for funding if student Z decided it wasn't "fun" and went to some other prof.

Caveat: we had one Turkish guy who was straight up getting academically abused by his advisor (like literally timing his lunch breaks, sending spies to make sure he attended class, super weird shit), and thank the heavens there was judicial oversight in place for him to get a new placement.

Caveat the second: I thrive in a low-input environment, so my prof just put me in the sandbox, and I built the most magnificent and beautiful sandcastle the world has ever seen and roared my way to a whopping seven citations (lol). My prof was also the director, though, and the low-input environment wasn't exactly optional, and not everyone did well in it.


FAST can also move its "mirror": the shell has some huge number of triangular sections on wires, so it should be able to point even further off zenith than Arecibo, which has a fixed reflector. Both have adjustable focal points (the sensor head) on wires like you describe.


FAST is actually having difficulty hiring the smartest minds for on-site roles (specialists like an operations director) because it's in a jungle in small-town China.

But yes, overall China is executing an excellent "brain drain" (not meant in a negative sense) by being willing to invest in these "keystone" projects when other countries are tightening budgets.


Telescopes generally have this problem, though, because it sucks to live out in the middle of nowhere and you typically want to bring your family, which means you need cities nearby to support things like schools and recreation. Most labs/research sites I know are like this. China Lake is an exception, but even Groom Lake has Vegas near it.


Not sure why you're being downvoted.

This is correct to an extent: the surface of a mirror needs to be polished such that irregularities in the reflector are small relative to the wavelength being observed (I think it's the diffraction limit equation, not positive atm). Radio telescopes (with cm wavelengths) require much less precision than optical ones (with sub-micron wavelengths). This telescope, FAST, is actually made out of a collection of triangular (I think) sheets arranged into a kind of dome; if it were observing optical light it would act like a disco ball instead of a parabolic mirror. Arecibo is literally a hole in the ground with rocks and crap on the reflector.
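
To put rough numbers on that (back-of-the-envelope, my own figures; the usual rule of thumb is that surface errors should stay below roughly 1/16 of the observing wavelength, with the Ruze equation giving the exact gain loss):

  # Rough illustration of the lambda/16 rule of thumb for reflector surface accuracy.
  # Numbers are illustrative, not actual specs for FAST or Arecibo.
  for name, lam in [("21 cm hydrogen line (radio)", 0.21), ("visible light (~500 nm)", 500e-9)]:
      tolerance = lam / 16  # metres; how large surface bumps can be before they smear the beam
      print(f"{name}: surface must be true to roughly {tolerance:.1e} m")

That works out to about a centimetre of allowed slop for a 21 cm radio dish versus tens of nanometres for an optical mirror, roughly five orders of magnitude of difference, which is why panels fluttering in the breeze are survivable at radio wavelengths.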

This thing actually had "first light" observations with several holes in it, from triangular sections that were fritzing and just kind of fluttering in the breeze. The first radio telescopes were literally built out of post-war trash just kind of rigged up in the backyard. There's a major array going in in South Africa where the antennas are like Christmas-tree wires just kind of hammered into the ground; one of the PIs showed video at a colloquium I attended of his 10-year-old kids setting them up. If anything in modern science can be called "low specification", it's radio telescopes.

