Here are my observations after doing Clojure for a bit more than a year, coming from two decades of JS.
It’s fast, both clj and cljs, but performance is non-deterministic, and as with most functional languages it can be hard to reason about. Profiling cljs is very hard.
Coming from the JS world I feel that tooling generally works, especially when it comes to build systems. Very happy to not have to deal with webpack inconsistencies. With regards to IDEs there’s Cursive for IntelliJ users and Calva for VS Code.
I can’t comment that much on Java interop. JS interop is a mild inconvenience.
With regards to the ecosystem, googleability is an issue, but it is tempered by the simplicity of both clj and cljs.
About types: there is a class of bugs they help avoid, and they do help with understanding the intent of the code you work with. The drawback is that they facilitate abstraction without helping much with reasoning about how a program actually runs.
Yes, cljs is a different language, but in practice it feels very much the same. I do however have a big issue with getting deterministic performance out of it. Keeping execution time for any frame below 16ms can be a challenge, and for some types of front-end work JS is to be preferred.
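For context on the 16ms figure: at 60 fps the browser gives each frame roughly 16.7ms of budget, and a GC pause or a long computation that blows past it shows up as a dropped frame. A minimal sketch of how one might watch for that from plain TypeScript (the helper names watchFrameBudget and onOverBudget are mine, not from any library):

    // Logs every frame that exceeds the ~16.7ms budget of a 60 fps display.
    function watchFrameBudget(onOverBudget: (ms: number) => void): void {
      const budgetMs = 1000 / 60; // ~16.7ms per frame
      let last = performance.now();
      const tick = (now: number): void => {
        const elapsed = now - last;
        if (elapsed > budgetMs) {
          onOverBudget(elapsed); // frame took longer than one display refresh
        }
        last = now;
        requestAnimationFrame(tick);
      };
      requestAnimationFrame(tick);
    }

    // Usage:
    // watchFrameBudget((ms) => console.warn(`Frame took ${ms.toFixed(1)}ms`));

The same measurement works for compiled cljs output, since that is ordinary JavaScript running in the same event loop.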
All of these things are quite insignificant compared to how fast a team can churn out good quality code using Clojure. I’ve never seen any team write correct, readable and fast code as quickly as the one I’m with at the moment.
It seems like some interpretations of quantum mechanics can be falsified with this method. Very cool! In fact it looks like it would be possible to probe my favorite solution to the measurement problem:
https://en.wikipedia.org/wiki/Objective-collapse_theory
In the late 19th century, people who went into science were actively discouraged from studying physics, since many believed there was nothing left to find out. Without quantum mechanics we would not have the transistor, and there would be no computers. The immense economic value delivered by fundamental research makes a 10 billion USD investment look like a microscopic peanut crumb compared to the potential value it can give. For comparison, the combined world defense budget in 2018 was around 1.8 trillion USD.
It isn't that we wouldn't have transistors but that we wouldn't understand them. We wouldn't have the good transistors necessary for modern technology.
There is a similar debate around Newtonian physics. How far might the industrial revolution have gone without a perfect understanding of how things move? You don't need Newton to build complex machines; look at any early windmill or waterwheel. You need Newton to understand machines well enough to build and run them efficiently. You don't always need theory before practice, but theory always makes practice better.
I found a Physics Stack Exchange thread addressing this.[1] The science of quantum mechanics was not necessary for its invention. I’m still uncertain of its direct application other than “at <32nm QM effects are dominant”.
Hum... That thread says a lot about diodes, but I couldn't find any such claim about transistors.
Yeah, somebody would probably have invented them at some point. Somebody would probably even have invented FETs, even if it took many decades more. MOSFETs are a completely different matter; I don't think anybody would have invented them without understanding how they work.
What everybody seems to forget is modern chemistry, materials engineering and medicine. The world would be a completely different place without quantum mechanics. Electronics is just a tiny part of the change.
MOSFETs are simpler than bipolar transistors. Putting it very crudely, MOSFETs are based on the same space charge principles as vacuum valves, except they happen to use doped silicon instead of free electrons.
Bipolar transistors are more complex, but they still work by controlling the bulk geometry of moving charge.
None of this requires QM, except in the very simple sense that you need to know what a bandgap is and how doping changes it. Beyond that, the "mechanical" details of the charge dynamics are just calculus - even more crudely, very very small plumbing.
QM effects are only relevant when the simple space charge models start to break down at very small geometries. Your "mechanical" space charge model becomes soft, noisy, and more complicated. Essentially the plumbing develops waves and ripples that depend on the geometry, and the edges of the pools and pipes can start to leak. And that's where QM finally starts to make a real difference.
So it really is where the particle phenomenon breaks down, thanks. Do you know if there are any kind of harmonic oscillator models used in this domain? I’m curious about a frequency-dependent characterization of the “energy” transport.
But if you can figure out diodes, you can figure out PN junctions. Well, a bipolar transistor is just two PN junctions. So the same claim should at least somewhat apply to transistors.
I agree, and the enormous amount of knowledge built up is lost over time when not actively used. Not investing in the next project is like no more space travel after we went to the Moon. We should always push these projects further for the sake of progress, especially when you see it in the light of the defense budget.
No, you are wrong. First of all, we would still have vacuum tubes. Secondly, FETs (their physics is simple, and does not even need that much quantum mechanics) were invented much earlier than BJTs (the quantum ones), and very possibly we would have been no more than 10-20 years behind where we are now.
The difference is, the discovery of quantum mechanics did not require billion-dollar investment in equipment and manpower (not until work began on nuclear weapons, anyway). Besides, current research in high-energy physics, while expensive, only brings scientific value and not much at all in the way of practical application (except, perhaps, as a side effect of building more and more sophisticated equipment - but that would be like saying that wars are good for medicine, because obviously direct investment in medicine would be more efficient than investment in war).
I think it's too dismissive to claim it will have zero practical applications. Often this type of fundamental research takes decades to result in something useful, in an unexpected way. We can only go where the roads lead us; who are we to dismiss the roads not taken?
What is more likely to revolutionise physics - spending $20bn on an incremental increase in collider energies, or spending $20bn funding a new generation of PhDs and postdocs exploring quantum gravity fundamentals?
At this point practical HEP shows every sign of being a boondoggle. There are plenty of hard theoretical questions that haven't been answered, but with a few exceptions they're outside of the mainstream. Research into them has been actively discouraged, except at a few locations.
Physics doesn't need more hardware, it needs more ideas - more intellectual diversity, and more creativity.
I'm not sure just increasing funding to theoretical physics would generate new ideas; why would they not just funnel it into the same "safe" programmes as they have for the last 25 years? There are some small "fringe science" programmes here and there, but we probably shouldn't pour all the money into those either.
The method itself, smashing ever more energetic particles into each other, makes it highly unlikely that any discovery will be applicable to human-sized energy regimes.
Current high-energy physics research has brought us lots of big-data advances, electronics advances and sensor advances, as well as the web itself. You dismiss too quickly the advances made when attacking large, hard problems.
>Wouldn’t direct investment in these areas be more efficient?
Often not, because you don't know what areas to choose. Having a big, hard goal to solve usually requires building pieces to solve the problem, and these pieces end up being useful in themselves. Plus the big goal is often achieved.
Mankind could have done a lot of the stuff discovered along the way to big goals, but didn't, until the big goals were attempted.
Plus, it's often easier to get funding as a nation-state for big goals. Little side projects are often not individually worth chasing. The internet is a good example - it could have been done by industry before the govt decided to chase it as a piece of a big goal, but it wasn't.
Yep - but that is not true for most high-energy physics. The incredibly short lifetimes and low coupling of most new particles make them mostly unusable for anything we can foresee being useful.
Electrons were useful because they have long lifetimes, are easy to make, and interact well. Higgs bosons are unlikely ever to be useful as technology, since they are so ephemeral. And they're certainly not giving immediate benefits.
>> How can you be sure that the new science won't have any practical applications?
Nobody is sure about anything. Nobody denies that high-energy physics might one day bring great things. The problem is cost. Other areas of science are much more likely to bring great things with far less money. If tens of billions are to be spent on a bigger microscope, let it be one that will give us better solar panels, or a direct treatment for viruses. Polishing the details of the standard model can wait.
This is just burden-shifting, and if it is the best argument that can be made, there would be essentially no case for a bigger accelerator at this point.
Things are not quite that bad, but the case is weaker now than it has ever been before.
We haven’t seen any evidence of that in the many years that have passed. We have good science, the Higgs and all, but, frankly, there’s little hope that energies comparable to those that existed at the beginning of the Universe would be of practical consequence to us (other than the destruction of the Earth).
If they do, it’s nothing we’ll ever be able to harness.
Why, then, should society at this day and age plow millions of man-hours and raw materials into proving it literally?
Perhaps when nanotechnology or AI can be put to the task, people will then be able to build a very specific machine to answer very specific questions.
But this is throwing spaghetti at the wall: for us it will be a moment of excitement, then having no clue how to make use of it, and a big mess to clean up.
A very simplified version of Austrian economics' boom-and-bust cycles would be that central banks provide money too cheaply, which leads to non-productive investments, resulting first in a boom and then in a bust when those investments don't pan out. Worth noting is that the empirical evidence for this is slim at best. I think a better explanation for the 2008 crisis is that commercial banks mismanaged the risk of mortgage-backed securities. And I have a hard time seeing the current crisis having anything to do with cheap money.
The biggest bang for the buck for me has been using functional techniques, where immutability is perhaps the most important thing. A good way to get started is to make a project with React/Redux.
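As a small illustration of the immutability point, here is a hypothetical counter reducer in TypeScript. It follows the usual Redux shape (state in, action in, new state out) but doesn't depend on the library itself; the State and Action types are made up for the example:

    // A reducer never mutates its input; it returns a fresh state value.
    type State = { count: number };
    type Action = { type: "increment" } | { type: "reset" };

    function reducer(state: State, action: Action): State {
      switch (action.type) {
        case "increment":
          return { ...state, count: state.count + 1 }; // new object, old state untouched
        case "reset":
          return { count: 0 };
        default:
          return state; // unreachable for this Action type, kept for safety
      }
    }

    // const next = reducer({ count: 1 }, { type: "increment" }); // => { count: 2 }

Because the old state object is never modified, previous states stay valid, which is what makes things like cheap change detection and time-travel debugging possible.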
I agree that high inflation is really bad, but during the last decade there have been several rounds of QE without causing particularly high inflation. The recent QE package from the Fed is about 2000 USD per person in the US; perhaps some of it could be dished out as a helicopter drop instead.
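As a rough sanity check on that per-person figure (the inputs are my assumptions, not from the comment: roughly 700 billion USD of announced asset purchases and a US population of about 330 million):

    \frac{7.0 \times 10^{11}\ \text{USD}}{3.3 \times 10^{8}\ \text{people}} \approx 2{,}100\ \text{USD per person}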
I worked in the online gambling industry for 15 years, and there are a large number of use cases on a gambling front end where you do financial arithmetic.
I have seen equality between two floats being used as the end condition for a count-up win animation. I don't think I have to explain what horrible thing that led to.
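To make the failure mode concrete, here is a hypothetical TypeScript sketch of that kind of count-up: the displayed amount is incremented by a float step, and an exact-equality end condition never fires because the increments don't accumulate exactly:

    // Buggy end condition: exact equality between floats.
    let shown = 0;
    const target = 1.0;
    const step = 0.1;
    for (let i = 0; i < 10; i++) shown += step;

    console.log(shown);            // 0.9999999999999999, not 1.0
    console.log(shown === target); // false -> the win animation never ends

    // Safer: compare against a tolerance, or better, count in integer cents.
    const done = Math.abs(shown - target) < 1e-9;
    console.log(done);             // true

Doing the arithmetic in integer minor units (cents) and only formatting to a decimal at render time avoids the problem entirely.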
I mean, technically you can do this in English as well. English is a Germanic language and compound words are fully legitimate in the language. The following don't exist in the colloquialisms or formal records of the language, but are grammatically correct:
The main difference in the languages is that modern English (especially from the 20th century on) has relied more on loanwords (another compound word) and hyphenations to augment its vocabulary. Love-worthy, for instance:
It's hard to count compound words as "words that don't exist in other languages", honestly. Meanwhile, there are perfectly acceptable words (compound or not) that actually give voice to something hard to describe in other languages. Which is a far more interesting concept (IMHO): "schadenfreude", for example, or, going the other way for many languages, "irony".
Interestingly here, "loanword" is a calque from the German "Lehnwort", which reminds me of my favourite language fact: "loanword" is a calque, and "calque" is a loanword.
I am not convinced that Wehmut translates to sorecourage. The -mut ending appears in other emotion words as well - Sanftmut and Schwermut, for example. I think it relates to words like "Gemüt" and corresponds to "mood".
It's the same in Danish, and I'd imagine Norwegian too. And like Swedish, "Sinneswandel" is missing.
The Danish for "Ursprung" is »udspring«, which literally means something more akin to 'out-spring', but we do have the »ur-« prefix in Danish, meaning origin/source. So »urskov« would be the 'origin forest', or jungle, i.e. the forest that everything else springs from.
In the same sense, you could - jokingly - coin the word »urvittighed«, meaning the 'original joke'. Compound nouns are extremely powerful.
> Interestingly enough the words exists in Swedish as well.
And this is no surprise. Low German was a major trade language of the Baltic region and Swedish absorbed an enormous number of Low German words either as direct borrowings or calques. Then due to the prestige of High German as a literary language in Northern Europe, Swedish calqued further words on German models.
That's not a direct translation as far as I can see. I'm not a native speaker, but I don't think säkt means anything, as opposed to Schuld, which has a clear meaning. The direct translation would be something like avskulda, but that's not a word AFAICS.
I like Robert Fisk, he is a hell of a solid correspondent, but in this case he is absolutely wrong.
The Douma gas attack has been investigated not only by the OPCW, but by organisations such as the UN and the Red Cross, and by pretty much all serious journalism, from the BBC and the NYTimes to The Times and the Wall Street Journal. They all agree that the Syrian Government gassed its own people in Douma.
The consensus is overwhelming; the wiki page on the incident lists 119 sources. I obviously haven't checked all of them, but most look credible to me.
The consensus became less overwhelming since you commented this. See:
“Expert Panel Finds Gaping Plot Holes In OPCW Report On Alleged Syrian Chemical Attack” by Caitlin Johnstone https://link.medium.com/7qYYVU2Z40
Back in 2012, when ISIS beheading videos were showing up on my Facebook newsfeed every week, I recall it was Syrian rebels who were brandishing chemical weapons - but the MSM never mentions it: https://youtu.be/TpIRRRuCEyg