Correct me if I'm wrong, but the article doesn't describe how 'classical' cryptography will survive quantum computers. It merely describes how quantum computers will break RSA and then lightly covers a potential quantum-proof public/private key cryptography scheme as an alternative.
This is a good point. I found the article's content disappointing, to be honest. I think it hit a bit of an uncanny valley - it was too shallow to be comprehensive for a technical math/computer science audience, but too deep (and emphasizing the wrong things) to cover the ground in a way that would be appropriate for a non-technical audience.
If I were to write an article like this, I would probably choose a more explicit audience from the outset, then take either a depth-first or breadth-first approach to the subject. A breadth-first approach would be good for a non-technical audience: here are the general types of cryptosystems, here are the ones threatened by quantum computers, here are the ones that are not, and here are the current proposals for quantum-resistant cryptosystems.
On the other hand, were I writing for a technical audience I would assume an understanding of why quantum computers threaten classical cryptography (and why e.g. symmetric encryption is mostly safe), then take a deeper look at each of the post-quantum proposals.
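As a rough illustration of that last point (my own sketch, not anything from the article), the standard approximations are that Shor's algorithm breaks RSA and elliptic-curve schemes outright, while Grover's search only roughly halves the effective strength of symmetric ciphers and hash functions:

```python
# Rough, illustrative comparison of pre- and post-quantum security levels.
# Numbers are standard textbook approximations, not exact figures.

schemes = {
    # name: classical security bits, whether Shor's algorithm breaks it,
    #       and whether Grover's search applies.
    "RSA-2048":           {"classical_bits": 112, "shor_breaks": True,  "grover_applies": False},
    "ECC P-256":          {"classical_bits": 128, "shor_breaks": True,  "grover_applies": False},
    "AES-128":            {"classical_bits": 128, "shor_breaks": False, "grover_applies": True},
    "AES-256":            {"classical_bits": 256, "shor_breaks": False, "grover_applies": True},
    "SHA-256 (preimage)": {"classical_bits": 256, "shor_breaks": False, "grover_applies": True},
}

for name, s in schemes.items():
    if s["shor_breaks"]:
        # Shor's algorithm solves factoring / discrete logs in polynomial time,
        # so RSA and ECC offer essentially no security against a large quantum computer.
        pq_bits = 0
    elif s["grover_applies"]:
        # Grover's search gives only a quadratic speedup, roughly halving the
        # effective key/preimage strength of symmetric ciphers and hashes.
        pq_bits = s["classical_bits"] // 2
    else:
        pq_bits = s["classical_bits"]
    print(f"{name:20s} classical ~{s['classical_bits']:3d} bits, post-quantum ~{pq_bits:3d} bits")
```

Which is why the usual advice is to double symmetric key lengths (e.g. move to AES-256) but to replace RSA/ECC entirely with post-quantum schemes.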
This will ensure all outgoing traffic is headed for the Tor network. It would prevent this vuln from being effective, as the outbound packets would simply be dropped.
The analogy holds to an extent. The odds of making money off bitcoin are several orders of magnitude higher than with a lottery ticket. When investing, if you want to make large gains, some of your money has to be in riskier assets.
Obviously don't put your life savings in bitcoin; anyone who does is not making an intelligent choice. But if you have money to spare that you're willing to lose, I wouldn't consider it a horrible investment.
This also happens with fiat currency in the form of lost or destroyed cash. The final supply of bitcoins is limited, but bitcoins can still be created using fractional reserve banking, no different than our current banking system.
Your claim about fractional reserve banking creating more Bitcoins is wrong. The banking system doesn't create central bank money; it creates a claim to it.
Fractional reserve banking doesn't create coins or notes, it creates credit (or bank money). These credits (IOUs) are liabilities of the bank that are denominated in central bank money (central bank liabilities). When a bank makes a loan, this does not in any way affect the central bank's balance sheet: there are no changes on the liability side of its balance sheet.
When the loan is withdrawn, this still won't change the central bank's liabilities, other than which bank holds which liability at the central bank. (Even that isn't necessarily the case; it depends on what type of payment the customer made with the loan and which payment system it cleared through: a gross settlement system, a multilateral net system, etc.)
The supply of central bank money depends on how central banks implement monetary policy. Everywhere in the world (and probably always), central banks will supply as much central bank money as the banking system demands.
Fractional reserve banking, when the bank credit is denominated in Bitcoin, doesn't create more Bitcoin; it just creates more claims denominated in Bitcoin. The supply of Bitcoin is independent of this.
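To make the accounting concrete, here is a minimal toy sketch (my own illustration, assuming a single simplified bank; none of the numbers or names come from the thread). A loan adds a matching asset and liability to the bank's books, so claims grow, but the stock of base money stays put:

```python
# Toy balance-sheet model: a bank loan creates a deposit (a claim on the bank),
# but the stock of base money (reserves, or actual bitcoin) never changes.

bank = {
    "assets":      {"reserves": 100, "loans": 0},  # base money held by the bank
    "liabilities": {"deposits": 100},              # customers' claims on the bank
}

def make_loan(bank, amount):
    """Extend a loan: the bank books a new asset (the loan) and a matching
    liability (a new deposit for the borrower). No base money is created."""
    bank["assets"]["loans"] += amount
    bank["liabilities"]["deposits"] += amount

base_money_before = bank["assets"]["reserves"]
claims_before = bank["liabilities"]["deposits"]

make_loan(bank, 50)

print("base money:", base_money_before, "->", bank["assets"]["reserves"])   # 100 -> 100
print("claims:    ", claims_before, "->", bank["liabilities"]["deposits"])  # 100 -> 150
```

Swap "reserves" for on-chain bitcoin held by a hypothetical Bitcoin-denominated bank and the same point holds: deposits denominated in bitcoin multiply, the coins themselves don't.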
Watched this the other day - a good layman's introduction to what computers actually do. Feynman has a serious knack for explaining complex topics in simple terms that anyone can understand.
I love to watch Feynman's lectures even when I have a fairly good understanding of the topic - such an entertaining lecturer.
Self-referential loops in the brain may indeed be the mechanism for our sense of self-awareness, but to go on and say this makes consciousness any less mysterious is a non sequitur.
How would any combination of neurons and the interconnections between them (feedback loops or not) explain the emergence of a subjective first-person experience, i.e. consciousness?
This might be a confusion of words and not reality. If you define consciousness as self-awareness, then we are talking about two different things.
My point is exactly that a subjective first-person experience is self-awareness. The root of all conscious experience is an awareness of our own awareness, a sense of our own senses, the difference between them, etc. I would challenge anyone to come up with a definition of conscious experience that does not, ultimately, boil down to a form of self-referential sensory input.
Bitcoin is a neutral technology; think of it like cash. Illegal things have always been bought with cash, but it doesn't mean we should get rid of cash altogether.
Regardless, even if we came to the collective decision that we wanted to get rid of bitcoin, it's not feasible due to its decentralized nature.
I used to agree with the "neutral technology" line of reasoning; however, my view has changed. Everything has an orientation to it, enabling or strengthening certain dynamics but not others. These characteristics are not static, since they depend on the broader context, and they can sometimes change rapidly and unpredictably -- yet they can be quite important and should be considered.
I would argue the concept of "perfect neutrality" is itself a non sequitur. When someone says something is very neutral, it seems to me they are actually noticing that it either has near-universal acceptance in the current mind-share or is simply inconsequential, such that no one really cares one way or another.
It reminds me of "inherent value" (the general philosophical concept, not the financial term with a very specific meaning), which a lot of thinkers find to be a misguided concept.
That's not to say we should ban bitcoin. And even if we wanted to, as you said, attempting to do so would be a rather absurd endeavor.
This seems true at face value. Consider the cutting-edge technology known as the knife, with an inherent bias for cutting things.
Dinner time. Killing time. ("Food is murder"?)
It is the context in which a technology is used that is the determining factor. A technology, imo, can be deemed directly culpable of ill effects IFF it permits no utility context other than one that results in morally or ethically unacceptable outcomes.
Even nuclear weapons can be used for good, you know. (Extinguishing fires, for example.)
> it doesn't mean we should get rid of cash altogether
I can't remember the last time I saw physical cash. The only people I know who still use cash are drug dealers. I'm not saying it should be banned, but it's almost gone in my country already.
No idea. The convenience is more important to me personally than the risk of being screwed over, but since everyone is using it, I presume there are people pushing back.
I think this sentiment will soon start to apply to universities. It seems inevitable that at some point in the near future there will be an online 'university' (for lack of a better word) whose graduates will be considered equal to, or even better than, those with a standard university education, particularly for tech-related degrees.
Universities have been a centralized source of accreditation for a long time. All it takes is for someone to figure out how to restrict graduation and filter for good candidates, using testing or some other means, to gain accreditation and industry acceptance of its graduates.
That's exactly why a lot of bootcamps have come into existence, along with a guaranteed job in the industry at the end. Though many employers hire university grads as a sort of "signal" for people who can work hard, think critically, finish what they started, etc.
And I'm not saying one is better than the other, just noticing this trend of bootcamps popping up everywhere to replace university CS/CE education.
Anecdotally, most people I know who have gone through a bootcamp have had major issues getting hired, and the ones who did were hired into support/sales-engineering roles rather than software engineering.
I've seen the same as well - mostly that they were taught one structured way to look at problems and only how to use specific tools, rather than why. There's definitely tradecraft that has to be learned beyond a bootcamp.
It already has for me. I won't hold it against someone, but it no longer indicates any basic level of knowledge.
This is based on my last batch of interns, who had master's degrees but couldn't handle Hello World. Their spoken English skills made it clear that they were completely incapable of understanding the lecturers.
Universities are a business: they are paid a lot to provide a piece of paper, so they provide it.
Western Governors University is a choice in the US, too. I am currently attending, and it's different from any other college I have gone to. All of the classes are competency-based and self-paced, meaning that theoretically you can get a bachelor's degree in 6 months.
Misleading title?