It sounds like asking CS PhDs to do a world record speed run. I wouldn't be surprised if the people best suited to the task aren't the type to get onto "a vetted list".
This has nothing to do with USB-C; it's the minimum design voltage of your lithium-ion battery pack. In this case you have a 4-cell pack, and if the cells drop below 2.895V that means they're physically f*cked and HP would like to sell you a new battery. (Sometimes that can be fixed by trickle charging, depending on how badly f*cked the battery is.)
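For the arithmetic: a 4-cell (4S) pack's minimum design voltage is just the per-cell cutoff times the series cell count. A quick sketch, using the 2.895V per-cell figure quoted above:

```python
# 4 cells in series: pack cutoff = per-cell cutoff * cell count.
cells = 4
cell_cutoff_v = 2.895              # per-cell minimum quoted above
pack_cutoff_v = cells * cell_cutoff_v
print(pack_cutoff_v)               # -> 11.58 (volts, pack minimum)
```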
If your laptop's USB-C circuitry were built for it, you could charge it from 5V. (Slowly, of course.) It's not even that much of a stretch given laptops are built with "NVDC"¹ power systems, and any charger input goes into a buck-boost voltage regulator anyway.
What you're seeing are the speeds of the various cache tiers (RAM, intermediate SLC cache, etc.). It cannot write to its main flash memory that fast. While to the user it looks like they just wrote 10 GiB in a single second, the SSD is internally still busy for another ~10 seconds persisting that data. The actual sustained write speed of top-shelf consumer-grade SSDs these days is somewhere in the vicinity of 1.5 GiB/s; most models top out at half of that or less.
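A toy model of what's going on, with invented but representative numbers matching the 10 GiB example above: the burst lands in a fast cache tier at one speed while the backing flash drains it at a much lower one.

```python
# Toy drive model (numbers invented for illustration).
cache_speed_gibs = 10.0   # GiB/s absorbed by the RAM/SLC cache tiers
flash_speed_gibs = 1.0    # GiB/s sustained to the backing flash

write_gib = 10
apparent_s = write_gib / cache_speed_gibs   # what the user/benchmark sees
drain_s = write_gib / flash_speed_gibs      # background persist time
print(apparent_s, drain_s)                  # -> 1.0 10.0
```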
I bought this one when upgrading my desktop, and it indeed delivers what it promises: 14.5 GB/s on my tiny random desktop. It's impressive. Everything feels so instantaneous, my Linux desktop finally feels like a Mac :)
That's certainly impossible, as even USB4 is only 40 Gb/s (~5 GB/s), and of that you could only expect to get 32 Gb/s (~4 GB/s), or realistically even less due to overhead.
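Spelling out the unit conversion (the 32 Gb/s usable figure is the estimate above, not a spec number):

```python
# Divide by 8 to convert gigabits/s to gigabytes/s.
line_rate_gbit = 40   # USB4 line rate
usable_gbit = 32      # rough usable-data estimate from the comment above
print(line_rate_gbit / 8, usable_gbit / 8)   # -> 5.0 4.0 (GB/s)
```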
It is probably the speed of it being read into RAM.
Try entering sync right after copying to see how long it really takes
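A sketch of that check in Python, where os.sync() makes the same call as typing sync in a shell (Unix only; the temp-file path here stands in for the SSD under test):

```python
import os
import tempfile
import time

# Write a file; open()/write() typically return once the data is in the
# OS page cache, not once it's actually on disk.
path = os.path.join(tempfile.mkdtemp(), "big.bin")
with open(path, "wb") as f:
    f.write(b"\0" * (64 * 1024 * 1024))   # 64 MiB

# os.sync() blocks until dirty pages have been handed to the device, so
# its duration approximates the real persist time.
t0 = time.monotonic()
os.sync()
elapsed = time.monotonic() - t0
print(f"flush took {elapsed:.3f} s")
os.remove(path)
```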
I have one of these, though I'm using it with a USB 3.x port, as that's what my desktop has. For me it's working fine, and for others with actual USB4 ports it seems to be working properly too.
Because TBB has javascript on by default, turning it off increases your signature. It would be better if TBB defaulted to js off, with a front panel button to turn it on.
Disabling JS also dramatically improves security. TBB is stuck in a 90s mindset about privacy, as if Firefox exploits were not a dime a dozen. Especially with AI making Firefox exploits more available, we can expect many Tor sites to be actively attacking their visitors.
Tor endpoints are pretty easy to identify (there are plenty of handy databases for that), so using Tor to begin with already increases your uniqueness. If NoScript were set to strictly disallow JavaScript by default, that would reduce the degree to which it increases your signature relative to the baseline of using Tor.
Then we have to account for the simple fact that many, many fingerprinting techniques rely on JavaScript, so taking it out of the picture reduces how much identifying information can be gleaned.
Are we absolutely, positively sure that the tradeoff is worth it? Without a strict, repeatable measurement, I'm highly skeptical that a default of "allow" is a net boon to hiding your identity. I remember the rationale for the switch mostly being "most of the web is broken otherwise, and that's bad."
Every server knows that you're using Tor; we're only talking about whether they can match your traffic to you repeatably, particularly across sessions, which then enables traffic analysis that can lead to complete deanonymisation.
If TBB switched to JS off by default, that signal would be less evident, and fingerprinting would be harder too.
Disabling JavaScript actually greatly increases your fingerprint, as not many users turn it off, so that instantly puts you in a much smaller bucket that you need to be unique within. Yes, not having JS limits a site's options for gathering other details, but it also means much less extra information is needed to make you unique inside that smaller bucket.
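The "smaller bucket" point can be put in numbers: a fingerprint attribute contributes log2(1/fraction) bits of identifying information, so rarer values identify you more. The fraction below is invented purely for illustration:

```python
import math

# Surprisal of a single fingerprint attribute: the rarer the value,
# the more bits of identifying information it contributes.
frac_js_off = 0.02                 # hypothetical share of users with JS off
bits = math.log2(1 / frac_js_off)  # ~5.6 bits from this one signal
print(f"{bits:.1f} bits")
```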
Tor Browser also doesn't spoof navigator.platform at all for some reason, so sites can still see when you use Linux, even if the User-Agent is spoofing Windows.
> increases your fingerprint as not many users turn it off
We're talking about users of the Tor browser, and I'd be very surprised if this was the case (that a majority keep JS turned on)
Basically every Tor guide (heh) tells you to turn it off, because it's a huge vector for all types of attacks. Most onion sites have captcha systems that work without JS too, which would indicate that they expect a majority to have it disabled.
> Disabling JavaScript actually greatly increases your fingerprint as not many users turn it off, so that instantly puts you in a much smaller bucket that you need to be unique in.
I've heard a handful of people say this, but are there examples of what I imagine would have to be server-side fingerprinting, and how granular is it? Most fingerprinting I'm aware of is client-side, running via JS. I'd expect server-side checks to be limited to things like which resources a particular user didn't load, plus whatever is normally available in server logs anyway; that could narrow the pool, but I wonder how effective it is for tracking uniqueness across sites.
In addition to server-side bits like IP address, request headers and TLS/TCP fingerprints, there are some client-side things you can do, such as media queries, either via CSS styles or via elements that support them directly, like <picture>. You can get things like the installed fonts, screen size/type, or platform/browser-specific identifiers.
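A minimal sketch of the media-query trick: generate CSS where each matching query triggers a fetch of a distinct URL, then log those fetches server-side. The probe set, selectors and /log/ endpoint here are all hypothetical, and the page would need matching dummy elements:

```python
# Each @media rule only applies when its query matches, and browsers only
# fetch a background url() when the rule is actually used -- so the
# server's access log reveals which queries matched, no JavaScript needed.
probes = {
    "wide":  "(min-width: 1920px)",
    "hidpi": "(min-resolution: 2dppx)",
    "dark":  "(prefers-color-scheme: dark)",
}
css = "\n".join(
    f'@media {q} {{ #probe-{name} {{ background-image: url("/log/{name}"); }} }}'
    for name, q in probes.items()
)
print(css)
```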
I have my problems with that argument. Yes, fewer identifying bits means a smaller bucket, but for the trackers it also means more uncertainty, doesn't it? So when just a few others without JS join your bucket, e.g. via a VPN, profiling should become harder.