I use Colemak as well. I find needing to type in QWERTY far rarer than you might think. There are probably a couple of times a year I need to, and I can always get away with some quick hunt-and-peck in those rare cases.
Among many other reasons referenced by others, the users who would be willing to fork over a subscription fee to access twitter are exactly the demographic that twitter needs in order to keep its advertising attractive to marketers. A successful subscription option for a service like twitter would greatly diminish the value of their advertising inventory.
Faster, probably, but 2x+ faster like in the PowerPC > Intel transition? Very unlikely. Which is why there's much less likely to be enough performance budget for effective emulation.
If they could just turn up the TDP and clock speed on their chips and get 2x the performance of Intel's best chips that easily, they would already own the entire desktop market.
As someone without a CPU design background, would something like this scale pretty linearly with added power and thermal headroom? I assume there are limits that would have to be overcome, but what would an ARM chip running under the conditions of an i7 look like?
TDP typically scales as somewhere between the cube and fourth power of the clock speed if you're pushing the envelope (in the sense of running at frequencies where further frequency increase also needs a voltage increase). So having 10x the thermal envelope means you can probably clock about twice as fast, all else being equal.
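A quick back-of-the-envelope check of that rule of thumb (the cube and fourth-power exponents come from the comment above, not from any specific chip's data):

```python
# If power scales roughly as the cube to fourth power of clock speed,
# then a 10x thermal budget buys a clock gain of 10^(1/exponent).
for exponent in (3, 4):
    clock_gain = 10 ** (1 / exponent)
    print(f"power ~ f^{exponent}: 10x TDP -> {clock_gain:.2f}x clock")
```

Both exponents land in the 1.8x–2.2x range, which is where the "about twice as fast" figure comes from.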
Computer architectures routinely span about 3x in performance across different power budgets, and that rule of thumb has held for decades.
Clock speeds alone can probably increase by 30%. Caches and internal datapaths can double or more. Then you can start to add in more execution units, more expensive branch prediction, or even new, more power-hungry instructions.
A 4 Watt Intel Pentium 4410Y Kaby Lake for mobile devices gets about 1800 on Geekbench, while a 115 Watt Intel Core i7-7700K Kaby Lake for desktops gets 5600.
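Those two Kaby Lake data points line up surprisingly well with the cube-law intuition from upthread (using only the TDP and Geekbench figures quoted in this thread):

```python
# ~29x the power budget yields only ~3.1x the benchmark score.
tdp_ratio = 115 / 4          # 4 W Pentium 4410Y vs 115 W i7-7700K
perf_ratio = 5600 / 1800     # Geekbench scores quoted above

print(f"power ratio: {tdp_ratio:.1f}x, perf ratio: {perf_ratio:.2f}x")
print(f"cube-law prediction: {tdp_ratio ** (1 / 3):.2f}x")
```

The cube-law prediction (~3.1x) almost exactly matches the observed score ratio, though with only two data points that's suggestive rather than conclusive.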
I'm just going to say it: the Apple laptop CPU is going to get a Geekbench score... above 9000!
So artificial benchmarks already do a very poor job of capturing performance. The Apple laptop CPU does not exist. If it did exist, it would likely suffer a very substantial performance hit if forced to emulate x86 software. So why speculate on the meaningless benchmark numbers of an imaginary CPU that will take a wholly unknown hit unless everyone rewrites everything?
I suspect if Apple designs a desktop CPU, performant x86 emulation will be a key design criterion. I know very little about CPU design, but I imagine it would be possible to have hardware optimisations for x86 emulation, just like we have today for video codecs.
Or going even further, they could bake a "Rosetta" into the chip's microcode and have their CPU natively support the x86 instruction set alongside ARM or whatever they come up with.
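For intuition, the software half of a "Rosetta"-style layer is dynamic binary translation: decode a block of guest instructions once, cache the translation, and reuse it on every later execution. A toy sketch of that idea (the one-byte opcodes and ARM strings here are made-up placeholders; real x86 decoding is vastly more involved):

```python
# Hypothetical one-byte "x86" opcodes mapped to ARM-flavoured strings.
X86_TO_ARM = {
    0x01: "ADD X0, X0, X1",
    0x29: "SUB X0, X0, X1",
    0xC3: "RET",
}

translation_cache = {}  # guest block address -> translated code


def translate_block(addr, x86_bytes):
    """Translate a guest block once; later executions hit the cache."""
    if addr not in translation_cache:
        translation_cache[addr] = [X86_TO_ARM[b] for b in x86_bytes]
    return translation_cache[addr]


print(translate_block(0x1000, [0x01, 0x29, 0xC3]))
```

Hardware support (as speculated above) would attack the expensive parts of exactly this loop: decoding guest instructions and matching the guest's memory-ordering and flag semantics.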
Ok I'm all for a good app store rejection story but this sounds pretty suspect: “any app designed to help people use their phones less is unacceptable for distribution in the App Store.”
...and even once there is a public API, it is certainly against App Store rules to infringe on the trademarks of popular social network companies with the icon you use.
"Up to $650". You plug in your exact model and get a value thats <= $650 based on resale value, similar to value to just selling to a service like Gazelle. Smart marketing though.
Most of the time, when people have a negative response to something, they refrain from leaving an internet comment. That's why everyone loves internet comments /s
My resolution isn't a specific project, but more of a general commitment to spending regular time on side projects.
I've very recently been enjoying using pomodoro[1] cycles to track my productivity at work and I'd like to translate the same behavior to my side projects and commit to getting 4 cycles done per week (~2 hours), and tracking that information to keep myself accountable.
91% could also be pretty misleading, because not all vulnerabilities are equal. It's easy to let 9 potential segfaults or memory-corruption issues get disclosed if you get to hold on to the 1 iOS zero-day/Shellshock-type attack/etc...
You beat me to it, haha. I was going to make the point that the vast majority of bugs found don't do anything significant for a hacker. A program crash or corruption at worst. It wouldn't surprise me if NSA just discloses the ones that hurt availability while weaponizing the few hitting confidentiality or integrity.
Mildly interesting (at least in Chrome on OS X): if you open a new tab on top of it, the Web Audio API stops playing but doesn't seem to stop the timer in the animation, leading to a jump and some interesting audio for a beat when you open the tab back up.