Hacker News | drham's comments

I use Colemak as well. I find it way rarer than you might think that you have to type in QWERTY. There are probably a couple of times a year I need to, and I can always get away with some quick hunt-and-peck in those rare cases.


Among many other reasons referenced by others, the users who would be willing to fork over a subscription fee to access Twitter are exactly the demographic that Twitter needs in order to keep its advertising attractive to marketers. A successful subscription option for a service like Twitter would greatly diminish the value of their advertising inventory.


Faster, probably, but 2x+ faster like in the PowerPC → Intel transition is very unlikely. Which is why there's much less likely to be enough performance budget for effective emulation.


Presumably their "desktop-grade" A-whatever chip will have a higher TDP and clock speed than their mobile counterparts.


If they could just turn up the TDP and clock speed on their chips and get 2x the performance of Intel's best chips that easily, they would already own the entire desktop market.


And some of us, with actual CPU design backgrounds, have been saying that for a while now.

We've been asking why Apple doesn't already do it.


As someone without a CPU design background, would something like this scale pretty linearly with added power and thermal headroom? I assume there are limits that would have to be overcome, but what would an ARM chip in the conditions of an i7 look like?


TDP typically scales as somewhere between the cube and fourth power of the clock speed if you're pushing the envelope (in the sense of running at frequencies where further frequency increase also needs a voltage increase). So having 10x the thermal envelope means you can probably clock about twice as fast, all else being equal.
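A quick sketch of the arithmetic behind that claim, assuming the stated power law TDP ∝ f^n with n between 3 and 4 (the exponents are the parent comment's estimates, not measured values):

```python
# If TDP scales as frequency^n in the voltage-limited regime, then a given
# power-budget multiplier buys a clock multiplier of power_ratio^(1/n).
def clock_multiplier(power_ratio: float, exponent: float) -> float:
    """Frequency multiplier available from a given power multiplier."""
    return power_ratio ** (1.0 / exponent)

for n in (3.0, 3.5, 4.0):
    print(f"n={n}: 10x power -> {clock_multiplier(10, n):.2f}x clock")
# n=3.0: 10x power -> 2.15x clock
# n=3.5: 10x power -> 1.93x clock
# n=4.0: 10x power -> 1.78x clock
```

So across the whole cube-to-fourth-power range, a 10x thermal envelope lands near the "about twice as fast" figure above.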


It's more of a logarithmic scale.

The same architecture can generally scale to 10x over a few process generations.


This does not hold at 10nm and below.


Seems that it’s more of a logistical and network-effect issue in getting everyone to support it, rather than a technical issue.

Possibly a patent issue also


My only question is if they actually use ARM for their next-generation architecture, or something completely new...


Computer architectures routinely see 3x performance jumps across different power budgets. This rule has held over decades.

Clock speeds alone can probably increase by 30%. Caches and internal datapaths can double or more. Then you can start to add in more execution units, more expensive branch prediction, or even new, more power-hungry instructions.

A 4 Watt Intel Pentium 4410Y Kaby Lake for mobile devices gets about 1800 on Geekbench, while a 115 Watt Intel Core i7-7700K Kaby Lake for desktops gets 5600.
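A back-of-the-envelope check on those two figures (the TDP and Geekbench numbers are the ones quoted above, not re-measured here) shows how they line up with the ~3x rule of thumb:

```python
# Same Kaby Lake generation, two very different power budgets.
pentium = {"tdp_w": 4, "geekbench": 1800}    # Pentium 4410Y (mobile)
i7 = {"tdp_w": 115, "geekbench": 5600}       # Core i7-7700K (desktop)

power_ratio = i7["tdp_w"] / pentium["tdp_w"]         # ~28.8x the power
perf_ratio = i7["geekbench"] / pentium["geekbench"]  # ~3.1x the score
print(f"{power_ratio:.1f}x the power for {perf_ratio:.1f}x the score")
# 28.8x the power for 3.1x the score
```

In other words, nearly 29x the power budget buys roughly a 3x score on the same architecture, which is the kind of jump the "3x across power budgets" claim describes.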

I'm just going to say it: the Apple laptop CPU is going to get a Geekbench score... above 9000!

And, yes, I do have a CPU design background.


So artificial benchmarks already do a very poor job of capturing performance. The Apple laptop CPU does not exist. If it did exist, it would likely suffer a very substantial performance hit if forced to emulate x86 software. So why speculate on the meaningless benchmark numbers of an imaginary CPU that will take a wholly unknown hit unless everyone rewrites everything?


Artificial benchmarks do a great job of capturing performance, since they're more controlled and eliminate unnecessary variables.

Once you understand this, then you can understand how CPU designers work to predict future performance. CPU designers use artificial testbenches.


You making up numbers doesn't appear to be a useful endeavor.


I suspect if Apple designs a desktop CPU, performant x86 emulation will be a key design criterion. I know very little about CPU design, but I imagine it would be possible to have hardware optimisations for x86 emulation, just like we have today for video codecs.

Or even further they could bake a "rosetta" into the chip's microcode and have their CPU natively support the x86 instruction set along with ARM or whatever they come up with.


Which is the previous gen and was sandbagged, as the 50% increase in cores for Coffee Lake shows.


If someone disagrees, they should state why.


Do you think they will use the same chip? In 2020?


Ok I'm all for a good app store rejection story but this sounds pretty suspect: “any app designed to help people use their phones less is unacceptable for distribution in the App Store.”

It's way more likely because there isn't a public API for changing the application icon on iOS (at least until 10.3 https://developer.apple.com/reference/uikit/uiapplication/28...)

...and even once there is a public API, it is certainly against App Store rules to infringe on the trademarks of popular social network companies for what you use as your icon.


"Up to $650". You plug in your exact model and get a value that's <= $650 based on resale value, similar to the value of just selling to a service like Gazelle. Smart marketing though.


Most of the time when people have a negative response to something, they refrain from leaving an internet comment; that's why everyone loves internet comments /s


My resolution isn't a specific project, but more of a general commitment to spending regular time on side projects.

I've very recently been enjoying using pomodoro[1] cycles to track my productivity at work and I'd like to translate the same behavior to my side projects and commit to getting 4 cycles done per week (~2 hours), and tracking that information to keep myself accountable.

[1] https://en.wikipedia.org/wiki/Pomodoro_Technique


I'm pretty naive on this topic, but wouldn't adding an artificial delay only create more opportunities for arbitrage vs other exchanges?


91% could also be pretty misleading, because not all vulnerabilities are equal. It's easy to let 9 potential segfaults or memory corruption issues get disclosed if you get to hold on to the 1 iOS zero day/Shellshock-type attack/etc...


You beat me to it, haha. I was going to make the point that the vast majority of bugs found don't do anything significant for a hacker. A program crash or corruption at worst. It wouldn't surprise me if NSA just discloses the ones that hurt availability while weaponizing the few hitting confidentiality or integrity.


Yep. Think of it as creaming off the top 9%.


Mildly interesting (at least in Chrome on OS X): if you open a new tab on top of it, the Web Audio API stops playing but doesn't seem to stop the timer in the animation, leading to a jump and some interesting audio for a beat when you open the tab back up.

