
Whenever I see talk about Intel's FPGA unit, I link back to an invention I submitted to Intel while I was an intern there [0]. I went through the patent pipeline, but to my knowledge they never did anything with it. This was during the excitement of Intel's original acquisition of Altera.

In fairness, I never mocked up a faithful enough implementation in Verilog to get an idea of real-world speedup, and even now, I'm not sure exactly which operations would see real gains from small reconfigurable fabrics near the CPU. Still, I liked the elegance of having L1-L3+ FPGAs for speeding up operations of increasing levels of complexity, and I figured programmers smarter than me would find creative ways of using the FPGAs with the added instructions.
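To make the tiered idea concrete, here's a toy dispatch model I could sketch (this is NOT from the patent; the tier names, latencies, and capacity rule are all invented placeholders): pick the smallest fabric tier that fits an operation, falling back to software when nothing does.

```python
# Toy sketch: dispatching operations to hierarchical reconfigurable
# fabrics, by analogy with L1-L3 caches. All numbers are made up.

# Hypothetical (tier, cycle-latency) pairs: small/fast fabric near the
# core, larger/slower fabrics further out.
FABRIC_TIERS = [
    ("F1", 4),    # tiny fabric next to the core: simple ops
    ("F2", 12),   # mid-size fabric: moderately complex ops
    ("F3", 40),   # large fabric: complex, coarse-grained ops
]

def dispatch(op_complexity: int) -> tuple:
    """Pick the smallest fabric tier whose capacity covers the op.

    op_complexity is an abstract "gate count" for the operation; each
    tier is assumed (arbitrarily) to fit ops up to 10x the previous one.
    """
    capacity = 100
    for name, latency in FABRIC_TIERS:
        if op_complexity <= capacity:
            return name, latency
        capacity *= 10
    return "CPU", 1  # too big for any fabric: fall back to software

print(dispatch(50))     # small op lands on the nearest tier
print(dispatch(5000))   # complex op lands on the outer tier
```

Obviously a real design has to answer the hard questions this skips entirely: reconfiguration cost, coherence with the caches, and how the added instructions name a fabric.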

[0] https://patents.google.com/patent/US10310868B2/


Thanks for sharing. Small question about Image 20: does that represent a use case for an instruction translator? For example, you have an ARM chip and you want to run x86 code, so you offload the x86 instructions to the FPGA?


I believe my contributions start at Image 25 on Google. Images 1-24 are generic CPU boilerplate images that the lawyers add to most patents in the field.


Isn't it silly to link a throwaway account to a patent you invented? :)


A bit unorthodox to have a public throwaway and an anonymous-ish main account, I suppose ;)


I mean, if you, the original author, didn't even bother implementing and testing the idea, why would you expect anyone else to? I personally wouldn't touch such a project with a thousand-foot pole.


Having worked in both software and hardware, I understand and appreciate this sentiment in software.

But most software engineers don't understand the amount of time and compute that would be required to mock up a novel, yet sufficiently complex as to be realistic, CPU. I did indeed have a basic HDL implementation, as is required for most patents in the US (reduction to practice). But to implement it fully enough to understand what kind of performance changes you could get in a modern CPU... it's safe to estimate it as an order of magnitude harder than the most complex pure software project you've ever built alone, and well beyond the resources of an intern who's just doing this in his spare time. Software is a great gig, because it's easy and you can do it all on your laptop.

And I'm sure computer engineers like myself don't appreciate the difficulty of manufacturing true mechanical hardware at companies like Tesla, where a team and a budget would be even more essential to build anything useful.

Anyway, the technical committee thought it was worth patenting, and the idea itself is pretty digestible without it working on production silicon, but I have no idea who comes up with product roadmaps at Intel. It was probably buried in the patent graveyard, or maybe someone qualified actually looked at it and realized it wouldn't work.


I'm actually a hardware engineer. I disagree that such a thing is only important in software. You can easily validate any specific idea using cycle-accurate simulators. If your internship didn't leave enough time, then fine, but I would expect you to continue working on the apparently golden idea you're sitting on. Nowadays there are a ton of pretty good soft cores to choose from to test out an idea.
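The kind of quick validation being suggested might start even cheaper than a cycle-accurate simulator: a back-of-envelope cycle model to see whether the offload could ever win. A sketch (all latencies are invented placeholders, not measurements):

```python
# Back-of-envelope sketch (not a real cycle-accurate simulator):
# compare estimated cycles for a loop executed in software vs.
# offloaded to a hypothetical near-cache fabric.

def software_cycles(n_elems: int, cycles_per_elem: int = 6) -> int:
    """Scalar software loop: fixed cost per element."""
    return n_elems * cycles_per_elem

def fabric_cycles(n_elems: int, setup: int = 200, throughput: int = 1) -> int:
    """Offload: pay a reconfiguration/setup cost, then stream elements."""
    return setup + n_elems * throughput

# Break-even: the offload only wins above some working-set size.
for n in (10, 100, 1000):
    sw, fb = software_cycles(n), fabric_cycles(n)
    winner = "fabric" if fb < sw else "software"
    print(f"n={n:4d}  software={sw:5d}  fabric={fb:5d}  winner={winner}")
```

If a model like this says the break-even point is implausibly large, there's no need to fire up gem5 or an RTL sim at all.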

If the original author of a patent didn't find it worthwhile to pursue the idea, for whatever reason, I have very little faith in the idea to begin with. Taken to the extreme, it's like saying I can turn lead into gold, and then never actually making any money that way.


> I'm actually a hardware engineer. <...> You can easily validate any specific idea using cycle accurate simulators

If anything, I'm less convinced than before that you understand how complex a modern CPU is. Perhaps you can re-read my comment; I essentially did what you are suggesting, but that wasn't enough, in my opinion, to glean any information about benchmarks in the real world. It took hundreds of hours. To make it slightly more realistic but still woefully inaccurate using licensable IPs and my own time/money, without the company's buy-in, seems absurd even to the most stubborn HN commenter. The fact that we are modifying the L1/L2/L3 structure (perhaps you also didn't read the patent before commenting) makes this essentially re-designing large parts of the CPU from the ground up, at least as far as I understand licensable core IP.

> I would expect you to continue working on the apparently golden idea you're sitting on

Why? No one ever said it was the golden idea. Only that I found it interesting. Apologies if my optimistic curiosity triggered your defenses.

Regardless, your statement makes no sense for other reasons. Perhaps there's a cultural gap (are you not American?). Here are some relevant points about US companies and patents I hope I can help you understand:

1. Big US corporations will patent anything novel, technical, and remotely related to the business as a means of protecting themselves against litigation. Most of these patents only see an initial technical committee, lawyers, and then are never seen again.

2. Inventors have zero individual rights to their patents when patented via company channels (in exchange, the company pays for the lawyers and usually grants a tiny stipend).

3. Interns typically do not influence how research budgets are allocated.


They have the patents to do even better than that and create hierarchies of miniature programmable fabric, similar in concept to L1-L4 caches except for FPGA designs [0]. Unfortunately, they don't have the organizational willpower to do true innovation. From what I have heard, significant portions of the designs for their processor lines are not understood by any current or recent employees. There is a tremendous amount of legacy "code" and fear of changing things that may break backwards compatibility. There is no vision at the top, their process lead is gone, and the architecture team is patching decades of bad security without simplifying designs.

It seems like they are just riding out their market share for as long as they can, which could be a while. Intel has a really strong brand.

[0] https://patents.google.com/patent/US10310868B2/


There needs to be a marrying of software and hardware -- and processor instructions to support that. This is a hierarchical, reconfigurable, cache-based architecture I designed a while ago at Intel; I doubt they have touched it, but they do own it.

https://www.google.com/patents/US20170153892


Just a thought - I suppose you wanted to post with a throwaway account, but claiming to be the inventor of the patent gives your identity away


Sure, but this way his main account is not linked to his name.


Otoh, he could have just written "I've found this patent" and nobody would have known he was the author.


But then no one would know the author was referencing his own patent, and thus knowledgeable about the subject, if one wanted to converse about it.


It's still terrible opsec. Of course, as always, it depends on the threat model. Still, linkage and contamination are deadly -- why give up more information than absolutely necessary for your adversar[y|ies] to use against you in the future? All it takes is one mistake or a few 'almost mistakes'.


What does it have to do with opsec?


Transmeta code-morphing? VLIW?

