I've always loved the easter eggs and art that some chips have included on their circuits, most of which have never been seen, and never will be seen, by anybody but the designers.
Which reminds me of a story from a retired Bell Labs (Murray Hill) engineer, who related how they would add useless circuits to microprocessors to see whether those sections were duplicated or deleted by the old Soviet military engineers copying them. They were duplicated.
While sulfuric or nitric acid is the best way to do this, you can also decap using a box cutter and some patience. I recently did this to figure out whether an op-amp I bought was fake.
I'm going to try to get set up to decap properly. But for those more concerned about using the required acids, a box cutter can give some insight into what's going on in many devices...
We occasionally used a regular old propane torch at a previous employer, mainly when looking for counterfeit power FETs. It'll burn the bond wires and such off an IC, and it certainly isn't going to result in great-quality images, but it's not too hard to get rid of the plastic without damaging the silicon beyond recognition.
I used to do this as part of R&D efforts at my employer 15 years back. It was dirt cheap to get someone to decap a package like this. You could also get them to put it under an electron (x-ray?) microscope and fly around the chip like an airplane simulation. Then they would use something called FIB (focused ion beam) to cut and draw wires to test out bug fixes. Usually the designers would keep a couple of unused gates at periodic intervals to borrow for fixes.
Bear in mind that humans don't generally get involved with the lower levels of chip design these days; the design is maintained at a fairly high level, with gate placement and routing tending to be driven by genetic algorithms. (This is why, when you look at the dies of new microcontrollers/processors, they tend to look less ordered than dies from the '70s and '80s.)
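To illustrate what "driven by genetic algorithms" can mean for placement, here's a minimal toy sketch in Python: it evolves assignments of gates to grid slots to minimize total Manhattan wirelength. The netlist and every parameter here are made up for illustration; production place-and-route tools are vastly more sophisticated.

    import random

    GRID = 4                      # 4x4 grid of placement slots
    GATES = GRID * GRID           # one gate per slot
    # Made-up netlist: each net is a pair of gate ids that want to be close.
    NETS = [(random.randrange(GATES), random.randrange(GATES)) for _ in range(24)]

    def coords(slot):
        return divmod(slot, GRID)                 # slot index -> (row, col)

    def wirelength(placement):                    # placement[gate] = slot
        total = 0
        for a, b in NETS:
            (ra, ca), (rb, cb) = coords(placement[a]), coords(placement[b])
            total += abs(ra - rb) + abs(ca - cb)  # Manhattan distance
        return total

    def mutate(p):
        p = p[:]
        i, j = random.sample(range(GATES), 2)     # swap two gates' slots
        p[i], p[j] = p[j], p[i]
        return p

    def crossover(p1, p2):
        # Order crossover: keep a slice of p1, fill the rest in p2's order.
        a, b = sorted(random.sample(range(GATES), 2))
        kept = set(p1[a:b])
        rest = iter(s for s in p2 if s not in kept)
        return [p1[i] if a <= i < b else next(rest) for i in range(GATES)]

    population = [random.sample(range(GATES), GATES) for _ in range(50)]
    for generation in range(200):
        population.sort(key=wirelength)           # fittest first
        parents = population[:10]
        population = parents + [mutate(crossover(*random.sample(parents, 2)))
                                for _ in range(40)]
    best = min(population, key=wirelength)
    print("best total wirelength:", wirelength(best))

The "less ordered" look falls out naturally: the survivor placements are whatever happened to score well, not anything a human would lay out by hand.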
For chip design specifically, that's rather tricky. In my experience, it generally requires a graduate EE degree. However, there are a lot of affiliated roles if silicon interests you. With the rise of SoCs and other digitally controlled parts, there is growing demand for supplemental software for validation and configuration. You can also get into an applications engineering role, generally with an undergrad EE or CE degree.
> What would be the career path for a job in chip design and manufacturing?
Very long and very lucky.
I looked into microelectronics when I was a teenager. In 2009, you needed to be at the top of your class just to get an internship with a fab or a fabless design house.
In general, there is more automation each year, on both the design and the fab side.
Companies have the luxury of employing people with 20-year careers at the price of an average software dev in California.
The only exception is, of course, China: their fabs offer six-digit salaries for senior engineers, but the work experience is famously bad.
All of them expect foreign specialists to come in with some "magic trick" solution for whatever part of their business refuses to work, and of course they get pissed off when told that there are no magic tricks in this business.
I diverged from a very similar career path due to layoffs spurred by my employer being acquired. The remaining HW design team was told to promptly relocate to the Bay Area, whereupon they were gradually weeded out by extremely large workloads. Those who made it were then told to promptly relocate to SE Asia. The layoff, in hindsight, seemed like the most appealing option.
That's the market signalling low demand, and a dying field. I'm so glad I didn't get that gig at Altera out of college and got diverted into software by market forces.
Is EE actually the correct field to study for formal verification? I looked into this at one point and it seemed very much within the realm of computer scientists.
For instance, our EE program does not include even basic computability theory, the formal logic and graph theory coverage was pretty minimal, and besides some K-maps there was zero Boolean function theory.
Compare that with the CS program, which contains multiple very in-depth courses on exactly those topics, which seem pretty critical for formal verification of hardware.
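For a feel of what hardware equivalence checking boils down to at its core, here's a toy sketch: prove two combinational functions agree on every input. Both functions are invented for illustration, and real tools use BDDs or SAT solvers rather than enumerating all 2^n input patterns.

    from itertools import product

    def spec(a, b, c):
        # "Golden" reference: 1-bit majority function.
        return (a & b) | (b & c) | (a & c)

    def impl(a, b, c):
        # Made-up "optimized" gate-level version we want to verify.
        return (a & (b | c)) | (b & c)

    mismatches = [bits for bits in product((0, 1), repeat=3)
                  if spec(*bits) != impl(*bits)]
    print("equivalent" if not mismatches else f"counterexamples: {mismatches}")

Scaling that check past brute force is where the computability, logic, and Boolean function theory from the CS curriculum earn their keep.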
You could go the computer engineering route. It's a nice combination of EE and CS classes that will give you electrical theory experience that a CS major won't get, and more experience with digital/software design than an EE will get. Not every university offers it. There is usually a large focus on writing C and Verilog/VHDL in the upper-level classes. If you can't pursue a computer engineering degree at your school, an EE degree with a heavy focus on digital design will get you close. If you can, try to get approval to take CS classes that expose you to more FPGA work, if your CS department offers that. There are a bunch of free online MIT classes for this kind of stuff. One thing that helps: go out and buy a cheap Xilinx or Altera FPGA to start messing around with on your own. Learn how to write fast assembly for MIPS or the 8051, and fast C for RISC-V or ARM processors.
MSCoE here. FPGAs have limited applicability due to their high cost per unit and relatively low clock speeds. They were all the rage ~15 years ago during my grad study, but then flattened out. It does look like they're bouncing back, though, likely due to newfound popularity in mining and cheaper/better product lines coming out. Cheap embedded stuff based on MIPS, ARM, and even mobile processors like Nvidia's Tegra will probably enjoy an order of magnitude broader market footprint, however.
I work in aerospace, where FPGAs are the dominant chip. Everything from basic microprocessors to complex SDRs runs on them. Xilinx is the big one, but there are smaller shops making more application-specific chips.
The EE degree is only there to make you employable as an EE. You'll have to teach yourself formal methods on your own. That's why it's a way to stand out.
"You have to teach yourself on your own" is always a good idea, but it would be an immense waste of time not to use the classes that your educational institution provides. Double majoring exists specifically for this purpose.
Read lots of books, make simple electronic toys. Read some more books, hang out on forums, visit some conferences. Read more, talk to specialists. Understand the bleeding edge in some area. Find an idea there, start a YC startup in chips, gather a team, and dive head-on into the CTO role.
or in regular light...
I recently decapped some old chips that were ceramic packages with a soldered-on metal cap: a 486 DX2 [0] and an old analog-to-digital converter IC [1].
The bare silicon was beautiful once exposed. (Pics posted without filters, though my Huawei's photos are generally more saturated than I normally like.)
[0] https://www.instagram.com/p/BudTdillc-K/
[1] https://www.instagram.com/p/BudTG3Ml2ki/
They are indeed enhanced, but by about 2x, not 10x. At the time of writing I was increasing the saturation on all photos a bit in postprocessing. Now I typically take photographs with RGB illumination, which increases saturation optically (without adding noise), so only the contrast/brightness range needs to be tuned.
The colors are mainly due to thin-film interference caused by varying glass thickness over the chip area.
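As a rough back-of-envelope, assuming the passivation behaves like a simple SiO2 film (n ~ 1.46) over silicon, both interface reflections pick up the same phase flip, so reflected light is reinforced near 2*n*t = m*lambda at normal incidence. A few made-up thicknesses show how the reinforced visible wavelengths, and hence the apparent color, shift with thickness:

    # Which visible wavelengths a SiO2 film over silicon reinforces at
    # near-normal incidence, using 2*n*t = m*lambda. Thicknesses are
    # made up; the point is that the reinforced colors shift with t.
    n_oxide = 1.46                      # refractive index of SiO2
    for t_nm in (500, 750, 1000):       # assumed film thicknesses in nm
        peaks = (2 * n_oxide * t_nm / m for m in range(1, 12))
        visible = [round(lam) for lam in peaks if 380 <= lam <= 740]
        print(f"t = {t_nm} nm -> reinforced wavelengths (nm): {visible}")

A 500 nm film reinforces red and blue, while 1000 nm pulls in yellow-green peaks too, which is why small thickness variations across the die paint it in different colors.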
If we can digitize the entire netlist (schematic of transistors) of the 6502 processor by analyzing this type of photograph, why hasn't someone started analyzing the crappy wifi chips that people who don't run Trisquel use (the kind that require proprietary drivers to work correctly) in order to write free drivers for them?
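For a sense of where that effort starts, here's a minimal sketch of the very first step: segmenting one layer of a die photo into connected regions that become candidate nets. The filename and threshold are hypothetical; real extraction, as in the visual6502 work, aligns multiple layers and identifies transistors where poly crosses diffusion, which is the laborious part.

    import numpy as np
    from PIL import Image
    from scipy import ndimage

    # "metal_layer.png" is a hypothetical top-metal die photo.
    img = np.asarray(Image.open("metal_layer.png").convert("L"))
    metal = img > 128                         # crude threshold: bright = metal
    labels, n_regions = ndimage.label(metal)  # connected regions = candidate nets
    sizes = ndimage.sum(metal, labels, index=range(1, n_regions + 1))
    print(f"{n_regions} connected metal regions (candidate nets)")
    print("largest region, in pixels:", int(sizes.max()))

The answer to "why hasn't someone" is mostly scale: a 6502 has ~3500 transistors on two-ish interconnect layers, while a modern wifi SoC has orders of magnitude more on many layers at feature sizes you can't resolve optically.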
As an aside, I find chips that have been covered in epoxy really annoying. Not only is the epoxy really difficult to remove, but even before you get that far, such chips lack the typical markings on their casings that identify them and tell you whether they're interesting to pursue...
Very interesting technique. I remember when they would reverse engineer chips by grinding them down while taking nice pictures of each layer for the engineers to interpret later.
At a prior job last decade involving FCBGA packaging design for the company's ASICs, I witnessed exactly this used to debug ESD problems and the like. That is, prototype dice failing test would be ground down until a black spot was found. Design cycles were very expensive, so there was a lot of incentive to model such issues extensively before fab.
Here are some examples:
http://smithsonianchips.si.edu/chipfun/graff.htm
https://en.wikipedia.org/wiki/Chip_art
My favorite has always been Wally/Waldo from Where's Wally.