FailureAnalysisBlog.com

Failure Analysis Blog

This blog is for writers who want to share articles and conversation on failure analysis topics. We welcome guest bloggers; see the About link above for more information on guest blogging for us. We welcome articles on topics of interest related to failure analysis, including electronics and semiconductor failure analysis, integrated circuit (IC) and printed circuit board (PCB) failure analysis, materials failure analysis, and any other engineering-related topic. If you'd like to post your thoughts, please register. You can begin writing immediately.

Electronic packages undergo extensive failure analysis processing when they are found to be defective. The findings can echo all the way back to the design stage of the integrated circuit itself. The meticulousness and care with which failure analysis procedures are carried out are reflected in the extremely low failure rates of the electronic devices we use every day. Most of the time when one of our devices malfunctions, the cause is a mechanical failure rather than the chip itself, even though the circuitry at the heart of the chip is thousands of times more complex.

Failure analysis procedures almost invariably end with opening the chip in one way or another to obtain visual confirmation of the defect. This procedure is known as decapsulation, or decapping, and is accomplished through a variety of means. While failure analysis techniques generally fall into destructive and nondestructive categories, the end result is almost always the same: without being able to see the defect for yourself, it's impossible to be sure of the cause. Techniques like acoustic microscopy and emission microscopy can point to what went wrong, but decapsulation remains essential for drawing firm conclusions about the sample.

Electronic circuits come in all shapes, sizes, and degrees of sophistication. They can range from being part of a supercomputer cluster to sitting inside your air-conditioner remote control. In all uses great and small, they play a key role in enabling the functionality we take for granted every day. Fundamental to all of these uses is the concept of the integrated circuit. What used to take several rooms to accomplish can now be done on a microscopic scale. It's hardly an exaggeration to place the development of the integrated circuit in the same class as the steam engine. Over the years, chips have become smaller and smaller while packing in ever higher transistor densities. This has enabled miniaturization on a scale never before imagined.

An integrated circuit is susceptible to multiple points of failure. Each step along the process of design, manufacture, transportation and storage introduces possibilities for error. Whether we are talking about excessive humidity, dust, detachment of the die, or electrical failure, each malfunction requires extensive testing in order to determine the cause. Failure analysis engineers subject a defective electronic package to an extensive battery of tests to ascertain the cause of failure. In order to do this, they have to ensure that the chip is not substantially altered between tests so that the results of subsequent ones are not skewed. This leads to the importance of what is known as Nondestructive Testing or NDT. These procedures preserve the chip for further analysis.

As we've seen before, fluorescence imaging is an important tool in the analysis of integrated circuits. It is a nondestructive procedure that maintains the integrity of the chip and provides valuable information about its composition and the different substances present in it. Other innovative uses involve stimulated emission depletion, which increases image resolution by using a second laser to suppress fluorescence everywhere except the center of a fluorescent spot. The physics of fluorescence is easy to understand: it relies on the spontaneous emission of photons as electrons excited to higher energy levels relax back to their ground state. Of course, not all materials fluoresce well.
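
To make the energy-to-wavelength relationship concrete, here is a minimal Python sketch (not from any particular tool; the 2.3 eV transition energy is a hypothetical example) that converts a transition energy into the wavelength of the emitted photon:

# Minimal sketch: wavelength of a fluorescence photon from a given
# transition energy. The 2.3 eV value below is a hypothetical example.
PLANCK_H = 6.626e-34   # Planck's constant, J*s
LIGHT_SPEED = 2.998e8  # speed of light, m/s
EV_TO_JOULES = 1.602e-19

def emission_wavelength_nm(transition_energy_ev):
    # E = h*c / wavelength, so wavelength = h*c / E
    energy_joules = transition_energy_ev * EV_TO_JOULES
    return PLANCK_H * LIGHT_SPEED / energy_joules * 1e9

print(emission_wavelength_nm(2.3))  # ~539 nm, i.e. green light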

The technology involved in microscopy continued to evolve as the science of quantum mechanics progressed. When we were first bumping up against the limits of optical microscopy imposed by the wavelength of visible light, the notion of electrons having a de Broglie wavelength hadn't yet been developed. As a result, scientists were using workarounds such as ultraviolet light to improve the resolution of images. When it became apparent that particles such as electrons could also behave like waves at the subatomic scale, these phenomena were quickly adapted for use in microscopy. As a result, scanning electron microscopy, along with related procedures such as Transmission Electron Microscopy (TEM), was developed.
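
As a rough illustration of why electron beams beat visible light, the sketch below computes the non-relativistic de Broglie wavelength of an electron accelerated through a given voltage; the 10 kV beam energy is just an assumed example:

import math

PLANCK_H = 6.626e-34         # Planck's constant, J*s
ELECTRON_MASS = 9.109e-31    # kg
ELECTRON_CHARGE = 1.602e-19  # C

def electron_wavelength_pm(accelerating_voltage):
    # Non-relativistic de Broglie wavelength: lambda = h / sqrt(2*m*e*V)
    momentum = math.sqrt(2 * ELECTRON_MASS * ELECTRON_CHARGE * accelerating_voltage)
    return PLANCK_H / momentum * 1e12  # picometres

# A 10 kV beam (assumed example) gives ~12 pm, versus ~550,000 pm
# for green visible light.
print(electron_wavelength_pm(10_000))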

EMMI, or Emission Microscopy analysis, is the process of detecting electromagnetic emissions from malfunctioning electronic circuits. Every object emits light, whether it is electronic or not, and electronic circuits, by virtue of operating at higher temperatures than their surroundings, give off more than usual. If you use a laptop, just place your hand near the fan vent and feel how hot the exhaust gets. Electronic circuits are always designed with heatsinks in mind to carry away excess heat. While heat is a nuisance, it can also give us valuable information about the inner workings of an integrated circuit. Many things can go wrong, necessitating failure analysis, and emission microscopy provides an easy and noninvasive way of detecting certain types of errors.
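
As a back-of-the-envelope illustration of where that thermal emission lands in the spectrum, the sketch below applies Wien's displacement law to a hypothetical hot-spot temperature (the 85 C figure is an assumption, not a measured value):

WIEN_CONSTANT = 2.898e-3  # Wien's displacement constant, m*K

def peak_emission_wavelength_um(temp_celsius):
    # Wien's displacement law: lambda_peak = b / T
    temp_kelvin = temp_celsius + 273.15
    return WIEN_CONSTANT / temp_kelvin * 1e6  # micrometres

# A hypothetical 85 C hot spot peaks near 8 um, deep in the infrared.
print(peak_emission_wavelength_um(85.0))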

Failure analysis engineers are never satisfied in their quest to obtain more precise measurements of a given sample of an integrated circuit. As we have seen before, the scanning electron microscope represents a tremendous improvement over traditional optical instruments because it works around the limitations imposed by the wavelength of visible light. Other procedures, such as field emission microscopy and stimulated emission depletion, serve as additional methods for obtaining close-up images. Another technique, known as Atomic Force Microscopy (AFM), represents a major advance in imaging, allowing us to obtain resolutions down to fractions of a nanometer. Beyond this, AFM has several advantages over a traditional electron microscope.
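
At its core, AFM infers the tip-sample force from the deflection of a tiny cantilever via Hooke's law. Here is a minimal sketch of that relationship, with a hypothetical spring constant and deflection:

def tip_force_nanonewtons(spring_constant, deflection_nm):
    # Hooke's law: F = k * x, with k in N/m and x converted to metres
    deflection_m = deflection_nm * 1e-9
    return spring_constant * deflection_m * 1e9  # nanonewtons

# A 0.1 N/m cantilever (hypothetical) deflected by 2 nm implies ~0.2 nN.
print(tip_force_nanonewtons(0.1, 2.0))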

With all of the various failure analysis processes such as infrared thermography, scanning electron microscopy, field emission microscopy, and acoustics, it's important to develop a formalized process for isolating the root cause of failure. So far we have examined each of these technologies in detail. In today's article we look at the overall flowchart of the failure analysis process and its goal of improving the manufacturing, transportation, and storage processes that ultimately led to the failure in the first place. This requires us to carefully select issues that have a realistic chance of being resolved by root cause analysis. The end result is a Pareto ranking of the various defects, leading to a structured approach.
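
As a small illustration of the Pareto step, the sketch below ranks a set of hypothetical defect tallies by frequency and reports each one's cumulative share of all failures (the categories and counts are invented for the example):

from collections import Counter

# Hypothetical defect tallies; the categories and counts are invented.
defect_counts = Counter({
    "delamination": 42,
    "die-attach void": 27,
    "wire-bond lift": 13,
    "corrosion": 5,
    "ESD damage": 3,
})

total = sum(defect_counts.values())
cumulative = 0
# Rank defects by frequency and report each one's running share.
for defect, count in defect_counts.most_common():
    cumulative += count
    print(f"{defect:16s} {count:3d}  {100 * cumulative / total:5.1f}% cumulative")
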
We use a bewildering variety of electronic components every day. Most of these depend critically upon the smooth operation of integrated circuits in one form or another. Whether we're talking about our powerful personal computers or the humble grinder in our kitchen, ICs are at the heart of their operations.

What's truly amazing is the resilience of these devices. Most of the time, the weakest link in their lifespan is the breakdown of some physical component. The actual chips themselves rarely, if ever, malfunction, which is quite a miracle when you think of the millions of electronic packages churned out by factories every year.

Failure analysis of defective chips is at the heart of this low rate of error. A plethora of techniques is used to probe and isolate flaws, and the lessons learned are traced right back to their origin, sometimes to the design process itself!

The choice of failure analysis technique depends on the type of fault we suspect in the electronic package. Each procedure specializes in detecting a particular category of flaw, and a process that fits the unveiling of one might be completely unsuitable for another. For example, electrical defects are most easily and efficiently detected by infrared thermography, thanks to the heat signatures of various types of malfunctions. Similarly, problems that can be verified visually by a close examination of surface features are best uncovered by a scanning electron microscope.

What do you do, however, when you suspect a structural defect buried deep within the chip that may be too tiny for the eye to detect? We do have procedures like X-ray analysis, but these don't give the detailed level of information that we're looking for. For this kind of flaw, we need to use acoustics.

Sound waves are unique in that they use the substance of the chip itself to propagate, unlike, say, the thermal waves employed in lock-in thermography. We can also modulate the frequency of sound waves with a high degree of accuracy. This means that any variations in the material show up readily with the right type of acoustic equipment.
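
Since resolution is tied to wavelength, a quick way to see what a given frequency buys you is to divide the speed of sound in the material by the frequency. The sketch below does exactly that; the material velocities are approximate textbook values, not calibrated figures:

# Approximate longitudinal sound speeds, m/s (textbook-order values).
SOUND_SPEEDS = {
    "water": 1480.0,    # common coupling fluid
    "silicon": 8430.0,
}

def acoustic_wavelength_um(material, frequency_mhz):
    # wavelength = velocity / frequency
    return SOUND_SPEEDS[material] / (frequency_mhz * 1e6) * 1e6

# At 100 MHz, the wavelength in silicon is roughly 84 um; higher
# frequencies mean shorter wavelengths and finer resolution.
print(acoustic_wavelength_um("silicon", 100.0))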

This approach also has the advantage of preserving the chip intact. As a nondestructive testing (NDT) method, acoustic failure analysis is an indispensable tool in the arsenal of engineers looking to obtain the best confirmation of a flaw before committing to a destructive procedure that will render the chip useless for further testing.

One type of defect that lends itself to detection via acoustics is the void. Voids are tiny inconsistencies in the semiconducting material that resemble "holes" or gaps deep within the integrated circuit. Given the extreme precision required for a chip to function smoothly, these voids cause all kinds of malfunctions. If they're tiny, they don't show up in X-ray scans, and we need acoustics to pin them down.

Another type of defect frequently found in chips is delamination, the separation of bonded layers, such as a die lifting away from the material it is attached to. Microscopic cracks are another flaw optimally detected using sound waves.

Like other waves, sound is reflected when it hits any kind of change in the propagation material. By measuring these reflections, we can create a "map" of the area under observation to a high degree of accuracy. It's a more sophisticated version of sonar, a principle used in applications ranging from shipping to pictures of unborn babies!
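
The "map" comes from simple pulse-echo arithmetic: an interface's depth is half the round-trip echo time multiplied by the speed of sound. A minimal sketch, assuming a representative sound speed for mold compound:

def reflector_depth_um(echo_time_ns, sound_speed=3000.0):
    # Depth = (velocity * round-trip time) / 2; sound_speed is an
    # assumed representative value for mold compound, in m/s.
    round_trip_m = sound_speed * echo_time_ns * 1e-9
    return round_trip_m / 2 * 1e6  # micrometres

# An echo arriving 200 ns after the pulse puts the interface ~300 um deep.
print(reflector_depth_um(200.0))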

Due to the extreme precision required, the experiment needs to be set up very carefully. Air is a poor conductor of sound waves at the frequencies we need, not to mention that it has impurities such as dust floating around in it. Liquids do a much better job, which is why sound carries so much farther underwater. The chip therefore needs to be submerged in a coupling liquid, typically deionized water or alcohol, for the sound waves to propagate optimally. This somewhat limits the application of acoustic techniques to those items that will not suffer irreversible damage from such contact.

Generating the sound wave at the desired frequency is easily accomplished by driving a transducer with an alternating current, converting the electrical oscillation into mechanical vibration. Since we have a great deal of control over the driving current, we have no problem obtaining the high sound frequencies needed.

Acoustic techniques are just one example of the many innovative procedures failure analysis engineers use to detect faulty chips and improve the performance of the electronic circuits around us. It's in large part due to their unstinting efforts that we can place such a high degree of confidence in the electronic components we rely on every day.
