Introduction to Stochastics

Overview


Variability is the enemy of semiconductor manufacturing. Variations in the structures being made across the chip, across the wafer, and wafer-to-wafer can reduce the performance, yield, and reliability of the chips. Historically, these variations have been “global,” with systematic process errors caused by such things as wafer flatness or hot plate uniformity occurring over length scales of millimeters. Poor yield near the edge of the wafer is one common outcome.

But as the feature sizes of semiconductor devices continue to shrink in the latest nodes, a new type of variability has emerged that is negatively affecting device yield, reliability and performance – stochastics. Stochastics are the random variations in the patterning process that are inherent as dimensions approach the atomic level. Unlike global variability, stochastics act at the “local” level, where patterned features near each other can have significantly different dimensions, reducing yield and causing device performance to vary.

In previous generations of devices, stochastic variability did not significantly impact device yield or performance. But at the latest generation nodes, this local random variability can now comprise more than 50% of certain types of manufacturing error that directly impact the devices. Today, uncontrolled stochastic variability can cost manufacturers hundreds of millions of dollars per fab in lost yield and delayed ramps. These variations, once negligible, now determine the viability of advanced nodes like 2nm and beyond.

Therefore, it is now critical for device makers to optimize and control stochastics, and doing so requires a different set of tools that focus on the probabilistic nature of stochastics.

Stochastics are a growing yield problem and, at EUV, can be responsible for more than 50% of the total patterning error budget.

Figure: Stochastics as a percentage of the fab EPE error budget

Types of Stochastic Effects


In semiconductor manufacturing, there are four types of stochastic effects (a short numerical sketch of how the roughness and uniformity metrics are typically quantified appears after the list):

Line Edge or Line Width Roughness (LER/LWR) – The edges of a transistor or other critical feature are not smooth. This can affect gate leakage, wire resistance, chip power and reliability.

Figure 1 Line Edge Roughness creating local critical dimension variability

Local Critical Dimension Uniformity (LCDU) – Devices near each other have different critical dimensions. This can affect yield and chip speed.

Local Edge Placement Error (EPE) – Edges sit in random positions, potentially causing shorts or open connections. This affects yield and reliability.

Figure 2 Edge Placement Error

Stochastic Defects – Chip features have bridges between or breaks in lines, and missing or merged contact holes. These defects affect yield and reliability.

Figure 3 Missing Contact Holes
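These roughness and uniformity metrics are typically reported as three-sigma values of the underlying distributions: LER from the deviations of a single edge, LWR from the local linewidth, and LCDU from the CDs of nearby, nominally identical features. The following is a minimal numerical sketch of those definitions; the data are synthetic and the noise levels are illustrative assumptions, not measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: 200 sample points along each edge of one line, plus
# local CDs for 50 nominally identical contact holes (all values in nm).
n_points, nominal_cd = 200, 20.0
left_edge = rng.normal(0.0, 0.8, n_points)            # left-edge positions
right_edge = rng.normal(nominal_cd, 0.8, n_points)    # right-edge positions

# Line Edge Roughness (LER): 3-sigma of one edge's deviation from its mean.
ler = 3.0 * np.std(left_edge, ddof=1)

# Line Width Roughness (LWR): 3-sigma of the local linewidth along the line.
lwr = 3.0 * np.std(right_edge - left_edge, ddof=1)

# Local CD Uniformity (LCDU): 3-sigma of the CDs of nearby, nominally
# identical features measured in one field of view.
local_cds = rng.normal(nominal_cd, 0.7, 50)
lcdu = 3.0 * np.std(local_cds, ddof=1)

print(f"LER = {ler:.2f} nm, LWR = {lwr:.2f} nm, LCDU = {lcdu:.2f} nm")
```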

Why are stochastic effects getting worse?


As an example of why stochastics are getting worse at the latest process nodes, let’s consider the lithography process. In semiconductor lithography, a scanner uses light to expose patterns in photoresist; development then removes the unwanted resist, and etch transfers the pattern into the underlying film to create features of the target size.

In older generation nodes where device features were relatively large, all the features printed near each other could be assumed to be the same size. This is because the random variability, and therefore the local variation, of the process was relatively small. For example, the stochastic variability of a 100nm feature is usually only 2 to 3 percent of the feature size.
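To see why this matters as features shrink, a quick back-of-the-envelope comparison helps: the same few nanometers of local 3-sigma variability that is negligible on a 100 nm feature becomes a double-digit percentage of a 20 nm feature. The numbers below are assumed for illustration only, consistent with the 2 to 3 percent figure above, not measured process data.

```python
# Illustrative arithmetic only: an assumed ~2.5 nm of 3-sigma local CD
# variability, held constant while the feature size shrinks.
variability_nm = 2.5
for feature_nm in (100.0, 50.0, 20.0):
    relative = 100.0 * variability_nm / feature_nm
    print(f"{feature_nm:5.0f} nm feature: {variability_nm} nm variability "
          f"= {relative:.1f}% of the feature size")
```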

This small effect for larger feature sizes allowed device makers to mostly ignore the stochastic variability in the manufacturing process and still successfully ramp their manufacturing to high yields. To achieve this success, device makers have continued to rely on predictive process models, measurement tools that report mean values, and design rules that all treat the light, resist, and etch processes as consistent, well-behaved entities. This methodology is called deterministic modeling and has been employed very successfully by the semiconductor industry over several decades.

The industry has now changed.

Figure: Features with LER = 2 nm


Today many device makers use EUV scanners to print the smallest features in their devices. The number of photons from an EUV scanner that expose a volume of photoresist is 1/14th the number of photons from a 193nm scanner (all other things being equal). For EUV processes, two features near each other can be exposed by a significantly different number of photons, a phenomenon called photon shot noise. This results in features near each other having different dimensions, an effect that is measured as Local Critical Dimension Uniformity (LCDU).
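The factor of roughly 1/14 follows directly from the photon energies: exposure dose is energy per unit area, and each photon carries E = hc/λ, so a 13.5 nm photon carries about 14 times the energy of a 193 nm photon and the same dose delivers about 14 times fewer photons. A minimal sketch of that calculation is below; the 30 mJ/cm² dose is an assumed, illustrative value.

```python
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photons_per_um2(dose_mj_cm2: float, wavelength_nm: float) -> float:
    """Photon count per square micron for a given dose and wavelength."""
    dose_j_per_um2 = dose_mj_cm2 * 1e-3 / 1e8         # 1 cm^2 = 1e8 um^2
    photon_energy_j = H * C / (wavelength_nm * 1e-9)  # E = h*c/lambda
    return dose_j_per_um2 / photon_energy_j

dose = 30.0  # mJ/cm^2, assumed illustrative dose
n_arf = photons_per_um2(dose, 193.0)   # ArF (193 nm)
n_euv = photons_per_um2(dose, 13.5)    # EUV (13.5 nm)
print(f"ArF: {n_arf:.2e} photons/um^2, EUV: {n_euv:.2e} photons/um^2, "
      f"ratio = {n_arf / n_euv:.1f}x")
```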

To compensate for this effect, we could increase the dose of the EUV tool, which would increase the number of photons in each unit area and reduce the stochastic variability. But increasing the scanner dose has a direct cost: it reduces the throughput of the EUV scanner. Therefore, engineers need to determine the proper tradeoff.
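To make the tradeoff concrete: because photon arrival follows Poisson statistics, the relative shot noise falls only as one over the square root of dose, while exposure time (and so scanner throughput, ignoring overhead) scales roughly linearly with dose. The sketch below uses assumed, normalized numbers to illustrate that diminishing return; it is not a model of any particular scanner.

```python
import math

BASE_DOSE = 30.0  # mJ/cm^2, assumed baseline for normalization

for dose in (30.0, 45.0, 60.0, 90.0):
    # Poisson statistics: relative shot noise ~ 1/sqrt(number of photons),
    # and photon count scales linearly with dose.
    relative_noise = math.sqrt(BASE_DOSE / dose)
    # Crude throughput model: exposure time grows linearly with dose and
    # everything else (wafer handling, stage moves) is ignored.
    relative_throughput = BASE_DOSE / dose
    print(f"dose {dose:4.0f} mJ/cm^2: noise x{relative_noise:.2f}, "
          f"throughput x{relative_throughput:.2f}")
```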

As device feature dimensions have been reduced toward the molecular and atomic levels, the relative size of the random variability is now 10% or more of the feature size and comprises more than half of the variability in the patterning process. And stochastics is not just an EUV phenomenon: when deploying multiple patterning techniques with 193nm immersion scanners, stochastic variability is also a significant contributor to the overall error budget.

At the latest generation nodes, features printed near each other can no longer be assumed to have the same dimensions; there is now a need to accurately optimize and control stochastics.

Photon Shot Noise

Comparing the number of photons absorbed in a given volume for 193 nm light (left) versus 13.5 nm (EUV) light (right) at a constant exposure dose and resist absorption coefficient. Figure from John J. Biafore et al., “Statistical simulation of photoresists at EUV and ArF,” Proc. SPIE 7273, Advances in Resist Materials and Processing Technology XXVI, 727343 (2009).

The need for different measurement and analysis methodologies


Stochastic variability is now a critical source of error in numerous manufacturing steps including mask printing, lithography, etch and deposition. To optimize and control these processes, you first need to be able to measure the stochastic effects with accuracy and precision. After all, you cannot control what you cannot measure.

Importantly, the accurate measurement of stochastics is extremely difficult since the measurement tools themselves (such as a CD-SEM) can introduce measurement errors that are as large as the effect being measured. Therefore, the industry needs specialized measurement and analysis technology that can remove the SEM noise and accurately report the stochastic errors. Measuring and removing this noise is not something traditional methods and tools do sufficiently well.
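One common way to think about this problem (sketched here generically, not as a description of any particular tool’s algorithm) is that, if the SEM noise is uncorrelated with the true roughness, the measured variance is the sum of the true variance and the noise variance. An unbiased roughness estimate can then be obtained by subtracting an independently estimated noise variance in quadrature:

```python
import math

def unbiased_3sigma(measured_3sigma_nm: float, noise_3sigma_nm: float) -> float:
    """Remove SEM noise in quadrature, assuming it is uncorrelated with the
    true roughness: sigma_true^2 = sigma_measured^2 - sigma_noise^2."""
    diff = measured_3sigma_nm**2 - noise_3sigma_nm**2
    if diff < 0:
        raise ValueError("estimated noise exceeds the measured roughness")
    return math.sqrt(diff)

# Assumed, illustrative numbers: the SEM noise term is comparable in size to
# the roughness being measured, so the raw (biased) value overstates it badly.
measured, noise = 4.0, 2.5  # nm, 3-sigma
print(f"biased LWR = {measured:.1f} nm, "
      f"unbiased LWR = {unbiased_3sigma(measured, noise):.1f} nm")
```

In practice the hard part is estimating the noise term itself accurately, which is why specialized measurement and analysis technology is needed.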

Also, to analyze stochastics properly you must employ probabilistic methodologies which are very different from the deterministic methodologies used historically in the industry. Accurate decisions on the impact of stochastics cannot be made by using only the measurements of the mean values.

For example, probabilistic modeling requires accurate error bars to determine the probability of an event occurring. To make this determination, all stochastics measurements need to include accurate error bars that describe the uncertainty of the measurement. But determining accurate error bars for stochastics requires different statistical tools from what has been common in the industry.
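As a simple illustration of what such an error bar involves: even in the textbook case of a standard deviation estimated from n independent, normally distributed CD measurements, the confidence interval comes from a chi-square distribution rather than from the familiar mean-based formulas. The sketch below shows that idealized case only; real stochastics measurements must also account for SEM noise and limited, correlated sampling.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
cds = rng.normal(20.0, 0.7, size=50)   # 50 synthetic local CD measurements (nm)

n = len(cds)
s = np.std(cds, ddof=1)                # sample standard deviation

# Normal-theory result: (n-1) * s^2 / sigma^2 follows a chi-square
# distribution with n-1 degrees of freedom, giving a 95% interval for sigma.
alpha = 0.05
lower = s * np.sqrt((n - 1) / stats.chi2.ppf(1 - alpha / 2, n - 1))
upper = s * np.sqrt((n - 1) / stats.chi2.ppf(alpha / 2, n - 1))
print(f"sigma = {s:.2f} nm, 95% confidence interval = [{lower:.2f}, {upper:.2f}] nm")
```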

When engineers and automated control systems have access to accurate stochastics measurements, they can make informed decisions for each layer in development, reduce the variability more quickly in ramp and control the process in production. In addition, stochastic errors directly impact the optimization of chip designs and the application of OPC. Therefore, there is now a need to use stochastics-aware OPC modeling in addition to stochastics-aware process control.

Traditional analysis tools in the semiconductor industry have focused on deterministic modeling, but to accurately optimize and control stochastics the industry needs a different set of measurement and analysis tools.

Summary

The latest generation nodes in semiconductor manufacturing have significant random variability, called stochastics, that needs to be optimized and controlled. This problem is getting worse with each new generation.

Stochastics have forced fabs to make a trade-off between yield and productivity. For example, by increasing the dose on EUV lithography scanners, fabs can reduce the effects of stochastics and increase yield. But there is also a large cost: a significant reduction in the throughput of their process tools. When fabs accurately control stochastics, they both improve the productivity of their process tools and increase their yields.

The first step in controlling stochastics is using accurate and precise measurement technology. You can’t control what you can’t measure.

That’s where Fractilia comes in. Our industry-leading measurement and control technology has become the de facto standard for stochastics measurement. We help customers optimize and control their processes from R&D through ramp and production. For more information on bridging the stochastics resolution gap, please read our whitepaper, available July 16.