Semiconductors

A semiconductor is a material with electrical conductivity between that of a conductor and an insulator. Semiconductors are used in a wide range of electronic devices, including transistors, diodes and integrated circuits (ICs). The most commonly used semiconductor material is silicon.
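
That "in between" spans many orders of magnitude. As a rough illustration, the short Python sketch below compares textbook order-of-magnitude conductivities; the values are common reference figures, not data from this article.

```python
# Rough room-temperature conductivities in S/m (textbook orders of
# magnitude, for illustration only). Silicon's conductivity also shifts
# by many orders of magnitude depending on how heavily it is doped.
CONDUCTIVITY_S_PER_M = {
    "copper (conductor)":    6e7,
    "heavily doped silicon": 1e3,
    "intrinsic silicon":     4e-4,
    "glass (insulator)":     1e-12,
}

for material, sigma in CONDUCTIVITY_S_PER_M.items():
    print(f"{material:>22}: {sigma:.0e} S/m")
```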

Reasserting U.S. leadership in microelectronics, semiconductors

MIT researchers lay out a strategy for how universities can help the U.S. regain its former status as a major power in semiconductor and microchip production.

The global semiconductor shortage has grabbed headlines and caused a cascade of production bottlenecks that have driven up prices on all sorts of consumer goods, from refrigerators to SUVs. The chip shortage has thrown into sharp relief the critical role semiconductors play in many aspects of everyday life.

Years before the pandemic-induced shortage took hold, the United States was already facing a growing chip crisis. Its longstanding dominance in microelectronics innovation and manufacturing has been eroding over the past several decades in the face of stepped-up international competition. Now, reasserting U.S. leadership in microelectronics has become a priority for both industry and government, not just for economic reasons but also as a matter of national security.

A group of MIT researchers argue in a white paper that the country’s strategy for reasserting its place as a semiconductor superpower must heavily involve universities, which are uniquely positioned to pioneer new technology and train a highly skilled workforce. Their report, “Reasserting U.S. Leadership in Microelectronics,” lays out a series of recommendations for how universities can play a leading role in the national effort to regain global preeminence in semiconductor research and manufacturing.

“In this national quest to regain leadership in microelectronics manufacturing, it was clear to us that universities should play a major role. We wanted to think from scratch about how universities can best contribute to this important effort,” said Jesús del Alamo, the Donner Professor in MIT’s Department of Electrical Engineering and Computer Science (EECS) and the lead author of the white paper. “Our goal is that, when these national programs are constructed, they are built in a well-balanced way, taking advantage of the tremendous resources and talent that American universities can bring to bear.”

Losing leadership

The invention of semiconductor technology by U.S. scientists led to the birth of Silicon Valley in the 1950s and helped the U.S. become the dominant force in semiconductor research and manufacturing, but that dominance has been slipping for decades. Only 12% of semiconductor chips are produced in the U.S. today, down from 37% in 1990, according to the Semiconductor Industry Association.

One driver of that domestic decline is the massive infrastructure investments countries like South Korea, Taiwan and China have made over the past few years. Those investments have boosted their domestic microchip companies and even enticed some U.S. firms to open fabrication facilities overseas, del Alamo explains.

A chip manufacturing plant, also known as a fab, might cost as much as $10 billion, so companies make a big economic bet when they decide to build a new facility. Any economic incentives governments can provide, in the form of tax advantages, cheap land, and even outright subsidies, play a role in a firm’s decision about where to site a fab.

A 2020 report from the Semiconductor Industry Association asserts that, when economic incentives are taken into account, manufacturers face a 30% cost disadvantage when producing microchips in the U.S. versus Asia.
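
As a back-of-the-envelope illustration of why such incentives matter, the sketch below combines the figures quoted above, a roughly $10 billion fab and a 30% cost disadvantage, and treats the fab's construction cost as a stand-in for total cost of ownership (a simplification); the incentive amounts are hypothetical.

```python
# Back-of-the-envelope view of the fab cost gap, using the article's
# figures ($10B fab, ~30% U.S. cost disadvantage per SIA 2020). Treating
# construction cost as a proxy for total cost is a simplification, and
# the incentive amounts below are hypothetical.
FAB_COST_USD = 10e9        # rough cost of a new fab (article figure)
US_COST_PENALTY = 0.30     # ~30% cost disadvantage in the U.S. (SIA, 2020)

cost_gap = FAB_COST_USD * US_COST_PENALTY
print(f"Gap to close for U.S. siting: ${cost_gap / 1e9:.1f}B")

# Hypothetical incentive package a government might offer:
direct_grant = 2.0e9       # hypothetical direct subsidy
tax_credit = 0.5e9         # hypothetical investment tax credit
remaining = cost_gap - (direct_grant + tax_credit)
print(f"Gap remaining after incentives: ${remaining / 1e9:.1f}B")
```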

U.S. policymakers are working to close that gap, in part, with the CHIPS Act, legislation that would provide $52 billion in federal investments for domestic semiconductor research, design, and manufacturing. Congress is also considering another piece of legislation, the FABS Act, which would establish a semiconductor investment tax credit.

Growing the workforce

As the authors point out in the white paper, economic incentives are only part of the picture.

Reasserting leadership in semiconductor manufacturing will also require thousands of new highly skilled workers, and universities contribute a sizable fraction of the workforce for the industry. Expanding the size and diversity of this workforce will be key, but educational institutions face an uphill battle as more students abandon “hard tech” for fields like computer science. Attracting more students will require exciting hands-on lab courses, inspiring research experiences, well-crafted internships, and support from industry mentors, as well as fellowships at all levels, among many other initiatives.

“We are already in a situation where we are not producing enough engineers at all levels for the semiconductor industry, and we are talking about a major expansion. So, it just doesn’t add up,” del Alamo said. “If we want to provide the workforce for this major expansion, we need to engage more students. The only way, in the short term, to provide many more graduates for this industry is expanding existing programs and engaging institutions that have not been involved in the past.”

This image shows the CMOS THz-ID chip. The chip is a collaboration between Profs. Ruonan Han and Anantha P. Chandrakasan. Courtesy: Massachusetts Institute of Technology (MIT)

Enabling innovation

Universities have also played a major historic role in contributing fundamental research, and the nation will need to rely on academic labs to generate new innovations.

Many universities have aging infrastructure that is fast approaching obsolescence, if it isn't outdated already. The authors argue the U.S. needs to invest in university infrastructure, both in capital equipment and in the people who operate it and support research and educational activities. A major upgrade of research facilities is essential for universities to remain relevant to industry and its state-of-the-art tools.

“It is not just about making transistors smaller. Future progress requires new materials, new processes, reimagined devices, and novel integrated systems,” says Vladimir Bulović, the Fariborz Maseeh Professor of Emerging Technology and founding director of MIT.nano. “Technologies that we will rely on a decade from now might look nothing like the ones of today. Academic innovations are bound to disrupt the present technical roadmaps and leapfrog the performance of presently imagined systems. Maintaining a strong link between today’s industry and academia will ensure that our best ideas can enhance the present industry and launch new technical ventures.”

Startups also play a vital role in innovation, and universities have long been a hotbed of entrepreneurial activity.

For this to continue, the authors argue that universities need strong partnerships with prototyping facilities, national labs, and commercial foundries to help enterprising researchers spin their innovations out into tech startups that will become the world-class corporations of the future.

Collaborations with Lincoln Laboratory, a federally funded research and development center in Lexington, Massachusetts, that is managed by MIT, have enabled microchip innovations that wouldn’t be possible otherwise, del Alamo said.

“MIT’s combination of a world-class innovation engine with a capability to prototype complex microelectronics at Lincoln Laboratory is unique and powerful,” said Bob Atkins, division head of the Advanced Technology Division at Lincoln Laboratory. “The combination supports both discovery and maturation of disruptive microelectronics technology, and permits translating ideas into practical realization. It has produced a long history of impactful developments ranging from specialized imagers to microelectronics lithography employed worldwide.”

Harnessing the full potential of universities will require a strategy that fosters regional networks where different types of institutions, including colleges and community colleges, can work together to create joint research and educational programs that also involve partnerships with industry.

For more than 35 years, MIT has benefitted from its Microsystems Industrial Group, which guides research and education activities, mentors students and faculty, and offers financial support. Working closely with industry helps faculty identify problems that are not only interesting but also relevant to tackle in their research. These sorts of cross-cutting partnerships will become even more important in the future, del Alamo said.

– Edited by Chris Vavra, web content manager, Control Engineering, CFE Media and Technology, cvavra@cfemedia.com.

Semiconductors FAQ

  • How does a semiconductor work?

    A semiconductor is a material that has electrical conductivity between that of a conductor and an insulator, which is why it is so widely used in electronic devices. The conductivity of a semiconductor can be controlled by introducing impurities, a process known as doping. When a p-type and an n-type semiconductor are brought into contact, a p-n junction forms; p-n junctions are the basis of electronic devices such as diodes and transistors, which control the flow of current.
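
    To make the p-n junction's current-controlling behavior concrete, here is a minimal Python sketch of the ideal (Shockley) diode equation; the saturation current, ideality factor and temperature are illustrative assumptions, not values from this FAQ.

    ```python
    # Ideal (Shockley) diode equation: I = I_S * (exp(V / (n * V_T)) - 1).
    # I_S, n and the temperature below are illustrative assumptions.
    import math

    K_B = 1.380649e-23     # Boltzmann constant, J/K
    Q_E = 1.602176634e-19  # elementary charge, C

    def diode_current(v_d, i_s=1e-12, n=1.0, temp_k=300.0):
        """Current through an ideal p-n junction diode (amps)."""
        v_t = K_B * temp_k / Q_E  # thermal voltage, ~25.9 mV at 300 K
        return i_s * (math.exp(v_d / (n * v_t)) - 1.0)

    # Forward bias conducts; reverse bias blocks (current saturates at -I_S).
    for v in (-0.5, 0.0, 0.3, 0.6, 0.7):
        print(f"V = {v:+.1f} V -> I = {diode_current(v):.3e} A")
    ```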

  • What raw materials are used to make semiconductor chips?

    Semiconductor chips, also known as integrated circuits (ICs), are typically made from silicon, which is the most widely used semiconductor material. Other materials that can be used in the manufacturing of semiconductor chips include germanium, gallium arsenide and silicon carbide.

    Silicon is the most common raw material used in the manufacturing of semiconductor chips because it is abundant, inexpensive and has the necessary electronic properties for semiconductor applications. The raw silicon is usually obtained as silicon dioxide (SiO2), which is found in natural materials such as sand and quartz. The silicon dioxide is then reduced and purified, and the resulting high-purity silicon is grown into single crystals using the Czochralski process.

  • Why is there a semiconductor shortage?

    Global demand for semiconductors has grown rapidly in recent years, driven by the popularity of smartphones, laptops and other electronic devices, as well as by new technologies such as the Internet of Things (IoT) and 5G. Manufacturers have been struggling to keep up with this demand.

    The semiconductor shortage is a complex issue with multiple contributing factors, including supply chain disruptions, capacity constraints and environmental regulations. It will likely take some time for the industry to catch up with demand, and manufacturers are implementing measures to increase their production capacity.

  • What can be used to replace a semiconductor?

    There are several alternatives to semiconductors that can be used in electronic devices, but each has its own advantages and disadvantages, and the best alternative depends on the specific application. A few possible substitutes are:

    1. Relays: Relays are used to control the flow of electricity in a circuit by physically moving a switch. They are still used in applications where low power and high reliability are required.
    2. Memristors: Memristors are a type of passive electronic component that can be used as a replacement for transistors in some applications. They have the potential to be more energy efficient and more reliable than transistors (a minimal simulation sketch follows this list).
    3. Superconductors: Superconductors are materials that can conduct electricity with zero resistance, which makes them highly efficient. They are used in applications such as particle accelerators and MRI machines.
    4. Organic materials: Organic materials such as carbon nanotubes, graphene and organic semiconductors can be used as alternatives to silicon in some applications. They are lightweight, flexible and have a lower manufacturing cost than traditional semiconductors.
    5. Quantum materials: Quantum materials such as topological insulators, quantum dots and other quantum-based materials are being researched as potential alternatives to semiconductors. They could be used to create new types of electronic devices with improved performance, energy efficiency and computing power.
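
    To make the memristor's "resistance with memory" behavior concrete, here is a minimal Python sketch of the linear ion-drift model (Strukov et al., 2008), in which the device's resistance depends on the history of charge that has flowed through it; the device parameters are textbook-style illustrative values, not figures from this FAQ.

    ```python
    # Linear ion-drift memristor model (Strukov et al., 2008), integrated
    # with forward Euler. All device parameters are illustrative assumptions.
    import math

    R_ON, R_OFF = 100.0, 16e3  # fully doped / undoped resistance, ohms
    D = 10e-9                  # device thickness, m
    MU_V = 1e-14               # dopant mobility, m^2/(V*s)

    def memristance(v_of_t, t_end=2.0, dt=1e-5, w0=0.1):
        """Integrate dw/dt = MU_V * R_ON / D**2 * i(t); return final resistance.

        w in [0, 1] is the normalized width of the doped region, and
        M = R_ON*w + R_OFF*(1 - w) is the instantaneous resistance.
        """
        w, t = w0, 0.0
        while t < t_end:
            m = R_ON * w + R_OFF * (1.0 - w)    # resistance right now
            i = v_of_t(t) / m                   # Ohm's law at this instant
            w += (MU_V * R_ON / D**2) * i * dt  # state drifts with charge
            w = min(max(w, 0.0), 1.0)           # clamp to physical bounds
            t += dt
        return R_ON * w + R_OFF * (1.0 - w)

    # A sinusoidal drive sweeps the resistance up and down; plotting I vs. V
    # would trace the pinched hysteresis loop memristors are known for.
    print(f"final R: {memristance(lambda t: math.sin(2 * math.pi * t)):.0f} ohms")
    ```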

Some FAQ content was compiled with the assistance of ChatGPT. Due to the limitations of AI tools, all content was edited and reviewed by our content team.