Optimization of Self-Heating Driven Leakage Current Properties of Gate-All-Around Field-Effect Transistors Using Neural Network Modeling and Genetic Algorithm
Main Authors:
Format: article
Language: EN
Published: MDPI AG, 2021
Subjects:
Online Access: https://doaj.org/article/a6d62d0d609f47049321d30185101205
Summary: As the technology nodes of semiconductor devices have become finer and more complex, progressive scaling down has been pursued to achieve higher device densities. Consequently, three-dimensional (3D) channel field-effect transistors (FETs), such as fin-shaped FETs (FinFETs) and gate-all-around FETs (GAAFETs), have become popular because they provide an increased effective channel surface area (W_eff) under this scaling-down strategy. These 3D channel FETs, whose channels are completely covered by gate oxide and metal, are prone to the self-heating effect (SHE). The SHE is generally known to degrade the on-state drain current; however, when AC pulsed inputs are applied to these devices, the SHE also degrades the off-state leakage current during the off-phase of the pulse. In this study, an optimization methodology to minimize the leakage current generated by the SHE is examined.
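Per the title, the optimization couples a neural network model with a genetic algorithm. The following is a minimal, hypothetical sketch of such a surrogate-plus-GA loop; the device parameters (nanosheet width, gate length, spacer length), their bounds, and the synthetic leakage data are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: NN surrogate + genetic algorithm for leakage minimization.
# All parameter names, bounds, and training data are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Stand-in training data: device parameters -> off-state leakage (arbitrary units).
# Columns: [nanosheet width (nm), gate length (nm), spacer length (nm)]
X = rng.uniform([10, 12, 4], [40, 24, 10], size=(500, 3))
# Synthetic leakage function used only so the script runs end to end;
# in the study this role would be played by electro-thermal simulation data.
y = 1e-3 * np.exp(0.05 * X[:, 0] - 0.08 * X[:, 1]) + 1e-4 * X[:, 2] ** -1

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(X, y)

# Simple genetic algorithm searching the same design bounds.
lo, hi = np.array([10.0, 12.0, 4.0]), np.array([40.0, 24.0, 10.0])
pop = rng.uniform(lo, hi, size=(60, 3))

for gen in range(40):
    fitness = surrogate.predict(pop)           # predicted leakage; lower is better
    parents = pop[np.argsort(fitness)[:20]]    # truncation selection
    # Uniform crossover between randomly paired parents.
    a = parents[rng.integers(0, 20, 60)]
    b = parents[rng.integers(0, 20, 60)]
    children = np.where(rng.random((60, 3)) < 0.5, a, b)
    # Gaussian mutation, clipped back to the design bounds.
    children += rng.normal(0.0, 0.5, children.shape)
    pop = np.clip(children, lo, hi)

best = pop[np.argmin(surrogate.predict(pop))]
print("Best candidate [W, Lg, Lsp]:", best)
```

The design choice illustrated here is the usual reason for pairing the two methods: the trained surrogate is cheap to evaluate, so the GA can afford thousands of fitness calls that would be prohibitively expensive against a full device simulator.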