Asset Details
An in-memory computing architecture based on two-dimensional semiconductors for multiply-accumulate operations
by Ma, Shunli; Sun, Qingqing; Chen, Lin; Wang, Yin; Sun, Zhengzong; Xie, Yufeng; Wan, Jing; Zhou, Peng; Zhu, Hao; Tang, Hongwei; Xu, Zihan; Bao, Wenzhong; Zhang, David Wei; Chen, Xinyu
in 639/925/357/1018 / 639/925/927/1007 / 639/925/929/115 / Algorithms / Artificial intelligence / Capacitors / Computation / Computer architecture / Deep learning / Dynamic random access memory / Electric potential / Energy conversion efficiency / Energy efficiency / Humanities and Social Sciences / Leakage current / Learning algorithms / Logic circuits / Machine learning / Molybdenum disulfide / multidisciplinary / Multiplication / Neural networks / Object recognition / Random access memory / Retention time / Science / Science (multidisciplinary) / Semiconductor devices / Transistors / Voltage
2021
Journal Article
Overview
In-memory computing may enable multiply-accumulate (MAC) operations, which are the primary calculations used in artificial intelligence (AI). Performing MAC operations with high capacity in a small area with high energy efficiency remains a challenge. In this work, we propose a circuit architecture that integrates monolayer MoS₂ transistors in a two-transistor–one-capacitor (2T-1C) configuration. In this structure, the memory portion is similar to a 1T-1C dynamic random access memory (DRAM), so that the cycling endurance and erase/write speed theoretically inherit the merits of DRAM. In addition, the ultralow leakage current of the MoS₂ transistor enables the storage of multi-level voltages on the capacitor with a long retention time. The electrical characteristics of a single MoS₂ transistor also allow analog computation by multiplying the drain voltage by the voltage stored on the capacitor. The sum-of-products is then obtained by converging the currents from multiple 2T-1C units. Based on our experimental results, a neural network is ex-situ trained for image recognition with 90.3% accuracy. In the future, such 2T-1C units could potentially be integrated into three-dimensional (3D) circuits with dense logic and memory layers for low-power in-situ training of neural networks in hardware.
In standard computing architectures, memory and logic circuits are separated, a feature that slows the matrix operations vital to deep learning algorithms. Here, the authors present an alternative in-memory architecture and demonstrate a feasible approach for analog matrix multiplication.
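To make the analog multiply-accumulate principle concrete, below is a minimal Python sketch of the sum-of-products readout under an idealised linear current law, I = k · V_store · V_drain. The constant K_LIN, the function names, and the linear law itself are illustrative assumptions rather than the paper's device model; real MoS₂ transistor characteristics and circuit nonidealities are ignored.

# Minimal sketch of the analog MAC readout described in the overview.
# NOT the authors' device model: K_LIN and the linear current law are
# hypothetical placeholders used only to illustrate the arithmetic.
import numpy as np

K_LIN = 1e-6  # assumed transconductance-like scale factor (A/V^2), hypothetical

def cell_current(v_store, v_drain, k_lin=K_LIN):
    """Current of one 2T-1C unit: the drain (input) voltage multiplied by the
    multi-level voltage held on the storage capacitor, in an idealised linear
    operating regime."""
    return k_lin * np.asarray(v_store) * np.asarray(v_drain)

def mac_column(v_store_col, v_drain_vec, k_lin=K_LIN):
    """Sum-of-products for one output line: the currents of all 2T-1C units
    that share the line are converged (summed), yielding a dot product."""
    return np.sum(cell_current(v_store_col, v_drain_vec, k_lin))

# Example: ex-situ trained weights mapped to stored capacitor voltages,
# inputs mapped to drain voltages; the summed current is proportional to w · x.
weights = np.array([0.2, 0.8, 0.5])   # stored multi-level voltages (V), illustrative
inputs  = np.array([0.1, 0.3, 0.0])   # drain/input voltages (V), illustrative
i_out = mac_column(weights, inputs)
print(f"column output current: {i_out:.3e} A")  # proportional to np.dot(weights, inputs)

In the reported architecture the weights come from an ex-situ trained network and are written to the capacitors as multi-level voltages; the sketch mirrors that mapping only at the level of the arithmetic, not the device physics.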
Publisher
Nature Publishing Group UK, Nature Publishing Group, Nature Portfolio