What if neuromorphic computers used neurotransmitters? In the latest issue of Device, Kai Xiao & co-workers summarize progress in using dopamine as a signaling modality in bioinspired computing, a step toward artificial intelligence that more closely emulates the human brain!

Dopamine detection and integration in neuromorphic devices for applications in artificial intelligence, by Kai Xiao and co-workers
Link (OA): https://lnkd.in/gKKD9HxM

The bigger picture: Neuromorphic devices play a pivotal role in reshaping artificial intelligence, brain-like computing, and neuroprostheses by emulating the intricate computational processes of the human brain. However, current neuromorphic technologies rely primarily on electrical signals, overlooking the indispensable role of neurotransmitter-mediated chemical signaling in biological organisms. This review surveys methodologies for detecting the key neurotransmitter dopamine, including microdialysis, optical, and electrochemical techniques, and examines how these detection methods can be integrated with neuromorphic devices, highlighting the need to capture both electrical and chemical signaling for precise emulation of neural processes. By bridging this gap, neuromorphic technology can reach its full potential, resulting in more sophisticated and human-like artificial intelligence systems.
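To make the idea of chemical signaling in a neuromorphic system concrete, here is a minimal Python sketch of a three-factor plasticity rule, in which a dopamine-like signal gates Hebbian weight updates. This is purely illustrative and not taken from the review; the function three_factor_update, its parameters, and the toy values are all hypothetical.

```python
import numpy as np

def three_factor_update(w, pre, post, dopamine, lr=0.01):
    """Hypothetical three-factor rule: electrical coactivity proposes a Hebbian
    change, and a dopamine-like chemical signal gates how much of it is kept."""
    hebbian = np.outer(post, pre)        # coincidence of pre- and postsynaptic activity
    return w + lr * dopamine * hebbian   # neuromodulator scales the weight update

# Toy usage: identical electrical activity, with and without the chemical signal.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(3, 4))
pre, post = rng.random(4), rng.random(3)

w_without = three_factor_update(w, pre, post, dopamine=0.0)  # no dopamine: weights unchanged
w_with = three_factor_update(w, pre, post, dopamine=1.0)     # dopamine present: weights updated
print(np.allclose(w, w_without), np.allclose(w, w_with))     # True False
```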
Trends in Brain-Like Electronic Circuits
Explore top LinkedIn content from expert professionals.
Summary
Brain-like electronic circuits, or neuromorphic computing, aim to replicate the human brain's complex functions using advanced technologies like neurotransmitter-based signaling and synaptic transistors. These innovations are transforming artificial intelligence and machine learning by enabling more efficient, adaptive, and human-like computation.
- Explore neurotransmitter-based technologies: Investigate how chemical signaling, like dopamine, can enhance neuromorphic devices for more accurate biological brain emulation.
- Adopt synaptic transistors: Utilize devices that combine processing and memory storage, operate at room temperature, and mimic cognitive functions such as learning and pattern recognition (a minimal compute-in-memory sketch follows this list).
- Focus on energy-efficient systems: Prioritize designs that reduce power consumption and improve operational stability to meet the demands of real-world AI and machine learning applications.
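As a rough sketch of why combining processing and memory storage saves energy: in an idealized compute-in-memory crossbar, the weights are stored as conductances and the matrix-vector product happens where the weights sit, so nothing shuttles between a separate memory and processor. The numbers and the crossbar_mvm helper below are illustrative assumptions, not parameters of the devices described in the posts.

```python
import numpy as np

def crossbar_mvm(conductances, voltages):
    """Idealized analog crossbar read-out: each output current is the sum of
    conductance-weighted input voltages (Ohm's and Kirchhoff's laws), i.e. a
    matrix-vector product computed where the weights are stored."""
    return conductances @ voltages

# Hypothetical values, for illustration only.
G = np.array([[1.0e-6, 2.0e-6, 0.5e-6],
              [3.0e-6, 1.0e-6, 2.0e-6]])   # stored weights as conductances (siemens)
V = np.array([0.10, 0.20, 0.05])           # input activations encoded as voltages (volts)
print(crossbar_mvm(G, V))                  # output currents (amperes), no weight movement needed
```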
NEW Brain-like Transistor Mimics Human Intelligence

A groundbreaking synaptic transistor inspired by the human brain can simultaneously process & store information, mimicking the brain's capacity for higher-level #thinking.

Unlike previous brain-like computing #devices, 👉 the new moiré synaptic transistor:
• recognizes similar patterns, even when given incomplete input, demonstrating associative #memory and LEARNING
• remains stable at room temperature (!) and operates efficiently
• consumes minimal #energy
• retains stored information even when powered off, making it suitable for REAL-WORLD application.

Designed by researchers @ Northwestern University, Boston College, & MIT, 👉 the new synaptic #transistor enables efficient compute-in-memory designs and edge hardware accelerators for AI and #machinelearning.

Nature | December 20, 2023 -- More Info & Links in Comments
-----------------------------
Victor Yang FRCSC MASc MD PhD PEng, Zhiren(Isaac) Zheng, Vinod Sangwan, Justin Qian, Xueqiao Wang, Stephanie Liu, Kenji Watanabe, Takashi Taniguchi, Su-Yang Xu, Pablo Jarillo-Herrero, Qiong Ma, Mark Hersam. National Science Foundation (NSF), McCormick School of Engineering, Massachusetts Institute of Technology, National Institute for Materials Science, Harvard University, CIFAR

#artificialintelligence #nanotechnology #innovation #future #startups #mit #harvard #quantum #technology #ai #engineering #computing #neurosciences #linkedin #news #human #neurotech #intelligence #womeninscience #womenintech #bci #science #communication

Credits: Amanda Morris, Northwestern University
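The associative-memory behavior described above (recalling a stored pattern from incomplete input) can be illustrated in software with a classical Hopfield-style network. This sketch shows the behavior only, not the physics or circuitry of the moiré transistor, and the stored patterns are made up.

```python
import numpy as np

# Two stored patterns (made up), encoded as +/-1 vectors.
patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [-1, -1,  1,  1, -1,  1]])
W = patterns.T @ patterns / patterns.shape[1]   # Hebbian weight storage
np.fill_diagonal(W, 0)

cue = np.array([1, -1, 1, -1, -1, -1])          # first pattern with one element corrupted
state = cue.astype(float)
for _ in range(5):                              # repeated recall updates
    state = np.sign(W @ state)
    state[state == 0] = 1.0

print(state.astype(int))   # recovers the stored pattern [ 1 -1  1 -1  1 -1]
```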
An exciting week for #neuromorphic computing and decreasing the compute power required for #AI and #ML! For more on this topic, see my previous post: https://lnkd.in/g3EeG3Ku https://lnkd.in/gia2EVK2

Researchers report the creation of the first #roomtemperature, #lowpower (20 pW) moiré #synaptic #transistor. It is #graphene based. "The asymmetric gating in dual-gated moiré heterostructures realizes diverse biorealistic neuromorphic functionalities, such as reconfigurable synaptic responses, spatiotemporal-based tempotrons and Bienenstock–Cooper–Munro input-specific adaptation. In this manner, the moiré synaptic transistor enables efficient compute-in-memory designs and #edgehardware accelerators for #artificialintelligence and #machinelearning."

Key points:

Design and Material Composition: The synaptic transistor is designed to mirror human brain function in its ability to process and store information concurrently, mimicking the brain's capability for higher-level cognition. The transistor combines two atomically thin materials, bilayer #graphene and hexagonal boron nitride, arranged in a moiré pattern to achieve its #neuromorphic functionality. This innovative structure enables the device to perform associative #learning and recognize patterns, even with imperfect input.

Cognitive Functionality: The device's ability to perform associative learning and pattern recognition, even with imperfect inputs, represents a step towards replicating higher-level cognitive functions in artificial intelligence systems. This research provides a foundation for the development of more efficient, brain-like AI systems, potentially transforming how information processing and memory storage are approached in silico.

Operational Stability and Efficiency: Unlike previous brain-like computing devices that required #cryogenic temperatures to function, this new device operates stably at room temperature. It demonstrates fast operational speeds, low energy consumption, and the ability to retain stored information even when power is removed, making it highly applicable for real-world use.

Implications for AI and ML: This highlights a shift away from traditional #transistor-based computing towards more energy-efficient and capable systems for AI and ML tasks. This development addresses the high power consumption issue prevalent in conventional #digitalcomputing systems, where separate processing and storage units create bottlenecks in data-intensive tasks.

Original article in Nature: https://lnkd.in/gSvyUyYK
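The quoted abstract mentions Bienenstock-Cooper-Munro (BCM) input-specific adaptation. As a reminder of what that rule does, here is the textbook form in software, not the device's measured behavior; the learning rate, time constant, and input patterns are assumed. A sliding modification threshold makes a neuron potentiate toward inputs that drive it strongly and depress toward the rest:

```python
import numpy as np

def bcm_step(w, x, theta, lr=0.01, tau=10.0):
    """One textbook BCM update (illustrative parameters, not device values)."""
    y = float(w @ x)                      # postsynaptic response
    w = w + lr * y * (y - theta) * x      # potentiate above threshold, depress below
    theta = theta + (y**2 - theta) / tau  # sliding threshold tracks recent activity
    return w, theta

rng = np.random.default_rng(1)
w, theta = rng.random(4) * 0.5 + 0.5, 0.0
x_a = np.array([1.0, 1.0, 0.0, 0.0])      # two made-up, non-overlapping input patterns
x_b = np.array([0.0, 0.0, 1.0, 1.0])
for step in range(2000):                  # alternate the two inputs
    w, theta = bcm_step(w, x_a if step % 2 == 0 else x_b, theta)

# Input-specific adaptation: one pattern ends up driving the neuron, the other is suppressed.
print(round(float(w @ x_a), 2), round(float(w @ x_b), 2))
```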