Intelligent matter endows reconfigurable temperature and humidity sensations for in-sensor computing

  • Tao Guo
  • Jiawei Ge
  • Yixuan Jiao
  • Youchao Teng
  • Bai Sun
  • Wen Huang
  • Hatameh Asgarimoghaddam
  • Kevin P. Musselman
  • Yin Fang
  • Y. Norman Zhou
  • Yimin A. Wu

Research output: Contribution to journal › Article › peer-review

20 Scopus citations

Abstract

Data-centric tactics with in-sensor computing go beyond the conventional computing-centric tactic, which suffers from processing latency and excessive energy consumption. Multifunctional intelligent matter with dynamic smart responses to environmental variations paves the way to implementing data-centric tactics with high computing efficiency. However, intelligent matter with both humidity and temperature sensitivity has not been reported. In this work, a design based on a single memristive device is demonstrated to achieve reconfigurable temperature and humidity sensations. Opposite temperature sensations at the low resistance state (LRS) and high resistance state (HRS) were observed for low-level sensory data processing. Integrated devices mimicking intelligent electronic skin (e-skin) can operate in three modes to adapt to different scenarios. Additionally, the device acts as a humidity-sensory artificial synapse that can implement high-level cognitive in-sensor computing. Intelligent matter with reconfigurable temperature and humidity sensations is promising for energy-efficient artificial intelligence (AI) systems.

Original language: English
Pages (from-to): 1030-1041
Number of pages: 12
Journal: Materials Horizons
Volume: 10
Issue number: 3
DOIs
State: Published - 4 Jan 2023

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 7 - Affordable and Clean Energy
