
Dharmendra S. Modha

My Work and Thoughts.


Archives for March 2011

Early Tagging of Cortical Networks Is Required for the Formation of Enduring Associative Memory

March 7, 2011 By dmodha

I read a thought-provoking article in Science (18 Feb 2011) by Edith Lesburguères, Oliviero L. Gobbo, Stéphanie Alaux-Cantin, Anne Hambucken, Pierre Trifilieff, and Bruno Bontempi, whose abstract is reproduced below.

Abstract: Although formation and stabilization of long-lasting associative memories are thought to require time-dependent coordinated hippocampal-cortical interactions, the underlying mechanisms remain unclear. Here, we present evidence that neurons in the rat cortex must undergo a “tagging process” upon encoding to ensure the progressive hippocampal-driven rewiring of cortical networks that support remote memory storage. This process was AMPA- and N-methyl-D-aspartate receptor–dependent, information-specific, and capable of modulating remote memory persistence by affecting the temporal dynamics of hippocampal-cortical interactions. Post-learning reinforcement of the tagging process via time-limited epigenetic modifications resulted in improved remote memory retrieval. Thus, early tagging of cortical networks is a crucial neurobiological process for remote memory formation whose functional properties fit the requirements imposed by the extended time scale of systems-level memory consolidation.

Filed Under: Brain-inspired Computing

