
Dharmendra S. Modha

My Work and Thoughts.


A Proposal for Mouse Connectivity Project

January 30, 2009 By dmodha

My colleague, Anthony Ndirango, pointed out a very interesting document that proposes to determine complete brainwide neuroanatomical connectivity in the mouse, at a mesoscopic scale, within 5 years and at a cost of less than 20 million dollars.

Title: A proposal for a coordinated effort for the determination of brainwide neuroanatomical connectivity in model organisms at a mesoscopic scale

Authors: Jason W. Bohland, Caizhi Wu, Helen Barbas, Hemant Bokil, Mihail Bota, Hans C. Breiter, Hollis T. Cline, John C. Doyle, Peter J. Freed, Ralph J. Greenspan, Suzanne N. Haber, Michael Hawrylycz, Daniel G. Herrera, Claus C. Hilgetag, Z. Josh Huang, Allan Jones, Edward G. Jones, Harvey J. Karten, David Kleinfeld, Rolf Kotter, Henry A. Lester, John M. Lin, Brett D. Mensh, Shawn Mikula, Jaak Panksepp, Joseph L. Price, Joseph Safdieh, Clifford B. Saper, Nicholas D. Schiff, Jeremy D. Schmahmann, Bruce W. Stillman, Karel Svoboda, Larry W. Swanson, Arthur W. Toga, David C. Van Essen, James D. Watson and Partha P. Mitra

Abstract: In this era of complete genomes, our knowledge of neuroanatomical circuitry remains surprisingly sparse. Such knowledge is however critical both for basic and clinical research into brain function. Here we advocate for a concerted effort to fill this gap, through systematic, experimental mapping of neural circuits at a mesoscopic scale of resolution suitable for comprehensive, brain-wide coverage, using injections of tracers or viral vectors. We detail the scientific and medical rationale and briefly review existing knowledge and experimental techniques. We define a set of desiderata, including brain-wide coverage; validated and extensible experimental techniques suitable for standardization and automation; centralized, open access data repository; compatibility with existing resources, and tractability with current informatics technology. We discuss a hypothetical but tractable plan for mouse, additional efforts for the macaque, and technique development for human. We estimate that the mouse connectivity project could be completed within five years with a comparatively modest budget.

Filed Under: Brain-inspired Computing

Martin Rehn and Dileep George

January 26, 2009 By dmodha

Last Friday, we enjoyed a visit by Dr. Martin Rehn.

Title: Cell assemblies and computation in cortical networks

Abstract: Recurrent neural networks are powerful computational structures. Intractable in the general case, their power is yet to be harnessed, both for practical applications and as a model for the brain. One class of recurrent networks that is theoretically well understood is attractor memory networks. Starting from this idea, we explore extensions that have non-trivial temporal dynamics, and how they apply to sensory coding. It will also be shown how an attractor memory can operate on top of fairly realistic cortical circuitry, with some conclusions for cortical modelling.
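
As a concrete illustration of the attractor-memory idea, here is a minimal Hopfield-style network sketch; the pattern count, Hebbian learning rule, and update schedule are my own illustrative choices, not Dr. Rehn's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store a few random binary (+1/-1) patterns in a Hopfield-style weight matrix.
n_units = 100
patterns = rng.choice([-1, 1], size=(3, n_units))

# Hebbian outer-product learning; no self-connections.
W = sum(np.outer(p, p) for p in patterns) / n_units
np.fill_diagonal(W, 0.0)

def recall(cue, steps=20):
    """Asynchronously update units until the state settles into an attractor."""
    state = cue.copy()
    for _ in range(steps):
        for i in rng.permutation(n_units):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Corrupt a stored pattern, then let the network clean it up.
noisy = patterns[0] * rng.choice([1, -1], size=n_units, p=[0.85, 0.15])
recovered = recall(noisy)
print("overlap with stored pattern:", recovered @ patterns[0] / n_units)
```

Running it typically prints an overlap of 1.0, meaning the corrupted cue has been pulled back to the stored pattern, i.e., to an attractor.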

Bio: Martin Rehn is a postdoctoral fellow at the Redwood Center for Theoretical Neuroscience, UC Berkeley, and a Research Scientist at Google. He received a PhD in Computer Science from the Royal Institute of Technology in Stockholm in 2006 and an MSc in Engineering Physics from the same institution in 1999. He is interested in representation and computation in early sensory cortices, associative memory models, and cortical simulations.

On Dec 4, 2008, we had a spirited and wonderful talk by Dr. Dileep George who is Co-Founder and CTO of Numenta. You can find his thesis here.

Title: Towards a Mathematical Model of Cortical Circuits Based on Hierarchical Temporal Learning in the Brain

Abstract: It is well known that the neocortex is organized as a hierarchy. Hierarchical Temporal Memory (HTM) is a theory of the neocortex that models it using a spatio-temporal hierarchy. The HTM hierarchy is organized in such a way that the higher levels of the hierarchy incorporate larger amounts of space and longer durations of time. The states at the higher levels of the hierarchy vary at a slower rate compared to the lower levels. It is speculated that this kind of organization leads to efficient learning and generalization because it mirrors the organization of the world.

I will start this talk by demonstrating the recent advances at Numenta in using HTM for object recognition. We are able to recognize objects in clutter with a high degree of accuracy. Top-down, attention-based feedback is used to recognize multiple objects in a scene. Feedback is also used to segment objects out of clutter.

I will then describe how the assumptions of hierarchical temporal learning can lead to a mathematical model for cortical circuits. An HTM node is abstracted using a coincidence detector and a mixture of variable-memory Markov chains. Bayesian belief propagation equations on this HTM node give a set of operation-related constraints for the cortical circuits. Anatomical and physiological data provide a second set of constraints related to the organization of the circuits. The combination of these two sets of constraints can be used to derive a set of cortical circuits that explain many anatomical and physiological features and predict several others. I will then demonstrate the application of these circuits in modeling the subjective contour effect.
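
To make this node abstraction a little more concrete, here is a toy sketch; it is my own drastic simplification rather than Numenta's implementation, with coincidence detection reduced to nearest-neighbor matching against a fixed set of stored patterns, and the mixture of variable-memory Markov chains reduced to a first-order transition-count matrix.

```python
import numpy as np

class ToyHTMNode:
    """Very simplified HTM-like node: a coincidence detector plus first-order
    temporal transition statistics over coincidences. (Real HTM nodes use
    variable-memory Markov chains and belief propagation.)"""

    def __init__(self, coincidences):
        self.coincidences = np.asarray(coincidences, dtype=float)
        k = len(self.coincidences)
        self.transitions = np.zeros((k, k))  # counts of c[t-1] -> c[t]
        self.prev = None

    def detect(self, x):
        """Return the index of the closest stored coincidence."""
        d = np.linalg.norm(self.coincidences - x, axis=1)
        return int(np.argmin(d))

    def learn(self, x):
        """Update temporal statistics as inputs stream in."""
        c = self.detect(x)
        if self.prev is not None:
            self.transitions[self.prev, c] += 1
        self.prev = c
        return c

    def temporal_groups(self):
        """Row-normalized transition probabilities; coincidences that often
        follow one another form a temporal group."""
        counts = self.transitions + 1e-9
        return counts / counts.sum(axis=1, keepdims=True)

# Usage: stream a repeating sequence through the node.
node = ToyHTMNode([[0, 0], [1, 0], [1, 1]])
for x in [[0, 0], [1, 0], [1, 1]] * 10:
    node.learn(np.array(x, dtype=float))
print(node.temporal_groups().round(2))
```

Streaming the repeating sequence drives the transition matrix toward probability 1 along 0 -> 1 -> 2 -> 0, which is the kind of temporal regularity a higher-level node would summarize.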

Bio: Dileep George is the Chief Technology Officer of Numenta — a company he co-founded with Jeff Hawkins and Donna Dubinsky. His primary research interests are in understanding the organizational properties of the world and in linking that to the cortical architecture and micro-circuitry.

Dileep joined the Redwood Neuroscience Institute as a Graduate Research Fellow and began working closely with Jeff Hawkins in extending and expressing Jeff’s neuroscience theories in mathematical terms. He created the first proof-of-concept program to illustrate these concepts, which triggered the launch of Numenta in 2005. Within five months of Numenta’s founding, Dileep and his team created the first prototype of HTM technology. Prior to his graduate studies, Dileep worked on developing algorithms for 3G wireless modems.

Dileep holds a Bachelor’s degree in Electrical Engineering from the Indian Institute of Technology in Bombay and Master’s and Ph.D. degrees in Electrical Engineering from Stanford University. Dileep’s Ph.D. thesis provides a detailed study of hierarchical temporal learning in the neocortex.

Filed Under: Brain-inspired Computing, Interesting People

Press Coverage for IBM’s DARPA Contract Win

January 8, 2009 By dmodha

I am grateful to my IBM communications colleagues Sara Delekta-Galligan and Jenny Hunter for compiling the following list, and to Shyamal Chandra for painstakingly and conscientiously organizing it.

To avoid overwhelming the reader, I have decided not to include the thousands of blog posts.

Featured

BBC News, 11/21/2008, IBM plans ‘brain-like’ computers. PDF

BusinessWeek, 11/20/2008, Making Computers Based on the Human Brain. PDF

Bloomberg, 11/20/2008, IBM Gets Funds for Computers Modeled After Brains. PDF

CNET, 11/19/2008, IBM gets DARPA cognitive computing contract. PDF

Defense News, 11/26/2008, DARPA seeks to mimic in silicon the mammalian brain. PDF

EDN.com, 11/21/2008, IBM seeks to simulate brain. PDF

eWEEK, 11/20/2008, IBM Researchers Look to Bring ‘Brain Power’ to Computers. PDF

Forbes, 12/8/2008, Metadata: Speaking Of IBM: What can you invent with $6 billion a year? PDF

InformationWeek, 11/20/2008, Big Blue hopes to use silicon and transistors to recreate neurons and synapses. PDF

InfoWorld, 11/20/2008, IBM Tries to Bring Brain Power to Computers. PDF

International Herald Tribune, 11/20/2008, IBM wins funding for ‘brainy’ computer. PDF

New York Times, 11/20/2008, Hunting for a Brainy Computer. PDF

PCWorld, 11/19/2008, IBM Tries to Bring Brain Power to Computers. PDF

San Francisco Business Times, 11/20/2008, IBM, Stanford looking to build brainlike computer. PDF

San Francisco Chronicle, 11/20/2008, Artificial brain project gets boost – $4.9 million grant. PDF

ScienceDaily, 12/23/2008, Cognitive Computing: Building A Machine That Can Learn From Experience. PDF

The Guardian (UK), 11/6/2008, Will machines outsmart man? PDF

Complete Coverage

Abo-Szene (Germany), 11/20/2008, IBM developed with universities computer systems modeled after the human brain. PDF

Absolute Gadget.com, 11/21/2008, IBM to make brain-like computer. PDF

AdNews (Brazil), 11/25/2008, IBM will create simulators for the human brain. PDF

ArabianBusiness.com, 11/23/2008, IBM to build computers based on human brain. PDF

BFM, 11/24/2008, I-Brain in ten years. PDF

BoerseGo.de (Germany), 11/20/2008, IBM developed with universities computer systems modeled after the human brain. PDF

Canal TI (Peru), 12/3/2008, Computer based on cognitive abilities of the brain. PDF

Clubic.com, 11/24/2008, IBM is working on cognitive computing. PDF

ComputerWeekly (UK), 11/20/2008, IBM is seeking to build the computer of the future based on the efficient way the brain works. PDF

Computerwoche (Germany), 11/21/2008, Cognitive computing uses the brain as a model. PDF

Custompc.co.uk (UK), 11/21/2008, IBM Plans to build virtual brain. PDF

Daily News & Analysis (India), 11/24/2008, Computers to mimic brain. PDF

Daycohost (Venezuela), 11/24/2008, The Future PC will be based on cognitive skills of the brain. PDF

Die News (Germany), 11/20/2008, IBM developed with universities computer systems modeled after the human brain. PDF

DigitalTrends.com, 11/24/2008, IBM Building Brain-Like Computer. PDF

El Heraldo (Colombia), 11/24/2008, The human brain comes to computers. PDF

El Universal on line, La crónica.com, Publimetro, Canal 22, Diario TI.com (Mexico), 11/24/08, Seeking to create artificial brain. PDF

Engadget, 11/21/2008, DARPA enlists IBM to build computer brain. PDF

Entornointeligente.com (Venezuela), 12/18/2008, PC’s of the future are based on cognitive abilities. PDF

Heise-Online (UK), 11/21/2008, IBM Plans to copy the brain. PDF

Heise (Germany), 11/20/2008, Researchers to build human brain-like computers. PDF

Il Sole Nòva (Italy), 12/4/2008, The computer with the human thought. PDF

Information.de (Germany), 11/20/2008, IBM developed with universities computer systems modeled after the human brain. PDF

IT Pro (UK), 11/24/2008, IBM and academia work on human brain simulation. PDF

IT Sitio (Argentina), 11/21/2008, IBM launches “cognitive computing”. PDF

ITWeb.com (South Africa), 11/20/2008, IBM gets cognitive computing contract. PDF

Jakarta News, 11/22/2008, IBM to build computer circuits that mimic brains. PDF

La crónica de hoy.com, Tech:consumer.com, compuguía.com, Publimetro, Terra, W Radio (Mexico), 11/27/08, Seeking to create artificial brain. PDF

La Cuarta Newspaper (Chile), 11/30/2008, Computers are ready to learn and manage data just like I Robot movie. PDF

La Nación (Argentina), 12/5/2008, A computer that emulates human brains. PDF

La Repubblica (Italy), 11/24/2008, Here is the supercomputer that reasons. PDF

Lemondeinformatique.fr, 11/20/2008, IBM works on a synaptic computer under the aegis of the Darpa. PDF

Manufacturing Computer Solutions (UK), 11/21/2008, IBM to build brain-based computer of the future. PDF

Nanotechnology Now, 11/22/2008, IBM to Build “Thinking” Computers Modeled on the Brain. PDF

NetworkWorld, 11/20/2008, IBM Tries to Bring Brain Power to Computers. PDF

NewsFactor Network, 11/21/2008, IBM, Partners Aim To Build Brain-Like Computer Systems. PDF

O-Sul (Brazil), 11/21/2008, I, Robot. PDF

Premiumpresse.de (Germany), 11/20/2008, IBM developed with universities computer systems modeled after the human brain. PDF

PresseEcho.de (Germany), 11/20/2008, IBM developed with universities computer systems modeled after the human brain. PDF

Pressetext Schweiz (Germany), 11/21/2008, Cognitive computing uses the brain as a model. PDF

Pressrelations (Germany), 11/20/2008, IBM developed with universities computer systems modeled after the human brain. PDF

R&D magazine, 11/21/2008, IBM aims to harness human brain architecture. PDF

RedOrbit.com, 11/21/2008, IBM To Develop Computer With Brain-Like Tendencies. PDF

ReportYou.com (Germany), 11/20/2008, IBM developed with universities computer systems modeled after the human brain. PDF

Revista da Semana (Brazil), 12/1/2008, Electronic brain wants to think. PDF

Silicon.com (UK), 11/12/2008, IBM lands Darpa ‘build a brain’ contract. PDF

SupercomputingOnline.com, 12/18/2008, Cognitive computing: Building a machine that can learn from experience. PDF

Tech:consumer.com, 12/2/2008, IBM’s computer of the future. PDF

TechNewsWorld, 11/20/2008, IBM, Academics Seek to Create a Computer That’s More Like Us. PDF

Technogadge, 11/23/2008, IBM to build Smart Computers that functions like Human Brain. PDF

TechRadar.com, 11/24/2008, IBM building ‘brain simulator’ computer. PDF

The Inquirer (UK), 11/20/2008, IBM wants to build a computer based on the brain. PDF

TheFutureofThings.com, 11/22/2008, IBM to Build Cat-like Brain. PDF

Times Herald-Record, 11/24/2008, Big Blue teaming up with universities. PDF

Valor Econômico (Brazil), 11/25/2008, New computers seek to reproduce brain functions. PDF

VentureBeat.com, 11/19/2008, IBM aims to replicate the brain with cognitive computing project. PDF

VNUnet.com, 11/22/2008, IBM to develop super-brainy computer. PDF

Vnunet.fr, 11/24/2008, IBM is going to invent a new sort of computer. PDF

Washington Technology, 11/21/2008, IBM gets funding for cognitive computing research. PDF

Winfuture (Germany), 11/20/2008, Researchers want to build brain-like computer. PDF

Yahoo! News India, 12/26/2008, Indian Origin researcher making computer that thinks like mammalian brain. PDF

Radio

Radio BIT (Italy), 11/30/2008, Giorgio Richelli on "cognitive computing".

NRM Comunicaciones Radio (Mexico), 12/15/2008, IBM develops cognitive computing. PDF

Select Blogs

Discover magazine blog, 11/21/2008, IBM to Build “Thinking” Computers Modeled on the Brain. PDF

Reuters (UK) MediaFile Blog, 11/20/2008, IBM-led computing effort seeks to mimic brain’s ability. PDF

ZDNet blog, 11/22/2008, Thoughts and Theories From Roger Andre. PDF

Filed Under: Accomplishments, Brain-inspired Computing, Press

IBM Awarded DARPA funding via SyNAPSE Program

November 19, 2008 By dmodha

IBM has officially announced that our proposal “Cognitive Computing via Synaptronics and Supercomputing (C2S2)” won the first phase of DARPA’s Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) initiative. Please see DARPA’s BAA here.

Some Snippets from the BAA:

“Proposed research should investigate innovative approaches that enable revolutionary advances in neuromorphic electronic devices that are scalable to biological levels.  Specifically excluded is research that primarily results in evolutionary improvements to the existing state of practice.”

“Over six decades, modern electronics has evolved through a series of major developments (e.g., transistors, integrated circuits, memories, microprocessors) leading to the programmable electronic machines that are ubiquitous today.  Owing both to limitations in hardware and architecture, these machines are of limited utility in complex, real-world environments, which demand an intelligence that has not yet been captured in an algorithmic-computational paradigm. As compared to biological systems for example, today’s programmable machines are less efficient by a factor of one million to one billion in complex, real-world environments.  The SyNAPSE program seeks to break the programmable machine paradigm and define a new path forward for creating useful, intelligent machines.”

“The vision for the anticipated DARPA SyNAPSE program is the enabling of electronic neuromorphic machine technology that is scalable to biological levels.  Programmable machines are limited not only by their computational capacity, but also by an architecture requiring (human-derived) algorithms to both describe and process information from their environment.  In contrast, biological neural systems (e.g., brains) autonomously process information in complex environments by automatically learning relevant and probabilistically stable features and associations.  Since real world systems are always many body problems with infinite combinatorial complexity, neuromorphic electronic machines would be preferable in a host of applications—but useful and practical implementations do not yet exist.”

What does the DARPA Award mean?

DARPA has provided mission, money, mandate, meaning, motivation, and metrics that are indispensable to such an ambitious undertaking and to bringing together a wide-ranging, interdisciplinary group of researchers.

DARPA has a rich history of initiating and supporting profound technological breakthroughs, including the internet; please see here. We are profoundly grateful to DARPA and to the program manager, Dr. Todd Hylton, who created and manages the SyNAPSE program.

Who are IBM’s collaborators?

IBM SyNAPSE Team

I am proud to serve as the Principal Investigator for a truly star-studded cast that is comprehensive, creative, and committed!

IBM Almaden Research Center (Dr. Stuart Parkin, Dr. Bulent Kurdi, Dr. J. Campbell Scott, Dr. Paul Maglio, Dr. Simone Raoux, Dr. Rajagopal Ananthanarayanan, Dr. Raghavendra Singh)

IBM T. J. Watson Research Center (Dr. Chung Lam and Dr. Bipin Rajendran)

Stanford University (Professors Kwabena Boahen, H. Philip Wong, Brian Wandell)

University of Wisconsin-Madison (Professor Giulio Tononi)

Cornell University (Professor Rajit Manohar)

Columbia University Medical Center (Professor Stefano Fusi)

University of California-Merced (Professor Christopher Kello)

And, many other researchers, post-docs, and students.

What are the short-term, medium-term, and long-term goals?

Please see the metrics in the BAA here. In the next 9 months, we will focus on demonstrating nanoscale, low-power synapse-like devices and on beginning to uncover the functional microcircuits of the brain.

Why the focus on synapses?

Synapses are junctions between neurons. In mouse and rat brains, there are roughly 10,000 times more synapses than neurons. The strength, efficacy, and efficiency of synapses are subject to change (plasticity) as the animal interacts with its environment, and these synaptic junctions are hypothesized to encode our individual experience. The computation, communication, memory, power, and space requirements for representing the brain in software or hardware seem to scale with the number of synapses. Thus, the brain is much less a neural network and, more correctly, a synaptic network.
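
As a rough back-of-envelope sketch of why synapses dominate the cost, consider the following estimate; the neuron count and the per-synapse and per-neuron byte costs are my own illustrative assumptions, not project figures.

```python
# Back-of-envelope memory estimate: storage scales with synapses, not neurons.
# All counts and per-element byte costs below are illustrative assumptions.
neurons = 55e6                # roughly rat-cortex-scale neuron count (assumed)
synapses_per_neuron = 10_000  # ~10,000 times more synapses than neurons
bytes_per_synapse = 16        # weight + target index + plasticity state (assumed)
bytes_per_neuron = 64         # membrane state and parameters (assumed)

synapse_bytes = neurons * synapses_per_neuron * bytes_per_synapse
neuron_bytes = neurons * bytes_per_neuron

print(f"neuron state:  {neuron_bytes / 1e9:8.2f} GB")
print(f"synapse state: {synapse_bytes / 1e12:8.2f} TB")
# Synaptic storage dominates by more than three orders of magnitude, which is
# why the brain is better thought of as a synaptic network than a neural network.
```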

Who else is funded?

For publicly available information, please see.

What is Cognitive Computing?

There is no definition or specification of the human mind, but we understand it as a collection of processes of sensation, perception, action, cognition, emotion, and interaction. Yet the mind seems to integrate sight, hearing, touch, taste, and smell effortlessly into a coherent whole, and to act in a context-dependent way in a changing, uncertain environment. The mind effortlessly creates categories of time, space, and object, and interrelationships between them.

The mind arises from the wetware of the brain. Thus, it would seem that reverse engineering the computational function of the brain is perhaps the cheapest and quickest way to engineer computers that mimic the robustness and versatility of the mind.

So cognitive computing is the quest to engineer mind-like intelligent business machines by reverse engineering the computational function of the brain.

What is the difference between Cognitive Computing and AI?

The field of artificial intelligence research has focused on individual aspects of engineering intelligent machines; it proceeds in a problem-first, algorithm-later fashion. Even AI pioneers John Anderson and Allen Newell have argued for the need for a single unified theory of cognition. Cognitive computing, in contrast, seeks to engineer holistic intelligent machines that neatly tie together all of the pieces. It seeks to uncover the core micro- and macro-circuits of the brain underlying a wide variety of abilities, and so aims to proceed in an algorithm-first, problems-later fashion.

I believe that spiking computation is a key to achieving this vision. For a previous discussion of neural code, please see here.
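
For readers new to spiking computation, here is a minimal leaky integrate-and-fire neuron in Python; it is a standard textbook model, not the project's neuron model, and all parameter values are illustrative.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0,
               v_reset=0.0, v_thresh=1.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks toward rest,
    integrates its input, and emits a spike (then resets) on crossing threshold."""
    v = v_rest
    spike_times = []
    for t, i_t in enumerate(input_current):
        v += (-(v - v_rest) + i_t) * (dt / tau)
        if v >= v_thresh:
            spike_times.append(t)  # information is carried by spike timing
            v = v_reset
    return spike_times

# A constant drive produces a regular spike train; the rate encodes the input.
drive = np.full(200, 1.5)
print(lif_neuron(drive))
```

With the assumed parameters the neuron fires at a regular interval; a stronger drive shortens that interval, so an analog input is re-expressed as spike timing.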

Why is the time ripe now?

In my opinion, there are three reasons why the time is now ripe to begin drawing inspiration from the structure, dynamics, function, and behavior of the brain for developing novel computing architectures and cognitive systems. First, neuroscience seems to have matured, and enough quantitative data is available for formulating hypotheses of brain function and dynamics. Second, supercomputing is now ready to undertake extremely large-scale simulations. Third, nanotechnology is evolving to the point that we may be able to represent the essential computational function of synapses and neurons in hardware that rivals the brain’s power and space efficiency.

Taking an engineering mindset, the time is now perfect to begin building cognitive computing chips. The very process of building will bring us face-to-face with the unknowns and, with them, the potential for forward progress.

What is von Neumann Architecture? Why is brain different?

The key element of the von Neumann architecture is the separation of memory and computation. This leads to the von Neumann bottleneck, a term coined by John Backus (who was an IBMer) in his 1977 Turing Award lecture:

“Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it.”

In contrast, in the brain, memory and computation are distributed throughout the fabric, and there is no von Neumann bottleneck.  It is an entirely different paradigm for computing, and we, as a civilization, have not yet tapped into this secret of mother nature.
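
One crude way to see what is at stake, using my own illustrative numbers rather than anything from the SyNAPSE program: compare how many synaptic weights a clock-driven, dense update must read each timestep with an event-driven update that only touches the synapses of neurons that actually spiked.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 10_000              # neurons (illustrative)
fanout = 100            # synapses per neuron (illustrative)
active_fraction = 0.01  # fraction of neurons spiking per timestep (illustrative)

# Clock-driven, dense update: every timestep reads the full n-by-n weight matrix.
dense_weights_read = n * n

# Event-driven update: only the outgoing synapses of spiking neurons are read.
spiking = rng.random(n) < active_fraction
event_weights_read = int(spiking.sum()) * fanout

print(f"dense update reads        {dense_weights_read:>12,} weights per step")
print(f"event-driven update reads {event_weights_read:>12,} weights per step")
# With sparse activity and sparse connectivity, the event-driven style moves far
# less data, one motivation for hardware that co-locates memory (synapses)
# with computation (neurons).
```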

Can you tell us about your rat-scale simulation?

Please see the actual paper here and a detailed discussion here. Please note: it is not a rat brain!

What happens if you succeed?

If we succeed, then we will be able to give birth to novel cognitive systems, computing architectures, programming paradigms, and numerous practical applications, and perhaps to entirely new industries. The IBM press release captures it nicely: “The end goal: ubiquitously deployed computers imbued with a new intelligence that can integrate information from a variety of sensors and sources, deal with ambiguity, respond in a context-dependent way, learn over time and carry out pattern recognition to solve difficult problems based on perception, action and cognition in complex, real-world environments.”

How soon?

While the possibilities are exciting, we are just beginning; the path is long, arduous, and uncertain, and achieving the end goal will require crossing many hurdles and making numerous technical breakthroughs. Many challenges will become apparent only as we build it. Stay tuned, and help us get there by joining the field: a number of exciting inventions and discoveries await the interested and prepared mind.

The end goal is inevitable, but its timing is unpredictable.

How do you feel?

I feel a profound sense of gratitude to DARPA for its vision and for choosing our proposal, to IBM for its depth, breadth, and innovative culture, and to my fellow team members for their collaboration. I also feel a deep sense of optimism about our ability to innovate. At the same time, I feel a palpable sense of responsibility to seize this opportunity and make meaningful forward progress, given the obvious time and money constraints.

Personal Cognitive Computing Milestones:

In September 2005, a proposal to organize the 2006 Almaden Institute around Cognitive Computing was selected by Dr. Mark Dean, who then headed IBM’s Almaden Research Center, and by four department heads (Dr. Dilip Kandlur, Dr. Laura Haas, Dr. Gian-Luca Bona, and Dr. Jim Spohrer).

On March 31, 2006, a grand challenge proposal to start a project on Cognitive Computing was funded by IBM Research.

On May 10-11, 2006, a very successful Almaden Institute on Cognitive Computing took place. Please see here and here.

On August 16, 2006, the Cognitive Computing group was organized, and I became its manager.

On April 27, 2007, BBC News reported my group’s work on a half-mouse-scale simulation in near real-time.

On May 2-3, 2007, I co-chaired Cognitive Computing 2008 at UC Berkeley.

On May 20-21, 2007, I spoke at the Decade of the Mind Symposium organized by Jim Olds and James Albus.

On July 17, 2007, PC Magazine carried a cover story on Cognitive Computing.

In September 2007, I co-authored a letter in Science calling for a Decade of the Mind Initiative, along with truly distinguished colleagues who also spoke at the Decade of the Mind Symposium.

On November 10-16, 2007, we presented our work on the rat-scale simulation at the Supercomputing 2007 conference. See here.

Filed Under: Accomplishments, Brain-inspired Computing, Press

NAE Grand Challenges for Engineering: Moving to Action

November 14, 2008 By dmodha

The National Academy of Engineering’s Grand Challenges for Engineering project "is designed to spark public discussion and awareness that engineering is essential to addressing current and emerging societal issues. Meeting the challenges will require the support of the public and policy makers. As we stand about a month before the U.S. presidential election, we hope to inspire an informed conversation about the hurdles of public backing and policy issues that stand in the way of addressing the Grand Challenges for Engineering."

On Oct 6, 2008, they hosted a very interesting event that is available via podcast.

Needless to say, one of the grand challenges is: "Reverse-engineer the brain".

Filed Under: Brain-inspired Computing
