Clean baseline implementation of PPO using an episodic TransformerXL memory
A minimalist MVP demonstrating that aligning AI memory with human episodic-memory granularity lets simple methods rival complex memory frameworks on conversational tasks.
Code for paper "Episodic Memory Deep Q-Networks" (https://arxiv.org/abs/1805.07603), IJCAI 2018
Lu, Q., Hasson, U., & Norman, K. A. (2022). A neural network model of when to retrieve and encode episodic memories. eLife
A Python (PyTorch) implementation of a memory-augmented neural network based on Ritter et al. (2018), "Been There, Done That: Meta-Learning with Episodic Recall," ICML.
PyTorch implementation of Episodic Meta Reinforcement Learning on variants of the "Two-Step" task. Reproduces the results of three papers; see the README for details.
NaQ: Leveraging Narrations as Queries to Supervise Episodic Memory. CVPR 2023.
Psifr: Analysis and visualization of free recall data
A machine with human-like memory systems.
Continual Learning methods using Episodic Memory (CLEM) in PyTorch
Code for exploring how we distribute our thoughts over time when we remember, using data from a naturalistic memory experiment.
The OpenAI-Gym-compatible Room environment.
Memory-centric architecture for scalable, energy-efficient AI agents and long-term cognition.
Memory that enables AI agents to learn from past failures without weight updates
A research-oriented Python framework implementing the DREAM architecture for long-term episodic memory in AI agents.
🧠 Proprietary Multimodal Cognitive Architecture - SpikingBrain + Long-VITA + Whisper with dynamic context, fusion layers, hybrid memory, and adaptive personality. Not a chatbot. A cognitive mind.
The content of this repository pertains to the Computational Intelligence course at the Polytechnic University of Turin, academic year 2023/2024.
Official implementation of the paper "Linking In-context Learning in Transformers to Human Episodic Memory" by Li Ji-An, Corey Zhou, Marcus Benna, and Marcelo Mattar
A local-first cognitive architecture for AI agents separating Episodic (events) from Semantic (facts) memory. Features provenance tracking, defense-in-depth LLM sanitization, and multilingual support via Qwen-2.5 + BGE-M3.
Pepper implementation of the cognitive architecture for Trust and Theory of Mind in humanoid robots applied to Vanderbilt's experiment.