Eve 2 family of models: 272M-parameter MoE IT Specialist Agents
Anthony Maio
anthonym21
AI & ML interests
- Interpretability-oriented analysis of model behavior/welfare
- Heterogeneous model swarms for scalable oversight / automated red teaming
- Evaluation of internal consistency and failure modes
- Bio-mimetic memory and state management for long-running LLM agents
- Tooling to make agent behavior measurable and auditable
Recent Activity
- Published an article about 12 hours ago: CoDA-GQA-L: Bounded-Memory Differential Attention with Value-Routed Landmark Banks
- Updated a model about 17 hours ago: anthonym21/Eve-2-MoE-272M
- Updated a model 2 days ago: anthonym21/Eve-2-MoE-IT-272M