This guide explains how to migrate OpenClaw agents from Anthropic's Claude models to open-source alternatives hosted on Hugging Face or running locally.
EVA is a new end-to-end evaluation framework for conversational voice agents that jointly measures task accuracy and conversational experience.
This post presents a pipeline to fine-tune a domain-specific embedding model in under a day on a single GPU without manual labeling.
IBM Research releases Mellea 0.4.0 and three Granite Libraries for building structured, verifiable, and safety-aware AI workflows on IBM Granite models.
This post examines how the open source AI ecosystem on Hugging Face evolved across competition, geography, and emerging communities through Spring 2026.
H Company releases Holotron-12B, a multimodal computer-use agent model post-trained from NVIDIA's Nemotron-Nano-2 VL, optimized for high-throughput production inference.
This post surveys 16 open-source reinforcement learning libraries to understand how they implement asynchronous training architectures that decouple inference from training.
Hugging Face introduces Storage Buckets, a mutable S3-like object storage system on the Hub designed for intermediate ML artifacts such as checkpoints, optimizer states, and processed datasets.
LeRobot v0.5.0 is the largest release to date, simultaneously expanding hardware support, policies, datasets, and the codebase itself.
This post explains Ulysses Sequence Parallelism (SP), a technique from Snowflake AI Research for training LLMs on sequences of up to millions of tokens by distributing attention computation across GPUs.
This article presents NXP's best practices for deploying Vision-Language-Action (VLA) models on the i.MX 95 embedded SoC, covering dataset recording, fine-tuning, and on-device optimization.
Modular Diffusers introduces a composable block-based approach to building diffusion pipelines, replacing monolithic pipeline classes with reusable, swappable components.