Open-source browser automation for AI agents. Describe what you need in plain English - get a working scraper in your codebase. Run locally or scale with cloud.
Tell your agent in plain language: what site, what data, what actions. No selectors, no config files.
A real browser session launches. The agent navigates, finds elements, and builds smart selectors that work across pages, not brittle XPaths.
A production-ready scraper is generated directly in your project. Run it again anytime, or edit it like any other code.
Generated scrapers are real code files. Import them, run them, version them, modify them.
import { steer } from "opensteer";
const scraper = await steer("hn-top-posts");
const posts = await scraper.run();
// -> [{ title: "Show HN: ...", points: 342 }, ...]

Selectors are cached intelligently rather than stored as brittle XPaths, so they keep working across different pages on the same domain.
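The domain-scoped selector reuse described here can be sketched as a small cache keyed by hostname and field. The names below (`SelectorCache`, `remember`, `lookup`) are illustrative only, not the opensteer internals:

```typescript
// Hypothetical sketch of domain-scoped selector caching.
// A selector learned on one page is keyed by (hostname, field),
// so any other page on the same domain can reuse it.
class SelectorCache {
  private cache = new Map<string, string>();

  private key(url: string, field: string): string {
    return `${new URL(url).hostname}::${field}`;
  }

  remember(url: string, field: string, selector: string): void {
    this.cache.set(this.key(url, field), selector);
  }

  lookup(url: string, field: string): string | undefined {
    return this.cache.get(this.key(url, field));
  }
}

const cache = new SelectorCache();
cache.remember("https://news.ycombinator.com/news", "title", "span.titleline > a");

// A different page on the same domain hits the cached selector...
console.log(cache.lookup("https://news.ycombinator.com/item?id=1", "title")); // "span.titleline > a"
// ...while another domain misses.
console.log(cache.lookup("https://example.com/", "title")); // undefined
```

Keying by hostname instead of full URL is what lets a scraper built on one listing page carry over to detail pages on the same site.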
Watch your agent navigate live. See exactly what it sees as it interacts with pages.
Everything runs locally with the OSS package. Or use cloud infra when you need to scale up.
Cloud infrastructure handles CAPTCHAs and bot detection out of the box. No proxy juggling.
Scrapers built on one page work on similar pages within the same domain automatically.
Built for AI agents from the ground up. Works with any LLM framework. Give your agent browser tools.
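A browser tool handed to an agent is typically a name, a JSON-Schema parameter description, and a handler. The framework-agnostic sketch below shows that common shape; none of these names are the opensteer API, and the handler is stubbed where a real one would drive the browser session:

```typescript
// Hypothetical sketch of exposing a browser action as an agent tool.
// The name / description / parameters / handler shape is the pattern
// shared by most LLM tool-calling frameworks.
interface AgentTool {
  name: string;
  description: string;
  parameters: Record<string, unknown>; // JSON Schema the LLM fills in
  handler: (args: Record<string, unknown>) => Promise<string>;
}

const navigateTool: AgentTool = {
  name: "browser_navigate",
  description: "Open a URL in the live browser session and return a status string.",
  parameters: {
    type: "object",
    properties: { url: { type: "string", description: "Absolute URL to open" } },
    required: ["url"],
  },
  // Stubbed for illustration; a real handler would navigate the session.
  handler: async (args) => `navigated to ${args.url}`,
};

// The agent loop picks a tool by name and invokes its handler:
navigateTool.handler({ url: "https://news.ycombinator.com" }).then(console.log);
// -> "navigated to https://news.ycombinator.com"
```

Because the tool is just a typed object, the same definition can be registered with whichever LLM framework the agent runs on.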
The core engine is MIT licensed. Run everything locally on your machine; no data leaves your environment unless you opt into the cloud.
steerlabs/opensteer