Scrape once. Know when it changes.
I built Meter because I was tired of re-scraping pages that hadn't moved—and paying for it every time.
Describe what you want in plain English. We handle antibot, proxies, retries, and all the infrastructure you'd rather not build. When real content changes—not ads, not timestamps, not layout noise—you get a webhook.
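To make the webhook idea concrete, here is a minimal sketch of a handler that decides whether a notification is worth re-processing. The payload shape (`changed`, `fields`) is hypothetical, invented for illustration; Meter's actual webhook schema is not shown in this post.

```python
import json

# Hypothetical payload shape -- Meter's real webhook schema may differ.
EXAMPLE_PAYLOAD = json.dumps({
    "page_url": "https://example.com/jobs",
    "changed": True,
    "fields": {"title": "Senior Engineer", "salary": "$180k"},
})

def handle_change_webhook(raw_body: str) -> list[str]:
    """Parse a change notification and return the fields worth re-processing.

    Acting only on real changes is what makes selective re-embedding
    possible: untouched pages never reach the expensive pipeline.
    """
    event = json.loads(raw_body)
    if not event.get("changed"):
        return []  # layout noise / no real change: skip re-processing
    return sorted(event.get("fields", {}))

print(handle_change_webhook(EXAMPLE_PAYLOAD))  # -> ['salary', 'title']
```

A real receiver would sit behind an HTTP endpoint, but the filtering logic is the part that saves money.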
Teams use it to monitor job boards, track competitor pricing, and more. One team cut embedding costs by 95% by re-processing only what's new.

meter: Monitor changes on any website's data without infra overhead
McKinnon started a discussion
What type of use cases are you using traditional web scrapers for today?
I'd love to learn more about the different scraping workflows. Our current users are doing everything from VC research and legal analysis to job searching.
McKinnon left a comment
I've written thousands of scrapers for different use cases, and I always wanted a tool that could do most of the heavy lifting for me. The current tooling is too expensive at scale, and I wanted a tool that was pattern-based. Most scraping use cases apply the same pattern across many pages within a site, and meter takes advantage of that. As an example, I used to extract forum posts...
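The "same pattern across many pages" idea can be sketched in plain Python: one field-to-selector rule set applied to every page of a site. The pages and selectors below are made up for illustration, and the pages are pre-parsed into `{selector: text}` maps to keep the sketch dependency-free; a real scraper would run CSS selectors over fetched HTML.

```python
# One extraction pattern reused across many pages of the same site.
# Field names and selectors here are illustrative, not a real site's.
PATTERN = {
    "title": "h1",
    "author": ".post-author",
}

def extract(page: dict[str, str], pattern: dict[str, str]) -> dict[str, str]:
    """Apply the same field->selector pattern to one pre-parsed page."""
    return {field: page.get(selector, "") for field, selector in pattern.items()}

# Two pages of the same site, already parsed into selector -> text maps.
pages = [
    {"h1": "Post one", ".post-author": "alice"},
    {"h1": "Post two", ".post-author": "bob"},
]
results = [extract(p, PATTERN) for p in pages]
print(results)
```

Because the pattern is defined once and mapped over every page, adding a thousand more pages costs no extra authoring work, which is the economy the comment describes.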
