The web scraping and automation platform

Apifier extracts data from websites, crawls lists of URLs and automates workflows on the web. Turn any website into an API in a few minutes!

See our relaunch as Apify: https://www.producthunt.com/posts/apify-1

2 reviews · 5.0/5
@passingnotes Hey David, thanks! We're considering a model similar to GitHub's: all free accounts have to make their crawlers public, and only paid accounts can keep them private. Do you think people would be okay with that?
@jancurn I think people creating certain crawlers will be fine sharing, but may want anonymity for community sharing
nice! would love to see a community driven collection of custom crawlers (to simply recycle or emulate)
Apifier is a web scraper that extracts structured data from any website using a few simple lines of JavaScript. For example, imagine you found a website selling shoes and want a spreadsheet with all the shoe sizes, colors, prices, etc. You could build such a spreadsheet manually with copy and paste, but that would cost you a lot of time and frustration. Or you could set up Apifier to do it for you in a few seconds. Apifier is a startup launched from the YC Fellowship.
Hi Hunters, we’re Jan and Jakub, the makers of Apifier. About a year ago we looked for a web scraper for one of our consulting projects, but none of the existing ones actually worked for the websites we needed. So we decided to build a new one. Unlike point-and-click web scrapers, Apifier doesn't run into trouble with complicated or dynamic websites, and you don't need to learn a new user interface to get started - you define your scraper using the same JavaScript you already use for front-end web development. We’re looking forward to hearing what you think! We’ll be around to answer any questions.
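To make the idea concrete, here is a minimal sketch of what a JavaScript scraping function could look like for the shoe-store example above. The function name, the jQuery-style `$` argument, and the CSS selectors are all illustrative assumptions, not Apifier's exact API:

```javascript
// Hypothetical page function in the spirit of Apifier's approach.
// It is assumed to run in the context of the crawled page, with a
// jQuery-like selector function `$` injected by the crawler, and to
// return one record per product found on the page.
function pageFunction($) {
  var results = [];
  $('.product').each(function () {
    var el = $(this);
    results.push({
      title: el.find('.title').text().trim(),
      price: parsePrice(el.find('.price').text()),
      sizes: el.find('.size').map(function () {
        return $(this).text().trim();
      }).get()
    });
  });
  return results;
}

// Small helper: turn a price label like "$59.90" or "59,90" into a number.
function parsePrice(text) {
  var cleaned = text.replace(/[^0-9.,]/g, '').replace(',', '.');
  return parseFloat(cleaned);
}
```

The records returned this way would map naturally onto spreadsheet rows, which is the point of the copy-and-paste comparison: the same extraction logic runs once per page instead of once per manual visit.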
@jancurn hey guys! I love scrapers =} but the bigger issues of scraping "the websites we need" are around IPs/proxies. What are you doing on that front?
@gerbz, yeah absolutely, we're actually rotating a number of proxies. We can also arrange for people to use their own list of proxies if they want.
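For readers unfamiliar with the technique mentioned in the reply above, proxy rotation simply means cycling outgoing requests through a pool of proxy addresses so no single IP makes every request. A minimal round-robin sketch (illustrative only; the proxy URLs are placeholders and this is not Apifier's actual implementation):

```javascript
// Round-robin proxy rotation: each call returns the next proxy in the
// list, wrapping around when the end is reached.
function makeProxyRotator(proxies) {
  var i = 0;
  return function nextProxy() {
    var proxy = proxies[i % proxies.length];
    i += 1;
    return proxy;
  };
}

// Placeholder proxy addresses; a real pool would come from configuration.
var next = makeProxyRotator(['http://proxy1:8000', 'http://proxy2:8000']);
// next() would then be called to pick a proxy before each outgoing request.
```

A user-supplied proxy list, as offered in the comment, would just replace the placeholder array here.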