No more diving into complex DOM structures or writing fragile XPath expressions. Just describe the data you want in a natural-language-like query, and AgentQL handles the rest.
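For example, instead of an XPath tied to a specific class name or DOM position, a query names the data you're after. A rough sketch of the idea (the field names here are illustrative, not taken from any real page):

```
{
    products[] {
        product_name
        price
        rating
    }
}
```

AgentQL resolves each term against the live page, which is why the same query can keep working when class names or layout shift.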
Behind the sleek interface of this newly launched AI product is a robust support system powered by advanced language models. It combines simplicity with cutting-edge technology, offering both ease of use and strong performance.
Excited to see AgentQL tackling such a significant challenge in web automation! The ability to use natural language for element identification could really streamline workflows. Kudos to @shuhao_zhang and the team for this innovative approach. Can't wait to give it a try!
Let us know what you think! We hope we can make many developers' lives easier with an approach that's easy to stand up, easy to maintain, and still accurate!
@dance17219 Thanks for the support! We’re excited too, especially about how using natural language for element identification can streamline workflows. Kudos to the team indeed! We can’t wait for you to give it a try!
Wow, AgentQL looks like a game-changer for web automation! 🚀 As someone who's struggled with maintaining web scraping scripts, I'm really excited about the potential of using natural language commands instead of rigid selectors. The ability to adapt to UI changes automatically is huge.
I love how you've combined LLMs with DOM processing to create a more flexible and resilient solution. The use cases for this seem endless - from e-commerce price tracking to workflow automation.
Definitely going to check out the website and join the Discord. Can't wait to see how this evolves and potentially integrate it into my own projects. Great job, Shuhao and team! 👏
Hi Charles! It sure is. We hope AgentQL can take the weight of scraper maintenance off your shoulders. Try out our Quick Start guide and come chat about your use case in Discord!
@charles_ Thanks so much! We’re really excited to see the potential of AgentQL resonating with you. Making web automation more flexible and resilient with natural language commands is exactly what we’re aiming for. Looking forward to having you in the Discord and hearing how you integrate it into your projects. Appreciate the support!
That's actually spot on for what inspired us! Excited for you to give this a try, and definitely let us know what you think and how we can improve this!
@peter_papp Thanks! That’s a great way to think about it—AgentQL does bring some of that GraphQL-like flexibility to web content. We appreciate the pin, and we’re excited for you to try it out!
I am new to web scraping and front-end development. AgentQL seems very promising to me because it will lower the barrier to entry for effective and reliable web scraping. It will let newbies like me quickly set up and get the data we need without getting bogged down in micromanaging elements and edge cases.
Is the free tier just for testing or can you run a small personal app with it like a web scraper for stock/financial news and information?
Hi Alex, glad to hear you're considering it for your personal project! Using AgentQL for stock/financial news sounds like a great fit. The free tier is perfect for a personal assistant that runs dozens of times a day—it offers 1,200 free API calls per month, no credit card required. Feel free to give it a try, and we'd love to hear your feedback!
Yes, Alex, I believe it will help you spin up and test a data extraction endpoint in no time! As for the free tier, it depends heavily on your usage pattern—how many distinct requests you make to gather all the needed info. For example: can you grab all the stock info you need from one page, or do you have to navigate between pages and make multiple requests to assemble a single set of data? And how many times a day do you run the whole scenario? 1,200 API calls means 1,200 calls to the `get_by_prompt`, `query_elements`, or `query_data` methods in your running code.
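To make the call accounting concrete, here's a minimal sketch of what one scheduled run might look like with the Python SDK and Playwright. The URL and query fields are made up for illustration, and the exact setup may differ slightly from the Quick Start guide; the point is that each `query_data` call counts as one API call against the monthly quota.

```python
import agentql
from playwright.sync_api import sync_playwright

# Illustrative query: field names are hypothetical, not from a real site.
NEWS_QUERY = """
{
    news_items[] {
        headline
        ticker
        published_at
    }
}
"""

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = agentql.wrap(browser.new_page())   # wrap a Playwright page with AgentQL
    page.goto("https://example.com/markets")  # placeholder URL
    data = page.query_data(NEWS_QUERY)        # one API call toward the monthly quota
    print(data)
    browser.close()
```

Running a single-page scrape like this once an hour comes to roughly 720 calls a month, comfortably inside the free tier; navigating across several pages per run multiplies the call count accordingly.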