How Web Scraping Improved My Finances
How it helped me reduce unnecessary expenses
Date: 2024-10-10
Author: Javier Medina
Introduction
Web scraping is a powerful technique that has allowed me to automate data collection and make informed decisions when shopping. I built this project in JavaScript: Puppeteer scrapes supermarket deals in my area, Express exposes the data through an API, GitHub Actions handles the automation, and the backend is deployed on Azure. The frontend was built with Vite.js and deployed on Vercel.
This project not only helped improve my personal finances but also became a sellable idea for other consumers and businesses looking to optimize their expenses.
Project Details
1. Web Scraping with Puppeteer
My project starts with the automatic collection of daily deals from supermarkets like Jumbo, Carrefour, and Dia. I use Puppeteer, a tool that allows me to interact with web pages and extract the information I need, such as product prices and promotions.
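To give a sense of how this works, here is a minimal Puppeteer sketch. The URL and CSS selectors are placeholders, since every supermarket site is structured differently and needs its own selectors:

```javascript
// Minimal scraper sketch: the URL and selectors are illustrative,
// not the ones from my actual project.
const puppeteer = require('puppeteer');

async function scrapeDeals(url) {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle2' });

  // Pull the name and price out of each product card on the page.
  const deals = await page.$$eval('.product-card', (cards) =>
    cards.map((card) => ({
      name: card.querySelector('.product-name')?.textContent.trim(),
      price: card.querySelector('.product-price')?.textContent.trim(),
    }))
  );

  await browser.close();
  return deals;
}

scrapeDeals('https://example.com/deals').then(console.log);
```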
2. Backend with Express
All the scraped information is processed and exposed through an API created with Express. This API organizes the data and serves it in JSON format, making it easy to use for both my application and other services.
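As a rough sketch of that layer (the route name and the deals.json file standing in for the scrapers' output are assumptions, not my exact setup):

```javascript
// Minimal Express API sketch. deals.json stands in for wherever
// the scrapers actually store their results.
const express = require('express');
const fs = require('fs');

const app = express();

app.get('/api/deals', (req, res) => {
  // Serve the latest scraped deals as JSON.
  const deals = JSON.parse(fs.readFileSync('deals.json', 'utf8'));
  res.json(deals);
});

app.listen(3000, () => console.log('API listening on port 3000'));
```

Keeping the API as a thin JSON layer over the scraped data is what makes it reusable by other services, not just my own frontend.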
3. Automation with GitHub Actions
The magic of the project lies in automation. I use GitHub Actions to run my scrapers automatically every 12 hours. This way, I always have the most up-to-date information without having to do it manually.
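A 12-hour schedule like that can be expressed with a cron trigger in a workflow file. This is an illustrative sketch; the file name, Node version, and scraper entry point are assumptions:

```yaml
# .github/workflows/scrape.yml (illustrative; the script name and
# Node version are assumptions, not my exact configuration)
name: Run scrapers
on:
  schedule:
    - cron: '0 */12 * * *'   # every 12 hours
  workflow_dispatch:          # also allow manual runs
jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: node scraper.js
```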
4. Deployment on Azure
I deploy the backend on Azure, which provides stability and scalability at very low cost thanks to the free credits offered to students. Hosting it in the cloud keeps the API available around the clock.
5. Frontend with Vite.js and Vercel
For the frontend, I chose Vite.js for its speed and simplicity. The interface is deployed on Vercel, which gives me a lightweight, fast application for browsing the deals.
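On the client side, consuming the API comes down to a single fetch. A minimal sketch, assuming the /api/deals endpoint from above and a #deals list element in the page:

```javascript
// Frontend sketch: fetch the deals from the API and render them.
// The endpoint URL and the #deals element are assumptions.
async function loadDeals() {
  const response = await fetch('https://my-api.example.com/api/deals');
  const deals = await response.json();

  // Render each deal as a list item.
  const list = document.querySelector('#deals');
  list.innerHTML = deals
    .map((deal) => `<li>${deal.name}: ${deal.price}</li>`)
    .join('');
}

loadDeals();
```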
How It Improved My Finances
Thanks to this project, I can see the deals available at the supermarkets in my area, refreshed every 12 hours. This has allowed me to:
- Compare prices quickly and efficiently.
- Make smart decisions about where to shop to save money.
- Avoid unnecessary trips, as I know exactly which supermarket to go to.
Additionally, this idea has the potential to scale. I have considered offering this service to small businesses that want to optimize their purchases, or even to consumers looking to save on their daily shopping. What started as a personal tool has become an interesting value proposition for others. If you want to see more details about my project, you can check out my GitHub repository.
Conclusion
Web scraping is a powerful tool not only for automating tasks but also for making more informed decisions in daily life. My project has improved my finances and opened doors to new opportunities to sell this solution to others.