If anyone can do this for me for a smaller sum, including delivering the source code, hit me up. Scrape 5 category pages and grab the usual product data.
Yeah, it can run on any server. I run some Selenium stuff on Linode. Just set up a normal server, install what you need, and it runs there just like on your own computer. I've figured it out on my Mac so far using Selenium and BS4; works fine. Not sure how it works on PythonAnywhere.
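For anyone new to this, the parse step is the same wherever it runs. Here's a minimal sketch of that step using only the standard library's html.parser (standing in for BS4 so it runs with no installs); the product markup and class names are made up for illustration:

```python
from html.parser import HTMLParser

# Pull product titles out of category-page HTML.
# Real pages will use different tags/class names -- adjust to match.
class ProductTitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.titles = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "h2" and ("class", "product-title") in attrs:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.titles.append(data.strip())

# Canned HTML standing in for a fetched category page.
page = """
<div class="product"><h2 class="product-title">Widget A</h2><span class="price">9.99</span></div>
<div class="product"><h2 class="product-title">Widget B</h2><span class="price">19.99</span></div>
"""

parser = ProductTitleParser()
parser.feed(page)
print(parser.titles)  # prints ['Widget A', 'Widget B']
```

With BS4 the same extraction is a one-liner (`soup.select("h2.product-title")`), but the fetch-then-parse shape is identical on a laptop or a Linode box.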
Yeah, I've recently started using Puppeteer. It's quick and easy to learn: you just need some basic JavaScript knowledge and you're good to go. I did it all in Visual Studio Code. Why use Python for this?
You'd make the job far simpler by using Node and Puppeteer for this.
If you're scraping, chances are you have at least a moderate understanding of JavaScript. It's not too hard to learn Node, or at least to get to the stage where you can run a Puppeteer client.
I suggest looking at the underlying page source for JSON data stores, or at the HTTP requests the page makes to find the API endpoints. Usually you can skip the entire browser-automation stage, which is brittle and has a high maintenance cost.
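To make the first half of that concrete: many sites ship their product data as a JSON blob embedded in the page (for example a `<script type="application/ld+json">` block). A minimal sketch, with a canned page and made-up product data standing in for a real response:

```python
import json
import re

# Canned page source standing in for the result of a plain HTTP GET --
# no browser automation needed, because the data is already in the HTML.
page = """
<html><body>
<script type="application/ld+json">
{"products": [{"name": "Widget A", "price": 9.99},
              {"name": "Widget B", "price": 19.99}]}
</script>
</body></html>
"""

# Cut the JSON blob out of the HTML and parse it.
match = re.search(
    r'<script type="application/ld\+json">\s*(\{.*?\})\s*</script>',
    page, re.DOTALL,
)
data = json.loads(match.group(1))
print([p["name"] for p in data["products"]])  # prints ['Widget A', 'Widget B']
```

For the second approach, open the browser dev tools' Network tab, find the XHR/fetch request that returns the product JSON, and call that endpoint directly; you then skip HTML parsing entirely.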
There's actually the IMPORTXML function that lets you scrape web pages using XPath, returning the results as arrays. You can then use array functions to clean the data. A super useful tool, really nice for prototyping. Plus, the requests are made by Google, so you don't have to worry about IPs or proxies. If you want it for Google Sheets, I can share a bunch of code.
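For reference, the call shape in Google Sheets is IMPORTXML(url, xpath_query); the URL and XPath below are placeholders, so swap in the real page and selectors:

```
=IMPORTXML("https://example.com/products", "//h2[@class='product-title']")
```

Each XPath match spills into its own row, which is what makes the array functions (FILTER, ARRAYFORMULA, etc.) handy for cleanup.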
Yes, though I like to use Google Apps Script. It allows you to set headers and also manipulate the response.