Google Colab
All you basically need is either a Google account to access Google Colab or a local installation of Jupyter Notebook. I will use Google Colab because I want to be able to work on my code from multiple devices - that is just way easier on a hosted platform.
Saving is automatically routed to your Google Drive, and mounting the drive in your code is a no-brainer - one huge advantage of hosting simple projects on Colab / GDrive.
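For reference, mounting works like this (the runtime prompts you to authorize access once):

```python
# Mount Google Drive inside the Colab runtime (asks for authorization once)
from google.colab import drive

drive.mount('/content/drive')
# Everything in your Drive is now reachable under /content/drive/MyDrive/
```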
Let's dive right into my use case and I'll explain my small project. My initial goal was to get detailed information about football results from the Swiss Super League. The first major task was to find a good page to scrape data from. transfermarkt would have been my favorite pick, but fbref is way easier to scrape because its page structure is less complex.
The code should be pretty straightforward - I commented where comments are needed. First we need to import a few things: numpy for mathematical functions, BeautifulSoup for parsing HTML, and of course pandas for all the data analysis. Mounting Google Drive is also essential for saving the results.
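A minimal version of that import block could look like this - note that I'm also assuming requests for fetching the pages, since BeautifulSoup only parses HTML it is given:

```python
import numpy as np              # mathematical functions
import pandas as pd             # dataframes for the data analysis
import requests                 # fetch the HTML pages (assumption, see above)
from bs4 import BeautifulSoup   # parse the fetched HTML

from google.colab import drive  # only available inside a Colab runtime
drive.mount('/content/drive')   # make Google Drive accessible for saving
```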
A special thanks to the Dataquest YouTube tutorial, which helped a lot in finding simple solutions to my problems!
Initial steps
Scraping the data
After having checked the page, the plan was quite simple: go to the most recent overview page and scrape the team URLs and their detailed data, then jump to previous seasons and redo the steps until you have enough season data. The outer loop is needed for scraping the team URLs and the inner loop scrapes the detailed team stats. Sounds fairly easy, doesn't it? A sketch of that structure follows below.
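Since the full notebook isn't reproduced here, the following is only a sketch of that two-loop structure, not the exact code: the competition URL, the CSS selectors, and the number of seasons are assumptions you would verify against the live page before running it.

```python
from io import StringIO
import time

import requests
import pandas as pd
from bs4 import BeautifulSoup

BASE = 'https://fbref.com'
# Assumed starting point - check the actual Super League overview URL on fbref
standings_url = f'{BASE}/en/comps/57/Super-League-Stats'

all_matches = []
for season in range(3):  # outer loop: one iteration per season
    soup = BeautifulSoup(requests.get(standings_url).text, 'html.parser')

    # Scrape the team URLs from the league table on the overview page
    table = soup.select_one('table.stats_table')
    team_urls = [BASE + a['href'] for a in table.select('a[href*="/squads/"]')]

    for team_url in team_urls:  # inner loop: detailed stats per team
        team_name = team_url.split('/')[-1].replace('-Stats', '')
        html = requests.get(team_url).text
        # 'Scores & Fixtures' is the match-level table on a squad page
        matches = pd.read_html(StringIO(html), match='Scores & Fixtures')[0]
        matches['team'] = team_name
        all_matches.append(matches)
        time.sleep(3)  # be polite to the server between requests

    # Jump to the previous season via the navigation link and redo the steps
    standings_url = BASE + soup.select_one('a.prev')['href']
```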
In the end, we just need to concatenate our dataframes and export the result to our Google Drive as a CSV (OK, I don't like upper-case column headers, which is why I lower-case everything before the actual export).
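Continuing the sketch above, the final step could look like this (the Drive path is just an example):

```python
# Combine the per-team dataframes collected in the loops into one table
df = pd.concat(all_matches, ignore_index=True)

# Lower-case the column headers before the actual export
df.columns = [c.lower() for c in df.columns]

# Write the result to Google Drive as a CSV (example path)
df.to_csv('/content/drive/MyDrive/super_league_matches.csv', index=False)
```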
Enjoy the code! Use it as a basis for your own project and improve it - let me know in the comments if it was useful.