
Writing And Saving Csv File From Scraping Data Using Python And Beautifulsoup4

I am trying to scrape data from the PGA.com website to get a table of all of the golf courses in the United States. In my CSV table I want to include the Name of the golf course, A

Solution 1:

All you really need to do here is put your output in a list and then use the csv library to export it. I'm not entirely clear on what you are getting out of views-field-nothing-1, but to just focus on view-fields-nothing, you could do something like:

courses_list = []

for item in g_data2:
    # Leave a field blank if it can't be found in this row.
    try:
        name = item.contents[1].find_all("div", {"class": "views-field-title"})[0].text
    except:
        name = ''
    try:
        address1 = item.contents[1].find_all("div", {"class": "views-field-address"})[0].text
    except:
        address1 = ''
    try:
        address2 = item.contents[1].find_all("div", {"class": "views-field-city-state-zip"})[0].text
    except:
        address2 = ''

    course = [name, address1, address2]
    courses_list.append(course)
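
For context, g_data2 above is assumed to come from the asker's existing scraping code. A minimal sketch of how such a list might be built with requests and BeautifulSoup is shown below; the URL and the views-row class name are assumptions for illustration, not taken from the original question:

import requests
from bs4 import BeautifulSoup

# Hypothetical URL and container class -- adjust both to match the actual page markup.
response = requests.get("https://www.pga.com/golf-courses/search")
soup = BeautifulSoup(response.text, "html.parser")
g_data2 = soup.find_all("div", {"class": "views-row"})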

The loop above will put the courses in a list; next you can write them to a csv like so:

import csv

with open('filename.csv', 'w', newline='') as file:
    writer = csv.writer(file)
    for row in courses_list:
        writer.writerow(row)
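
If you also want column headers in the file, you can write a single header row before the loop. The header names here are just examples matching the three fields collected above:

import csv

with open('filename.csv', 'w', newline='') as file:
    writer = csv.writer(file)
    # Example header names -- rename them to suit your table.
    writer.writerow(["Name", "Address", "City/State/Zip"])
    for row in courses_list:
        writer.writerow(row)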

Solution 2:

First of all, you want to put all of your items in a list and then write them to a file later, in case there is an error while you are scraping. Instead of printing, just append to a list. Then you can write to a csv file:

import csv

f = open('filename.csv', 'w', newline='')
csv_writer = csv.writer(f)
for i in main_list:
    csv_writer.writerow(i)
f.close()
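
As in Solution 1, a with block is a slightly safer variant of the same idea, since the file is closed automatically even if an error occurs mid-write:

import csv

with open('filename.csv', 'w', newline='') as f:
    csv_writer = csv.writer(f)
    for i in main_list:
        csv_writer.writerow(i)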
