Google Map Crawler

Hello folks! I made a Google Maps crawling bot to list all the restaurants in NYC, and I'm happy to share it with you. I hope it makes your lives a little easier.

This example code crawls Google Maps for a list of restaurants in Staten Island, New York, and saves the results to a CSV file.

You will get the names and addresses of the restaurants. You can always substitute "restaurant" with any other type of place or business that you want to scrape.

The flow is like this: for each zip-code query, open Chrome, search Google, open the local results listing, page through the results, and save each name and address to the CSV file.

Here's the code:

Import the Selenium library, along with the time and csv modules from the standard library.

from selenium import webdriver
import time
import csv
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.common.by import By

Define a list of locations to search for (in the format "business type,zipcode,area").

# One search query per Staten Island zip code
# (note: naming this variable "list" shadows the Python built-in;
# kept here to match the loop below)
list = [
    'restaurant,10301,Staten Island, New York',
    'restaurant,10302,Staten Island, New York',
    'restaurant,10303,Staten Island, New York',
    'restaurant,10304,Staten Island, New York',
    'restaurant,10305,Staten Island, New York',
    'restaurant,10306,Staten Island, New York',
    'restaurant,10307,Staten Island, New York',
    'restaurant,10308,Staten Island, New York',
    'restaurant,10309,Staten Island, New York',
    'restaurant,10310,Staten Island, New York',
    'restaurant,10311,Staten Island, New York',
    'restaurant,10312,Staten Island, New York',
    'restaurant,10314,Staten Island, New York']
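If you'd rather not type each query by hand, the same list can be built from the zip codes alone. A small sketch (the zips and queries names are my own, not part of the script above):

```python
# Staten Island zip codes used above (10313 is skipped there as well)
zips = [10301, 10302, 10303, 10304, 10305, 10306, 10307,
        10308, 10309, 10310, 10311, 10312, 10314]

# Build one "business type,zipcode,area" query per zip code
queries = [f'restaurant,{z},Staten Island, New York' for z in zips]
print(queries[0])
```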

Create a CSV file to store the scraped data.

f = open(r"C:\path\to\save_your_csv_file.csv", 'w', encoding='UTF-8', newline='')
csvwriter = csv.writer(f)
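One caveat: the file handle stays open for the entire run, so a crash partway through can lose buffered rows. A safer variant (just a sketch, with a placeholder path and a header row I added) uses a with block so the file is always closed:

```python
import csv

# The path here is a placeholder; substitute your own
with open("staten_island_restaurants.csv", 'w', encoding='UTF-8', newline='') as f:
    csvwriter = csv.writer(f)
    csvwriter.writerow(["name", "address"])  # header row for easier reading later
    # ... the scraping loop would call csvwriter.writerow([title, addr]) here ...
```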

For each search query in the list, the script opens a new Chrome WebDriver window, navigates to the Google homepage, enters the query into the search bar, and presses Enter.

for i in list:
    # A fresh Chrome window per query
    driver = webdriver.Chrome()
    driver.implicitly_wait(0.5)
    driver.maximize_window()
    driver.get("https://www.google.com/")

    # Type the query into the search box and submit it
    elem = driver.find_element(By.NAME, "q")
    elem.clear()
    elem.send_keys(i)
    elem.send_keys(Keys.RETURN)

    # Click through to the full list of local place results
    driver.find_element(By.CSS_SELECTOR, ".wUrVib.OSrXXb").click()
    time.sleep(2)

The script loops through the pages of search results, extracts each result's business name and address, and writes them to the CSV file.

    # Page through the results for this query
    # (renamed the loop variable to "page" so it doesn't reuse the
    # outer query variable i)
    for page in range(999):
        time.sleep(3)

        # Each result card in the local listing
        stores = driver.find_elements(By.CSS_SELECTOR, "div.VkpGBb")
        for s in stores:
            try:
                title = s.find_element(By.CSS_SELECTOR, "span.OSrXXb").text
            except:
                title = "none available"
            try:
                addr = s.find_element(By.CSS_SELECTOR, ".rllt__details div:nth-of-type(3)").text
            except:
                addr = "none available"
            print(title, "/", addr)
            csvwriter.writerow([title, addr])

        # Move to the next page, or stop when there is none
        try:
            nextpage = driver.find_element(By.CSS_SELECTOR, "a#pnnext>.SJajHc.NVbCr")
            nextpage.click()
        except:
            print("Data Scraping Complete")
            break

    # Close this query's browser window before starting the next one
    driver.quit()

f.close()
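Since neighboring zip-code searches overlap, the same restaurant often shows up more than once in the finished file. A small post-processing sketch (dedupe_rows and the sample rows are my own illustration, not part of the scraper):

```python
def dedupe_rows(rows):
    """Drop exact duplicate [name, address] rows, preserving order."""
    seen = set()
    unique = []
    for row in rows:
        key = tuple(row)
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

# Example with rows shaped like the scraper's output
rows = [["Joe's Pizza", "123 Bay St"],
        ["Joe's Pizza", "123 Bay St"],
        ["Lee's Tavern", "60 Hancock St"]]
print(dedupe_rows(rows))
```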