Guides / Scrape Google Maps

How to Scrape Google Maps in 2026

Google Maps is a JavaScript single-page application with obfuscated CSS class names and dynamically loaded content. Business listings, ratings, and reviews all require full browser rendering to access. With Browser7, you get fully rendered Google Maps search results in a single API call.

What makes Google Maps hard to scrape

JavaScript single-page application

Google Maps is entirely JavaScript-rendered. A simple HTTP request returns no business data at all. You need a real browser that executes JavaScript, loads map tiles, and processes the internal API responses to see any listings.
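A quick way to tell a rendered page from an empty app shell is to check the fetched HTML for markers that only appear once the results have loaded. This is a rough heuristic, keyed to the generated class names used later in this guide; those names are Google's and may change:

```python
def looks_rendered(html: str) -> bool:
    """Return True if the HTML appears to contain rendered result cards."""
    # Rendered search results contain listing-card markup; an un-rendered
    # response is mostly bootstrap JavaScript with no card elements.
    markers = ("Nv2PK", "hfpxzc")
    return any(marker in html for marker in markers)

# A plain HTTP GET returns only the app shell, which fails the check:
assert not looks_rendered("<html><head><script>/* app shell */</script></head></html>")
# Rendered output containing result cards passes:
assert looks_rendered('<div class="Nv2PK"><a class="hfpxzc"></a></div>')
```

This kind of check is useful in a pipeline to catch renders that came back before the listings finished loading.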

Obfuscated CSS classes

Google uses generated, obfuscated class names that can change between deployments. Selectors like div.Nv2PK and a.hfpxzc work today but may shift. Using aria-label attributes and structural selectors provides more stability.
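To illustrate the difference, here is a toy comparison of the two approaches on a simplified result-card snippet. The markup is illustrative, and plain `re` is used here only to keep the sketch self-contained; the full examples later in this guide use BeautifulSoup:

```python
import re

# Simplified markup of one result card (illustrative, not Google's exact HTML)
card_html = '<div class="Nv2PK"><a class="hfpxzc" aria-label="Le Bernardin" href="#"></a></div>'

# Brittle: keyed to the generated class name, which can change per deployment
name_by_class = re.search(r'class="hfpxzc"[^>]*aria-label="([^"]+)"', card_html)

# More stable: keyed to the accessibility attribute alone, which survives
# class-name churn as long as Google keeps labeling result links
name_by_aria = re.search(r'<a[^>]*aria-label="([^"]+)"', card_html)

print(name_by_aria.group(1))  # Le Bernardin
```

When both selectors work, prefer the attribute-based one as the primary and keep the class-based one as a fallback.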

Dynamic loading and infinite scroll

Google Maps loads business listings dynamically as the user scrolls through the results panel. The initial render shows a limited set of results, with more loaded on demand. There is no traditional pagination with page numbers.

Scrape Google Maps business search

Browser7 handles proxy rotation, browser fingerprinting, CAPTCHA solving, and JavaScript rendering automatically. This example searches for restaurants in New York and returns the fully rendered HTML with business listings.

from browser7 import Browser7

client = Browser7(
    api_key="b7_your_api_key",
    base_url="https://ca-api.browser7.com/v1"
)

result = client.render(
    "https://www.google.com/maps/search/restaurants+in+new+york/",
    country_code="US",
)

print(result.html)

That is the complete code. No proxy configuration, no browser setup, no CAPTCHA handling logic. The response contains the fully rendered HTML of the Google Maps search results, including business names, ratings, and details.

Data you can extract

The rendered HTML contains all the data Google Maps shows to a real visitor in the search results panel. Common data points to extract:

Business details

  • Business name
  • Business category
  • Address
  • Phone number
  • Website URL

Ratings and reviews

  • Star rating (1-5)
  • Total review count
  • Price level ($, $$, $$$)
  • Review snippets
  • Popular times data

Location data

  • GPS coordinates
  • Google Maps place ID
  • Neighborhood
  • Distance from search center
  • Directions link

Operational info

  • Opening hours
  • Open/closed status
  • Service options (dine-in, delivery)
  • Accessibility features
  • Photos and thumbnails
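GPS coordinates are often recoverable from the result links themselves: Google Maps place URLs commonly embed them as `!3d<lat>!4d<lng>`. This is an observed URL pattern, not a documented API, so treat the sketch below as an assumption to verify against live pages (the href is illustrative):

```python
import re

# Hypothetical place link in the form observed on Google Maps result cards
href = "https://www.google.com/maps/place/Le+Bernardin/data=!3d40.7614327!4d-73.9818279"

# "!3d<lat>!4d<lng>" carries the pin coordinates when present
match = re.search(r"!3d(-?\d+\.\d+)!4d(-?\d+\.\d+)", href)
if match:
    lat, lng = float(match.group(1)), float(match.group(2))
    print(lat, lng)  # 40.7614327 -73.9818279
```

If the pattern is absent from a given href, fall back to other sources such as the page's embedded data.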

Complete example: render and parse business listings

Here is a complete example that renders a Google Maps search page and extracts structured data from the HTML using Python and BeautifulSoup, the standard HTML parsing library for the language.

from browser7 import Browser7
from bs4 import BeautifulSoup
import json

client = Browser7(
    api_key="b7_your_api_key",
    base_url="https://ca-api.browser7.com/v1"
)

result = client.render(
    "https://www.google.com/maps/search/restaurants+in+new+york/",
    country_code="US",
)

soup = BeautifulSoup(result.html, "html.parser")

businesses = []
for card in soup.select("div.Nv2PK"):  # each result card in the listings panel
    biz = {
        "name": None,
        "rating": None,
    }

    # The result link carries the business name in its aria-label attribute
    link = card.select_one("a.hfpxzc")
    if link:
        biz["name"] = link.get("aria-label", "").strip()

    # The star rating is a short numeric span somewhere in the card,
    # e.g. "4.7" (or "4,7" in locales that use a decimal comma)
    for span in card.select("span"):
        text = span.get_text(strip=True)
        if text and len(text) < 5:
            try:
                val = float(text.replace(",", "."))
                if 1 <= val <= 5:
                    biz["rating"] = text
                    break
            except ValueError:
                pass

    if biz["name"]:  # skip cards where no name could be extracted
        businesses.append(biz)

print(json.dumps(businesses[:5], indent=2))

CSS selectors may change if Google updates their page structure. Inspect the current page if any fields return null.
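One cheap safeguard is to fail loudly when the primary selector stops matching, so selector drift surfaces immediately instead of producing silently empty data. A minimal sketch; the helper name and threshold are assumptions, not part of any library:

```python
def check_selector_drift(cards: list, min_expected: int = 1) -> None:
    """Raise if the primary card selector matched fewer results than expected."""
    # A Google Maps search page normally renders multiple result cards;
    # zero matches almost always means the selector changed, not that
    # the search genuinely returned nothing.
    if len(cards) < min_expected:
        raise RuntimeError(
            "Primary selector matched too few result cards - "
            "inspect the live page and update the selectors."
        )

check_selector_drift(["card1", "card2"])  # passes silently
```

Call it right after `soup.select("div.Nv2PK")` so a broken selector stops the pipeline before empty records are written downstream.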

Sample output:

[
  {
    "name": "Le Bernardin",
    "rating": "4.7"
  },
  {
    "name": "Peter Luger Steak House",
    "rating": "4.4"
  },
  ...
]

Take a screenshot of search results

Capture Google Maps search results as images for location intelligence dashboards, competitive analysis reports, or tracking business listings over time.

import base64
from browser7 import Browser7

client = Browser7(
    api_key="b7_your_api_key",
    base_url="https://ca-api.browser7.com/v1"
)

result = client.render(
    "https://www.google.com/maps/search/restaurants+in+new+york/",
    country_code="US",
    block_images=False,
    include_screenshot=True,
    screenshot_full_page=True,
    screenshot_format="png"
)

# Save the screenshot
with open("googlemaps-search.png", "wb") as f:
    f.write(base64.b64decode(result.screenshot))

print("Screenshot saved")

What this costs

Every Google Maps page render costs $0.01 - the same as any other website. Residential proxies, JavaScript rendering, CAPTCHA solving, and screenshots are all included. There are no per-domain surcharges, no credit multipliers, and no bandwidth fees.

10,000 Google Maps search pages cost $100. You know this before you start, not after.

Try it yourself

100 free renders - enough to test Google Maps scraping with no payment required.