Listing & General Agents
This workflow automates data extraction from real estate listing websites using a two-agent approach.

This example workflow uses two agents:
- Listing Agent: Collects property URLs from search results pages
- General Agent: Scrapes each property URL and extracts detailed information
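Putting it together, the finished workflow is a single chain of nodes (using the node names from the steps below): When clicking 'Execute workflow' → Get Listing Agent Scraper → Get General Agent Scraper → Run listing agent scraper → Extract All Url → Loop Over Items → Run general agent scraper (looping back until every URL is processed) → Flatten Object → Append row in sheet → Send a message.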
Step 1: Set Up the Agents
Create and configure the Listing Agent and General Agent so each one is ready to perform its specific role in the workflow.
Set Up the Manual Trigger
- Add a Manual Trigger node called "When clicking 'Execute workflow'".
- This allows you to run the complete two-agent workflow on demand.
Load Listing Agent Configuration
- Add a Google Sheets node called "Get Listing Agent Scraper".
- Select the Read Rows or Lookup operation.
- Authenticate with your Google account.
- Select the Google Sheets file that stores your Listing Agent Scraper ID and target URL.
Note: If you have not yet created a spreadsheet containing scraper IDs and target URLs, refer to the Create a Scraper guide to configure your Google Sheets.
- This node reads the Listing Agent scraper ID and target URL from your sheet.
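Step 2 pulls the scraper ID and target URL using the field names listingScraperId and listingTargetUrl, so the sheet read here is assumed to use those column headers. A minimal sketch of the row this node returns (example values only; match the names to your own sheet):

```
{
  "listingScraperId": "12345",
  "listingTargetUrl": "https://www.example-realestate.com/search?city=austin"
}
```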
Load General Agent Configuration
- Add a second Google Sheets node called "Get General Agent Scraper".
- Connect it after the "Get Listing Agent Scraper" node.
- Select the Google Sheets file that stores your General Agent Scraper ID and target URL.
Note: If you have not yet created a spreadsheet containing scraper IDs and target URLs, refer to the Create a Scraper guide to configure your Google Sheets.
- This loads the General Agent scraper ID for extracting property details.
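Step 3 reads a field called generalScraperId from this node by name, so the sheet is assumed to have a column with that header; a target URL column is not needed there, because the General Agent receives its URLs from the Listing Agent's results. A minimal sketch of the row (example value only):

```
{
  "generalScraperId": "67890"
}
```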
Step 2: Run the Listing Agent
Execute the Listing Agent to scrape search results and extract property URLs.
Run the Listing Agent
- Add the MrScraper node called "Run listing agent scraper".
- Select Listing Agent as the operation.
- Configure using values from Google Sheets:
- Scraper ID: {{ $json.listingScraperId }}
- URL: {{ $json.listingTargetUrl }}
- Max Pages: Set how many result pages to scrape (e.g., 2-5)
- Timeout: 720 seconds
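For orientation, the extraction code in the next step assumes the listing response is shaped roughly like this, with one entry per scraped results page and a url on each listing (structure inferred from the parsing logic below; your actual payload may carry additional fields):

```
{
  "data": {
    "response": [
      {
        "data": {
          "data": [
            { "url": "https://www.example-realestate.com/property/101" },
            { "url": "https://www.example-realestate.com/property/102" }
          ]
        }
      }
    ]
  }
}
```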
Extract All Property URLs
- Add a Code node in Python called "Extract All Url".
- Parse the listing response to collect all property URLs:
# Collect every unique property URL from the scraped result pages
items = []
payload = _input.item.json
urls = set()

response = payload.get("data", {}).get("response") or []
for page in response:
    listings = page.get("data", {}).get("data") or []
    for listing in listings:
        url = listing.get("url")
        if isinstance(url, str) and url.strip():
            urls.add(url)

# Emit one n8n item per URL for Step 3 to loop over
for url in urls:
    items.append({"json": {"url": url}})

return items

Step 3: Process Property URLs with the General Agent
Execute the General Agent for each property URL to extract detailed information.
Loop Through Properties
- Add a Loop Over Items node.
- This processes the property URLs in batches, one batch at a time.
- Configure the batch size based on your needs.
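Each item entering the loop carries one property URL, matching the output of the "Extract All Url" node; with a batch size of 1, the General Agent runs once per property:

```
[
  { "json": { "url": "https://www.example-realestate.com/property/101" } },
  { "json": { "url": "https://www.example-realestate.com/property/102" } }
]
```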
Run the General Agent
- Add another MrScraper node inside the loop called "Run general agent scraper".
- Select General Agent as the operation.
- Configure using the scraper ID from Google Sheets:
- Scraper ID: {{ $('Get General Agent Scraper').item.json.generalScraperId }}
- URL: {{ $json.url }}
- This extracts detailed information from each property page.
- Connect this node back to the loop to continue processing.
Step 4: Export the Results
Send the scraped data to Google Sheets and notify via email.
Flatten the JSON Data
- Add a Code node in JavaScript called "Flatten Object".
- Convert nested JSON into a flat structure for easier export:
function flattenObject(obj, prefix = '', result = {}) {
  for (const key in obj) {
    const newKey = prefix ? `${prefix}_${key}` : key;
    const value = obj[key];
    if (value === null || value === undefined) {
      result[newKey] = null;
    } else if (Array.isArray(value)) {
      result[newKey] = value.length ? value.join(', ') : null;
    } else if (typeof value === 'object') {
      flattenObject(value, newKey, result);
    } else {
      result[newKey] = value;
    }
  }
  return result;
}

const items = $input.all();
return items.map(item => ({ json: flattenObject(item.json) }));

Save to Google Sheets
- Add a Google Sheets node called "Append row in sheet".
- Select the Append Row operation.
- Authenticate with your Google account.
- Select your destination spreadsheet and sheet (can be different from your configuration sheet).
- Map the flattened data fields to your columns.
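As an illustration of the mapping, a nested record like the one below (field names invented for the example) reaches this node flattened into simple columns, with nested keys joined by underscores and arrays turned into comma-separated strings:

```
// Before "Flatten Object" (illustrative General Agent output)
{ "price": { "amount": 250000, "currency": "USD" }, "features": ["garage", "garden"] }

// After "Flatten Object"
{ "price_amount": 250000, "price_currency": "USD", "features": "garage, garden" }
```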
Send Email Notification
- Add a Gmail node called "Send a message".
- Configure:
- To: Your email address
- Subject: "Property Scraping Complete"
- Message: Include a summary or a link to the spreadsheet (see the example after this list)
- This notifies you when scraping is finished.
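For example, the message body might simply say: "The property scraping workflow has finished. The results have been appended to your Google Sheet: <link to your spreadsheet>." Replace the placeholder with the URL of the destination spreadsheet from the previous step.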