Real Estate Technology

Updated almost 5 years ago · Most recent reply

Washington Data Scraping API - Anyone Else Doing This?

Bryce DeCora
  • Investor
  • Snohomish, WA
Posted

My original plan was to invest solely in Snohomish and King Counties, Washington. I wanted my custom CRM (built with Podio) to be able to auto-populate property data found on the county assessor site. That drove me to create an API endpoint, written in Python, that scrapes the data on demand. I've met others who build code like this and store the data in tables to be queried later, but then you're stuck with older data.
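The on-demand lookup described above might be sketched like this. Everything here is illustrative, not the real Snohomish County site: the URL pattern and the field IDs are made-up stand-ins, and `fetch` is left injectable so any HTTP client (urllib, requests) can slot in.

```python
import re

# Hypothetical assessor detail-page URL; NOT the real county site.
ASSESSOR_URL = "https://assessor.example.gov/parcel/{parcel_id}"

def parse_assessor_html(html):
    """Pull a few fields out of an assessor detail page with regexes.
    The id attributes are assumed; real pages usually warrant a
    proper HTML parser."""
    fields = {}
    for key in ("owner", "assessed_value", "year_built"):
        m = re.search(rf'id="{key}">([^<]+)<', html)
        if m:
            fields[key] = m.group(1).strip()
    return fields

def lookup(parcel_id, fetch):
    """Fetch and parse on demand, so the CRM always sees fresh data
    instead of a stale table. `fetch` is any callable url -> html."""
    return parse_assessor_html(fetch(ASSESSOR_URL.format(parcel_id=parcel_id)))
```

Wiring `lookup` behind a small web framework route (Flask, FastAPI) would give the Podio-facing endpoint; the trade-off versus stored tables is freshness in exchange for a slower lookup and a dependency on the assessor site being up.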

Is anyone else using property-scraping solutions on a large or small scale? As my company grows, I think I'll start using a company like ATTOM Data to provide property data so I don't have to keep coordinating the build and maintenance of scraping tools. Fellow nerds, let me know what you're up to!

Most Popular Reply

Gavin D.
  • South Carolina
Replied

Yep... I've got DBs full of assessor/auditor/recorder data. To keep from having older data, you use "INSERT ... ON DUPLICATE KEY UPDATE" queries and continually loop over all the source data. I created a threaded version of Rolling Curl in PHP that keeps about 150 curl threads running async on targets, which collects up to about 230 HTML pages per second... that pulls down every property in large counties like DeKalb pretty fast.
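The upsert pattern Gavin mentions can be sketched with Python's `sqlite3`; SQLite's `ON CONFLICT ... DO UPDATE` plays the same role as MySQL's `INSERT ... ON DUPLICATE KEY UPDATE`. The table and column names here are invented for illustration.

```python
import sqlite3

def make_db():
    # In-memory DB for the sketch; a real scraper would use a file
    # or a server-backed database.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE parcels (
        parcel_id       TEXT PRIMARY KEY,
        owner           TEXT,
        assessed_value  INTEGER,
        scraped_at      TEXT)""")
    return conn

def upsert_parcel(conn, parcel_id, owner, value, scraped_at):
    # Re-scraped rows overwrite the old copy instead of erroring or
    # duplicating, so looping over the sources keeps the table fresh.
    conn.execute("""
        INSERT INTO parcels (parcel_id, owner, assessed_value, scraped_at)
        VALUES (?, ?, ?, ?)
        ON CONFLICT(parcel_id) DO UPDATE SET
            owner          = excluded.owner,
            assessed_value = excluded.assessed_value,
            scraped_at     = excluded.scraped_at
    """, (parcel_id, owner, value, scraped_at))
```

With this in place, the crawler can simply re-visit every source on a loop; each pass refreshes existing rows and inserts new ones.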

Confused, though... what did you create the API endpoint for, and where? Isn't your script scraping and writing to the Podio API?

If ATTOM has your county, that works... plenty of counties are missing, though. A custom scrape is the only way to get everything for those.
