Archive of rental investors in Minneapolis for download

7 Replies

Here are the last 4 years of rental licensing history in Minneapolis. You start to see some interesting things when you 'join' this against the city Assessor file, in terms of how many units the Assessor says exist versus the number of units being licensed through the city.
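The kind of join described above can be sketched in a few lines. This is just an illustration with made-up records and hypothetical column names ("apn", "licensed_units", "assessor_units") — the real exports use the city's own headers, so adjust accordingly:

```python
# Sketch of the license-vs-Assessor comparison (hypothetical column names,
# made-up data). The idea: join on parcel ID, then flag mismatched unit counts.
from collections import defaultdict

# License file: one row per filing, keyed by parcel ID.
license_rows = [
    {"apn": "123-001", "licensed_units": 4},
    {"apn": "123-001", "licensed_units": 4},   # repeat filings collapse below
    {"apn": "123-002", "licensed_units": 1},
]

# Assessor file: one row per parcel with the unit count on record.
assessor = {"123-001": {"assessor_units": 6}, "123-002": {"assessor_units": 1}}

# Keep the max licensed count seen per parcel, then compare against Assessor.
licensed = defaultdict(int)
for row in license_rows:
    licensed[row["apn"]] = max(licensed[row["apn"]], row["licensed_units"])

gaps = {
    apn: assessor[apn]["assessor_units"] - units
    for apn, units in licensed.items()
    if apn in assessor and assessor[apn]["assessor_units"] != units
}
print(gaps)  # parcels where the Assessor unit count != the licensed unit count
```

Parcels left in `gaps` are the interesting ones — more units on the Assessor's books than are being licensed, or vice versa.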

It also provides a road map for violations, the 'tiers' to work off of, changes of ownership... There are just a lot of directions to go with this.

For anyone starting to look at running Machine Learning against property/investor data: there is just a ton here for a Random Forest model when you join the Assessor file. (Have fun dialing in those 'weights' for your attributes. :) )
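A toy version of that Random Forest idea, assuming scikit-learn is installed. The features and labels here are invented stand-ins for the kind of columns the joined file gives you (unit-count gap, tier, ownership changes) — and one nice thing is that a Random Forest learns its own attribute 'weights' (feature importances) rather than you hand-tuning them:

```python
# Toy Random Forest on invented features resembling the joined license/Assessor
# data. Requires scikit-learn; all numbers are made up for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features per parcel: [assessor_units - licensed_units, tier, owner_changes]
X = np.array([[0, 1, 0], [2, 3, 1], [0, 1, 1], [3, 3, 2], [1, 2, 0], [2, 3, 3]])
y = np.array([0, 1, 0, 1, 0, 1])  # pretend label, e.g. 1 = violation on record

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# The model's learned attribute "weights":
print(dict(zip(["unit_gap", "tier", "owner_changes"], model.feature_importances_)))
```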

Goog was throwing me an error on the sheet sizes... not sure why; nothing should have gone over 5 million cells. I had to create two tabbed sheets. They are tabbed at the bottom.

2016-17 tabbed license information

2017-18 tabbed license information

Enjoy. The sheets will be 'live' until Saturday morning, after that I am turning off the 'share' links, so download now if you want them.

Data history is everything.

Updated 12 months ago

Whoops. '2017-18 tabbed' was supposed to be 2018-19.

I'm not really sure of the Milestone definition, as it looks like it conflicts with the license status from https://opendata.minneapolismn...  I did call them a couple of years ago and did the 'what's this column mean, and what about this one...' dealy-o. What I left with from that was that for 'me' personally, I was interested in the tier status and the number of units licensed. But that's why I uploaded the complete file history (minus the Geometry column info, because that would just create huuuuge sheets): everyone has a different approach.

'Status' just looks like it denotes whether the landlord is paid up and active with the license. Milestone seems to conflict with that on some records, so it may be from the past. Milestone almost looks like a sort of 'float'... i.e., I'm thinking some of it happened in the past, as a reference.

@Jacob Johnson

I get basically 'every' sale statewide, with the contact info of the parties involved. The 'front burner' project with this is to match 'everything' like emails/phone #s to sales that might use a different entity name for the transaction, or even something as simple as the same 'owner' address with a different name/email/phone, and to map what they're buying and selling.
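One way to sketch that matching: normalize each contact field into a key, bucket sales by key, then merge buckets that share a sale, so the same investor surfaces even when the entity name on the deed changes. The records below are made up:

```python
# Cluster sales by shared (normalized) contact info. Made-up records;
# real data would have many more fields and messier formats.
import re
from collections import defaultdict

sales = [
    {"buyer": "Blue Door LLC",  "phone": "(612) 555-0101", "email": "A@X.com"},
    {"buyer": "BD Holdings II", "phone": "612-555-0101",   "email": "b@y.com"},
    {"buyer": "J. Smith",       "phone": "651-555-0199",   "email": "a@x.com"},
]

def norm_phone(p):
    return re.sub(r"\D", "", p)      # digits only

def norm_email(e):
    return e.strip().lower()

# Bucket sale indices under every normalized contact key.
by_key = defaultdict(list)
for i, s in enumerate(sales):
    by_key["ph:" + norm_phone(s["phone"])].append(i)
    by_key["em:" + norm_email(s["email"])].append(i)

# Union-find-lite: sales sharing any key collapse into one cluster.
parent = list(range(len(sales)))
def find(i):
    while parent[i] != i:
        i = parent[i]
    return i

for ids in by_key.values():
    for j in ids[1:]:
        parent[find(j)] = find(ids[0])

clusters = defaultdict(list)
for i in range(len(sales)):
    clusters[find(i)].append(sales[i]["buyer"])
print(list(clusters.values()))
```

Here all three buyers collapse into one cluster: the first two share a phone number, and the first and third share an email — exactly the 'different entity name, same person' pattern.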

What I'm in the weeds with / the end goal is to accurately determine the 'space' available on a parcel for an ADU or carriage house. It can be done... I never cared for math in school (NOW I CARE), but the more I dig into prospecting for 2020 and beyond, the more the 'just go get a delinquent tax/etc. list' advice that every person tells every other person in real estate feels played out. Those lists/directions are beaten up, and it's a really small pond to fish in. (They still work... I found 3 last week by matching water shut-off notices against inventory not receiving mail. You just have to DIG with pain points.)

Today you have to be able to model everything around what the different cities are doing with their housing plans. Finding a person willing to sell, where the parcel has room for an ADU, while also falling within the right zones for the MPLS 2040 plan AND Opportunity Zones? That's a score for me. I know people who want that :), and I have found some of them via the approach in my opening paragraph of this post.
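The 'room for an ADU' check is, at its simplest, lot-coverage arithmetic: parcel area vs. building footprint vs. a minimum ADU size under a coverage cap. The cap and size below are placeholders, not actual MPLS 2040 rules — real zoning also brings setbacks, height, and alley access into it:

```python
# Back-of-envelope "does an ADU fit on this parcel?" check.
# coverage_cap and min_adu_sf are placeholder values, NOT real zoning rules.
def adu_fits(parcel_sf, footprint_sf, min_adu_sf=300, coverage_cap=0.50):
    """True if adding a minimum-size ADU keeps total lot coverage under the cap."""
    return (footprint_sf + min_adu_sf) / parcel_sf <= coverage_cap

print(adu_fits(5000, 1800))  # (1800 + 300) / 5000 = 0.42 -> True
print(adu_fits(4000, 1900))  # (1900 + 300) / 4000 = 0.55 -> False
```

In practice `parcel_sf` comes from the muni GIS parcel layer and `footprint_sf` from a building-footprint layer, which is why clean parcel data matters so much here.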

This can be done, though. It's just Geographically Weighted Regression. You can run models against county parcel sets; the big commercial groups do it. It's really about finding really, really clean parcel/property data. Little things like below-/above-ground square footage can really throw things off too. I'm just using the basic parcel dimensions from various muni GIS files and Microsoft's awesome building footprint file for the building footprints on the parcels. I'm still way off on determining the difference right now unless I focus on more recent build years, though.
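To show the flavor of GWR in miniature: it's an ordinary regression where each comparable is down-weighted by its distance from the target parcel, so nearby comps dominate the local fit. Real work would use a dedicated library (e.g. the `mgwr` package) with bandwidth selection; this sketch uses made-up comps and a fixed Gaussian kernel:

```python
# Mini distance-weighted regression (the core idea behind GWR).
# Made-up comps: (x_coord, y_coord, finished_sf, price).
import math

comps = [
    (0.0, 0.0, 1000, 200_000),
    (1.0, 0.0, 1500, 290_000),
    (0.0, 1.0, 1200, 235_000),
    (5.0, 5.0, 1000, 150_000),  # far away -> near-zero weight
]
target_xy = (0.5, 0.5)
bandwidth = 1.0

def weight(cx, cy):
    """Gaussian kernel on squared distance to the target parcel."""
    d2 = (cx - target_xy[0]) ** 2 + (cy - target_xy[1]) ** 2
    return math.exp(-d2 / (2 * bandwidth ** 2))

w = [weight(c[0], c[1]) for c in comps]
sw = sum(w)
xbar = sum(wi * c[2] for wi, c in zip(w, comps)) / sw
ybar = sum(wi * c[3] for wi, c in zip(w, comps)) / sw

# Weighted simple-regression slope: a *local* price per finished square foot.
slope = sum(wi * (c[2] - xbar) * (c[3] - ybar) for wi, c in zip(w, comps)) \
      / sum(wi * (c[2] - xbar) ** 2 for wi, c in zip(w, comps))
intercept = ybar - slope * xbar
print(round(slope, 2), round(intercept))
```

The far-away comp barely moves the fit, which is the whole point: the model's coefficients vary over geography instead of being one statewide number.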

I've only been playing with Machine Learning for the last 6 months or so, and it's really not an 'every day' thing when you're trying to learn something new. 5-6 hours a week? It's just that once you see the best tools are Open Source and basically free, even when you plug into cloud services, it's hard not to want to poke around and try to figure things out.

That's a very creative use of the data! I did not know that Microsoft provides the building footprint data on GitHub; that's very handy. Good luck on the project, I hope you find a way to get it working for you.