Hi Duncan, I believe the focus of what you are looking for would be best served by a relational model. You already know the data points you are interested in (age of HVAC, age of roof, roof material, age of building, etc.). The challenge for you is taking unstructured data in the form of receipts, inspections, and the like and formatting it for import into a relational model for reporting. Even if you had 15,000 properties you could do this with a relational DB and/or Excel. The time spent preparing the data from receipts, accounting records, or other sources would be the same whether the data was destined for a relational model or a big data model, because I imagine much of this data is not in digital form already.
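To make the relational idea concrete, here is a minimal sketch using Python's built-in sqlite3. The table name, column names, and sample rows are all hypothetical, just standing in for the data points mentioned above:

```python
import sqlite3

# Hypothetical schema for the data points listed above (names are made up).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE properties (
        id INTEGER PRIMARY KEY,
        address TEXT,
        hvac_age INTEGER,      -- years
        roof_age INTEGER,      -- years
        roof_material TEXT,
        building_age INTEGER   -- years
    )
""")
conn.executemany(
    "INSERT INTO properties (address, hvac_age, roof_age, roof_material, building_age) "
    "VALUES (?, ?, ?, ?, ?)",
    [("101 Elm St", 12, 8, "asphalt shingle", 35),
     ("202 Oak Ave", 3, 20, "metal", 50)],
)
# Example report: properties whose roof is older than 15 years.
rows = conn.execute(
    "SELECT address, roof_age FROM properties WHERE roof_age > 15"
).fetchall()
print(rows)  # [('202 Oak Ave', 20)]
```

Once the receipts and inspection reports are keyed into rows like this, the reporting side is just queries.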
Big Data would come into play if you had a massive volume of data from various sources that was too large for a relational model, or if the rate of change/growth of that data was so high that a relational model could not keep up (I am not going to mention data warehousing, as that is not going to be cost effective for you). The data would have enough structure that MapReduce jobs could be run over it to aggregate along the dimensions you are interested in, and the output of those jobs could then be imported into a relational model or Excel for further analysis. This is not hard, and it is cheap. Installing a Hadoop cluster on cloud instances within Amazon AWS is neither difficult nor expensive, and you could do the same on cheap commodity hardware. If writing MapReduce jobs in a programming language is not your partner's thing, then he/she could get started with Hive (an SQL-like language that compiles down to MapReduce jobs).
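For a feel of what a MapReduce job does, here is a minimal sketch in pure Python that mimics the map, shuffle/sort, and reduce phases locally. The records are made up; on a real Hadoop cluster the mapper and reducer would run in parallel across nodes (e.g. via Hadoop Streaming) rather than in one process:

```python
from itertools import groupby
from operator import itemgetter

def mapper(record):
    # Emit a (key, 1) pair for each property record; here we count roof materials.
    yield (record["roof_material"], 1)

def reducer(key, values):
    # Aggregate all counts emitted for one key.
    return (key, sum(values))

# Hypothetical input records.
records = [
    {"roof_material": "asphalt shingle"},
    {"roof_material": "metal"},
    {"roof_material": "asphalt shingle"},
]

# Map phase: run every record through the mapper.
pairs = [pair for rec in records for pair in mapper(rec)]
# Shuffle/sort phase: group identical keys together.
pairs.sort(key=itemgetter(0))
# Reduce phase: one reducer call per distinct key.
counts = dict(
    reducer(key, (v for _, v in group))
    for key, group in groupby(pairs, key=itemgetter(0))
)
print(counts)  # {'asphalt shingle': 2, 'metal': 1}
```

The same aggregation in Hive would be a one-line `GROUP BY` query, which is why Hive is the gentler on-ramp.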
Big Data would be interesting if you had massive data sets, say from statewide tax records, a data feed from a city's inspection reports, or a massive set of historical data from a regional MLS. You can bet your last dollar that large REITs, banks, and hedge funds are doing this. The trick is getting hold of the data. If you can do that, sign me up for the data analysis.
Fun fact: Big Data really took off with the MapReduce paper by Jeffrey Dean and Sanjay Ghemawat, published in 2004. For those interested, the paper is here: http://static.googleusercontent.com/media/research.google.com/en/us/archive/mapreduce-osdi04.pdf
Keep brainstorming on this. Your generalized approach could work, but sourcing the data would be the challenging part. Here is an example of Big Data usage: if you wanted to gauge renter sentiment, you might follow several Twitter feeds having to do with life in a particular area. The feed could be pulled using Twitter's API and dumped onto a Hadoop cluster, where MapReduce jobs would scan for key words. Those key words would be aggregated on a daily basis to try to gauge sentiment: "I love Dallas", "Dallas sucks, can't wait to get out of here", "My landlord Duncan is the man!"
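The keyword-counting step could be sketched like this. The tweets and keyword lists below are made up for illustration; in practice the tweets would come from Twitter's API and the counting would run as a MapReduce job on the cluster:

```python
from collections import Counter

# Hypothetical keyword lists; a real job would use a larger sentiment lexicon.
POSITIVE = {"love", "great", "awesome", "the man"}
NEGATIVE = {"sucks", "hate", "get out"}

# Stand-ins for tweets pulled via the Twitter API.
tweets = [
    "I love Dallas",
    "Dallas sucks, can't wait to get out of here",
    "My landlord Duncan is the man!",
]

def score(tweet):
    # Label a tweet by whichever keyword set matches more phrases.
    text = tweet.lower()
    pos = sum(phrase in text for phrase in POSITIVE)
    neg = sum(phrase in text for phrase in NEGATIVE)
    return "positive" if pos > neg else "negative" if neg > pos else "neutral"

# Daily aggregate of sentiment labels.
daily_sentiment = Counter(score(t) for t in tweets)
print(daily_sentiment)  # Counter({'positive': 2, 'negative': 1})
```

Roll those daily counts up over weeks and you have a crude renter-sentiment index for the area.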