How to Scale your Real Estate Business with Data and Analytics
Want to scale your business with systems and automation, but not sure how? Look no further! In this video, I will cover the different types of freelancers you can hire for data tasks. Stick around to the end for a walkthrough using a BiggerPockets example.
Transcription
Ariel Herrera 0:00
Are you looking to scale your real estate business with data and analytics, but you're not exactly sure how to get started, where to hire it out, and what some of the benefits are? Well, don't worry, because I'm going to cover all of this in a few easy steps. My name is Ariel Herrera with the Analytics Ariel channel, where we bridge the gap between real estate and technology. I have seven years of experience in the data analytics space, from a data analyst position all the way to data science, so I know exactly which skills you may need for your business. If this is the type of content that you enjoy, please like this video as well as subscribe to the channel so you get the latest tips and tricks for tech within real estate. Alright, let's get started.

Cool. So right now we're looking at the data collection process page, a document I put together based on a lot of the frequent questions I get from investors, agents, appraisers, and others in this space. One of the number one questions I get is: how do I get data from my county website or from a social media site, web scrape it, aggregate it, and then understand where some of my new leads can come from? Well, this is actually a lot simpler to implement than it sounds, and it can be done entirely by the person who wants it, even without any technical skills. I'm going to quickly walk through this document and then actually show you how to hire people on Fiverr to do some of these tasks for under $100, and sometimes even for $5.

Alright, so the first part is an overview of collecting data. This is intended for a non-technical audience. If you want to go through this document on your own, feel free to click the link below to get it. The background here is that real estate entrepreneurs want to be able to get data for their local area, and that's what makes real estate really special: it is hyper-localized. One zip code may have more opportunities in terms of growth, location, rentals, and job growth than the zip code next door. It's hyper-localized, which means you need a lot of information, especially if you don't have boots on the ground, if you're, say, an out-of-state investor. Some information that's really useful is news data, as well as information on people who are more likely to be motivated to sell their property. There are a bunch of tools that help you start to get lists of these motivated sellers; you could use tools like PropStream, as well as ListSource, to extract lists of motivated sellers. However, the caveat here is that pretty much all investors have access to these tools if they have the funds. What advantage do you have if everyone else has access to the same data? Well, one, being really hyper-focused helps, but two, the real value is being able to bring in multiple data sources, so not just one Excel file but think of, say, five Excel files, where you can overlap them and determine where some hidden opportunities are. I'll show a quick sketch of what that overlap can look like in a moment.

So how do you acquire these datasets that are not easily available? First, understand why you should hire out automation and not just hand it to a VA. I've heard of investors who are trying to get started and offload a lot of things to VAs, hiring virtual assistants offshore at a low cost to click through websites and copy down data. Well, the reality is we live in 2021.
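Here's that quick sketch of the overlap idea. This is only an illustration in Python with pandas; the file names and the address column below are made-up placeholders for whatever exports you actually pull from tools like PropStream or ListSource. The point is simply that a property showing up on several motivated-seller lists floats to the top.

```python
# Rough sketch: overlap several motivated-seller lists to surface hot leads.
# The file names and the "address" column are hypothetical placeholders for
# whatever CSV exports you actually have.
import pandas as pd

list_files = [
    "absentee_owners.csv",
    "pre_foreclosures.csv",
    "tax_delinquent.csv",
    "high_equity.csv",
    "probate.csv",
]

frames = []
for path in list_files:
    df = pd.read_csv(path)
    # Normalize the address so the same property matches across lists.
    df["address"] = df["address"].str.upper().str.strip()
    df["source"] = path
    frames.append(df[["address", "source"]])

combined = pd.concat(frames)

# Count how many distinct lists each property appears on; more overlap,
# more likely a motivated seller hiding in plain sight.
overlap = (
    combined.groupby("address")["source"]
    .nunique()
    .sort_values(ascending=False)
    .rename("lists_appeared_on")
)

overlap.to_csv("overlapping_leads.csv")
print(overlap.head(10))
```

Again, just a sketch; the exact columns depend on the lists you export, and this is exactly the kind of small script you can have a freelancer put together for you.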
So, getting back to the VA point: if we have Python skills and automation tools that can do that website clicking and data copying automatically, on a repeatable basis, then you can put your VAs on more important tasks, like actually cold calling, analyzing, or helping to get data to analyze a deal, and more, not just clicks. And when you're offloading certain work to VAs, like writing down things that are on a website, humans are more prone to errors, so you can end up with mislabeled or missing data as well. As I said before, it can be a waste of resources.

So one of the key things to understand is: what are some common technical roles to begin with? For one, a web scraper. What actually is a web scraper? Say you want to get data that's on the internet and have it in a consumable format like an Excel file. What you could do is hire someone to basically take that text and bring it to you in an easily consumable format, and the person who does this is a web scraper. It's a relatively low-skill job, so someone
who has basic Python knowledge and can read and follow tutorials and blogs would be able to replicate a lot of these tasks. That translates to this: you can hire a web scraper to get data off of foreclosure listings, county websites, things like that, for a relatively low cost. I'll show a quick sketch of what that looks like in just a moment. Some of the common languages they use are Python and JavaScript. It's a little bit out of scope for me to go into the differences between the two, but what I can tell you is that Python is preferable as an all-purpose language, for manipulating datasets, communicating with the web, all sorts of techniques, whereas JavaScript is a little bit more for web servers, a little bit more front end.

Okay, so now you have your data; you got a web scraper to give you the data. What if you want to be able to work with it, say, on a website, where you just click, the web scraping tool goes, and then you get an Excel file back? Well, the people who create web pages and things of that nature are usually full stack web developers. Full stack meaning front end, where they create the visual part so you can go on the website and click through it, and back end, where they handle things like housing that data, so that in the future you can go back and say, oh, what was that dataset of foreclosures I looked at on February 1 of 2022? A full stack developer has the knowledge to set up that database, so you'd be able to have a holistic app there. The skill set is a medium to high level of difficulty, and you can definitely find a lot of offshore availability for full stack web developers. Depending on the complexity, it could be as low as, say, 250 bucks, all the way up to a couple thousand dollars. But still, it's a really great way to get started if you want to have a customizable website or some tools.
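Here's that quick web scraper sketch I mentioned. To be clear, this is a minimal, hypothetical example in Python: the URL, the table id, and the column layout are all made up, since every county or foreclosure site is structured differently, and it is not code for BiggerPockets. It just shows the basic pattern a freelancer would follow: download the page, parse the table, and save it to a file you can open in Excel.

```python
# Minimal web scraping sketch: pull a listings table from a (hypothetical)
# county foreclosure page and save it as a spreadsheet-friendly CSV.
# The URL, the "listings" table id, and the column order are placeholders.
import requests
from bs4 import BeautifulSoup
import pandas as pd

URL = "https://county.example.gov/foreclosures"  # hypothetical page

# Download the page's HTML (identify the script with a simple user agent).
response = requests.get(URL, headers={"User-Agent": "research-script"})
response.raise_for_status()

# Parse the HTML and walk through each row of the listings table.
soup = BeautifulSoup(response.text, "html.parser")
rows = []
for tr in soup.select("table#listings tr")[1:]:  # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if len(cells) >= 3:
        rows.append(
            {"address": cells[0], "case_number": cells[1], "auction_date": cells[2]}
        )

# Save the result so you (or your VA) can open it right away.
pd.DataFrame(rows).to_csv("foreclosures.csv", index=False)
```

A web scraper on Fiverr would adapt this same pattern to whatever site you spell out in your requirements document, which I'll get to in a minute.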
A use case I have here on the full stack side is: say you want to build a website that analyzes properties, where a user just puts in 122 Main Street and gets some information back. That would be an example of when you need a web developer to create that whole process, not just a scraper.

There are two more roles here that are great to touch upon: data analysts and data scientists. Now, say you have all this data but you're not sure how to rank the best leads, because maybe at this point you have 10,000 leads but you don't have the funds to send mailers to all of them. How do you concentrate on the leads that are the most motivated to sell? Well, the person who can help you aggregate all that data is a data analyst. They could create a dashboard for you that you go into daily or weekly to see the top properties, see trends in the area, and understand if you need to take action or pivot. Now, if you want to do something like forecasting, kind of how Zillow does with its estimates, being able to say what home values will be in general, that's where you would get a data scientist, who can create that proprietary data for you.

Great. So now you know the four different types of roles that work with data, web scraper, full stack developer, data analyst, and data scientist, and what skill level each one involves, so you can start to hire out these services individually. However, if you're still unsure of where to get started and who you need to reach out to for each piece, that's where I come in. Since I have a lot of knowledge in the data analytics space, I work as a tech consultant, so feel free to reach out to me directly on my website and set up some free time to discuss.

But going back, say you know what you want to do. What is the actual process you take? I'm going to go through an example using BiggerPockets data. We're at the BiggerPockets rental calculator, and if you want a full overview of it, please go to my video linked below, where I go through the tool and its pros and cons. For this purpose, I want to go through the manual process to show how you would use a web scraper to do this automatically for you. From the BiggerPockets rent calculator, you can enter a zip code in the search bar and select the number of beds and baths. In this case, let's imagine that in South Amboy we are buying, or considering buying, a three bed, two bath property, and we want to understand what the rent would be to see if we'd actually be able to cash flow. We can see here what the median rent is and what the confidence score is, and then we get some other rich information below too. Well, it would be really hard to do this for, say, 10,000 properties back to back, right? It would definitely take up a lot of time and resources. So this is a great example of where web scraping comes into play.

Now, going back to the process, step one is defining your goal. In this case, our goal is to get rental data for a list of property addresses, and we want to do that off of the BiggerPockets website. That covers number two as well: the data is basically coming off the web. Number three would be the different inputs and outputs. Am I going to be inputting a zip code, or am I going to be inputting an address? It's important to get those specifics down. Then for the output, what do I actually want back?
In this case, for the BiggerPockets rental calculator, I want to get the median rent, I want to get the confidence score as well, and then maybe some of these comparable properties too. So that's defined. Now, the next step is critical: this is your requirements document. For those who are in the tech space, you'll be familiar with these kinds of documents, but if you're not, that's okay too; they're easy to put together. In this example, what I would do in Google Docs, and you can grab this template in the links below, is put your company name, note that it's web scraping you're looking to do and what the target is, in this case BiggerPockets, and then list your requirements below. I have this put into three different sections, so it's easy to follow. The parameters, meaning the things that I'm going to be inputting into the website, would be the street address and the number of bedrooms and bathrooms, as we saw, and what I expect to get as an output is that median rent. Then in some cases, BiggerPockets provides the low rent and high rent, so that range, and then also the median rent, if it's available, for one bedroom, two bedroom, three bedroom, and four bedroom for that zip code.
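If it helps, here's one way to pin those requirements down so there's zero ambiguity for the freelancer. It's a small, hypothetical Python sketch of the inputs and expected outputs; the field names are mine for illustration, not an official BiggerPockets schema. Even if you never touch code yourself, writing the spec in this shape makes it crystal clear what goes in and what should come back.

```python
# Hypothetical sketch of the scraping job's "contract": what I input and
# what I expect back. Field names are illustrative, not a BiggerPockets schema.
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class RentRequest:
    """Parameters I will input into the rent estimator."""
    street_address: str
    bedrooms: int
    bathrooms: float

@dataclass
class RentResult:
    """Fields I expect back for each address."""
    median_rent: float
    confidence_score: Optional[str] = None   # e.g. "high" / "medium" / "low"
    rent_low: Optional[float] = None         # low end of the range, when shown
    rent_high: Optional[float] = None        # high end of the range, when shown
    median_rent_by_bedroom: Optional[Dict[int, float]] = None  # 1-4 bed medians for the zip

# Example row: the kind of input the freelancer's script would loop over.
example = RentRequest(street_address="122 Main St, South Amboy, NJ", bedrooms=3, bathrooms=2)
```

Whether the deliverable ends up being Python code or just an Excel file, spelling out the inputs and outputs like this is really the heart of the requirements document.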
Then the next section is the process. I basically just take screenshots, if you're on a Mac, just do Shift Command 4 to take a screenshot, and then you add either little text notes or boxes around what the exact steps would be. Please, please make this as baby-proof as possible. So, very simple: step one, go to the BiggerPockets homepage; step two, click log in; step three, put in the email and the password. This is what someone would manually do if they were to go through it step by step. You would then go to the property report, put in the address and the parameters, and these are the exact fields that I'm looking to get back, and I have boxes showing exactly what those fields are and how many of them there are. The expectation is that this manual process of recording this information will be web scraped. In the case of someone like me, since I have the programming skills, I want the code for this in Python. Someone like you may not want the Python code; you may just want the Excel file, and maybe you communicate with the web scraper: hey, if I pay you $5 weekly, would you be able to run the same script for me? A lot of things are negotiable in the Fiverr space.

So now that you have this document showing the inputs, the outputs, and the manual process, you would then go to Fiverr, or Upwork as well, or try your shot at Craigslist too, and you would go to the services. In this case, we're looking for a web scraper, so you would type in web scraper and go to search. What I normally do is go through people's profiles and look to see what they include. This person includes browser extensions, which we don't really need, but they also have database skills, which is awesome. So I might contact the seller to get a quote, and this is the part where you attach your requirements document. You basically state your goal, which you defined before in step one, attach your requirements document, and say, please provide me a quote on how much this would cost. Get this quote from two or three different Fiverr freelancers and then select one. This is someone that I highly recommend; he's done web scraping for me in the past and has done a fantastic job with a very quick turnaround.

Now, if you're looking to actually web scrape and follow these steps for BiggerPockets, it actually doesn't work. They have a high level of security when it comes to detecting bots logging into their website. But when it comes to county websites, they don't have as much security, so it's a lot easier to get this done there. Now, the very last step is you select your freelancer, they provide you either the code or the Excel file back, and you review it. What's great about Fiverr is you can go back and forth, so just because they sent you something doesn't mean it's final; you can actually get revisions as well and work with this freelancer in the future on more upcoming projects. As a heads up, something like this, a Fiverr project for web scraping, should be between $20 and $40, which is really, really great because you're able to offload that to an expert for a really low cost and automate and systematize your business.

So if this is the kind of content you enjoy, getting tips and tricks to systematize your business and improve your space within real estate, then please subscribe to this channel as well as like this content. Thanks!