How to Get Commercial Real Estate Data | Browse AI
Ariel Herrera 0:00
How to get multifamily real estate data from LoopNet. A multifamily home is a single building that's divided to accommodate more than one family living separately. This can be a duplex or even a 100-unit apartment complex. Residential multifamily buildings with four units or less can typically be found on sites like Zillow. But what if we want data on large apartment complexes? We can get this information from sites like LoopNet. In this video, I'm going to give you a step-by-step guide to get multifamily listings in any city using web scraping with absolutely no code. My name is Ariel Herrera, your fellow data scientist with the Analytics Ariel channel, where we bridge the gap between real estate and technology. I'm passionate about providing data solutions, so follow my channel to get the latest content on real estate analytics. And stay tuned to the end, where I preview how we get property details automatically into a spreadsheet. All right, let's get started.
LoopNet is an online marketplace for commercial property. It primarily focuses on commercial property listings for sale and for lease in the United States, and it is currently owned by the commercial property data company CoStar. To use LoopNet, you enter the property type you're interested in and your location to see all of the listings. We're going to extract those listings into a spreadsheet automatically using Browse AI. What is Browse AI? Browse AI is web automation software that learns to perform data extraction, monitoring, and automation tasks on the web simply by observing someone perform the actions just once. Browse AI records your actions as you move through a web page. It can scrape many sites, including social media, county information, real estate websites, and more. Browse AI has a free amount of credits per month; use the link below to receive 10% off if you later sign up for a plan. Here, I will click log in, since I already have an account. At the Browse AI homepage, we have two options: browse prebuilt robots or build a new robot. In my prior videos, I've shown how to use the prebuilt robots to get Zillow data. But for our use case, these larger commercial properties will not show up on Zillow, so we're going to build a new robot for LoopNet. Here we have two options: extract structured data or monitor site changes. In our case, we just want to get all of the data into a spreadsheet; we don't care yet about monitoring for new listings, so let's select the first one. Next, we enter our origin URL. If we go back to LoopNet and copy LoopNet's URL, we can paste it into the space here. Next, we start recording our task. We now see a pop-up box where Browse AI tells us that it is going to record all of our actions. We only need to do this once, and Browse AI will then be able to get data for us. So let's select OK, understood.
Now we're going to select the property type that we care about, in this case multifamily. Next, we enter a location, which is going to be Phoenix, Arizona, and click search. You'll now see two different views on LoopNet's results page. On the left-hand side, we can see all the multifamily properties that are currently for sale, and on the right-hand side, we get the details of these apartment listings. If we scroll all the way down to the bottom, we can see that we have 87 listings in total. What we're going to do next is select Browse AI in the top right-hand corner, then click Capture list. This is going to allow us to capture all of our listings; all we need to do is hover over one of the listings and then click our mouse. Our next step is to select which fields we care about. If you hover over the first one, which is the street, it's going to ask whether we want to capture the text, the link, or the title. In this case, we're going to capture each one, so let's start with the text, then go back to the street to get the link, and then get the title. Next, we select the city, price, cap rate, and the size of the apartment building. Once complete, we press Enter to finish. Now we can name each of our fields.
Once we enter all of our fields, we then see a capture list. This looks like a spreadsheet: we have a row for each listing, and the information it contains is the street, the unique link, the title, the city, the price, the cap rate, and the apartment size. Now we can name our list. Next, Browse AI needs to know whether there is more data to extract. In this case there is, so we click next and navigate to the next page; I'll select the second page right here. Our last step is to set the maximum number of rows we want to extract. Here, we're going to go for a custom number and set this to 40. Now I can capture the list. We just showed Browse AI how to capture a list for us automatically and how to get that data into a spreadsheet. We can click OK, understood, and we've completed our task. So let's click Browse AI and click finish recording. We have two quick steps next. First, name the robot; in this case, I'm going to leave the default and click save. Now we're going to view our results. To do this, the robot runs our task on a cloud server. That means we are not actually running this task locally on our machines; it's being done on a server elsewhere, which simulates the actions we just took, such as clicking a listing and getting all of the fields we care about, and presents a view of what the results will be each time the web scraper is run. Once the robot is complete, you'll see all of your data in one spot: street, link, title, city, price, cap rate, and apartment size. The first two rows don't have the data we expect because they were the advertised apartment complexes. However, we do get the remaining 40 items, which was the cap we set on rows to extract. And if we go down to the bottom, we can see the final screenshot that Browse AI took, as well as all the steps it was able to replicate. Now we can select yes, looks good.
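As a companion to the no-code workflow above, here is a minimal Python sketch of how you might clean the exported spreadsheet once you download it as a CSV. The column names (street, city, price, cap_rate) and the sample rows are assumptions for illustration; Browse AI's actual export headers depend on how you named your fields while recording.

```python
import csv
import io

def parse_price(text):
    """Convert a price string like '$1,500,000' to an integer, or None."""
    digits = "".join(ch for ch in text if ch.isdigit())
    return int(digits) if digits else None

def parse_cap_rate(text):
    """Convert a cap rate string like '5.50%' to a float, or None."""
    cleaned = text.replace("%", "").strip()
    try:
        return float(cleaned)
    except ValueError:
        return None

def load_listings(csv_text):
    """Parse the exported CSV, skipping rows with no price (e.g. ad rows)."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        price = parse_price(row.get("price", ""))
        if price is None:
            continue  # the advertisement rows in the export lack real data
        row["price"] = price
        row["cap_rate"] = parse_cap_rate(row.get("cap_rate", ""))
        rows.append(row)
    return rows

# Hypothetical sample mimicking the export: one real listing, one ad row.
sample = """street,city,price,cap_rate
1402 W Polk St,Phoenix,"$1,500,000",5.50%
Sponsored Listing,, ,
"""
listings = load_listings(sample)
```

The same skip-empty-price rule handles the advertisement rows mentioned above, which come through the capture without the usual fields.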
At the homepage of our bot, we can now run a task. In my scenario, I actually had a pause when I was typing out Phoenix, Arizona, so I have two separate locations and will have to recreate my bot to fix this. But in your case, you should just have three different parameters here: the origin URL, the location, and your limit. You'll be able to put in different cities, run the task as is, and receive a spreadsheet of the listings. Now, how do you actually view these listings? If you go over to history, every time you have a successful run, you'll be able to click on it and view that spreadsheet, and you can then download it as a CSV. You can also monitor a particular city so that you get alerts, or new rows in your spreadsheet, every time there's a new listing. What's really exciting is being able to integrate your bot with other applications. For example, you could set it up so that every time the bot is run, the data goes straight into Google Sheets, where you can add your own formulas to automatically analyze your deal. One limitation of our data is that even though we have listings, we don't actually have the property details. If we look at this first example here, 1402 W Polk Street, we get information on the price, number of units, square footage, when it was built, and some notes as well. But if we click through, we can see there's a lot more information: investment highlights, an executive summary (some of which is shaded unless we go Pro), and the part we really want, which is property facts. In the next video, I'll show you how to get this as well using Browse AI. And in part three, we'll consolidate this whole package using either Zapier or Python programming so that we can automatically get listings and the property details for each of them. Please leave comments below on how you extract data for commercial real estate properties.
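If you take the Python route mentioned above instead of Google Sheets formulas, a first-pass deal screen might look like the sketch below. The field names (price, cap_rate, units) and the thresholds (minimum cap rate, maximum price per unit) are illustrative assumptions, not values from the video.

```python
def price_per_unit(price, units):
    """Price divided by unit count, a common first-pass multifamily metric."""
    return price / units if units else None

def passes_screen(listing, min_cap_rate=5.0, max_price_per_unit=200_000):
    """Return True if a listing clears simple cap-rate and $/unit screens."""
    cap = listing.get("cap_rate")
    ppu = price_per_unit(listing.get("price", 0), listing.get("units", 0))
    if cap is None or ppu is None:
        return False  # missing data fails the screen rather than passing
    return cap >= min_cap_rate and ppu <= max_price_per_unit

# Hypothetical listing shaped like a cleaned row from the export,
# with a unit count added from the property detail page.
deal = {"price": 1_500_000, "cap_rate": 5.5, "units": 10}
```

Once the part-three pipeline pulls property details automatically, the unit count could come from the scraped detail page instead of being entered by hand.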
And if you haven't already, please subscribe. Thanks
Transcribed by https://otter.ai