How to Get Airbnb Data for Your Market using Apify

Want to analyze Airbnb listings in your market? Check out this video, where we use the Airbnb Scraper on Apify to get short-term rental listings data for free.

Ariel Herrera 00:00

Are you considering getting into short-term rentals? You want to be an Airbnb host, but you're not sure how to get the data to analyze your potential market. Well, no worries, because in this episode I'm going to show you a no-code solution to extract Airbnb data using Apify. My name is Ariel Herrera with the AnalyticsAriel channel, where we bridge the gap between real estate and technology. If you enjoy data-driven solutions, then please subscribe to this channel. And if you want to see more tips on short-term rentals, then like this content so I know to make more of it. Alright, let's get started.

This tutorial is split into two parts. First, we're going to build our web scraping bot with no code using Apify in order to get data for our market on Airbnb. Next, we're going to take that data, transform it, and actually analyze the listings in our market. That way we can answer questions like: What is the average nightly rate? What's the median number of guests? And where are these properties actually located? Please feel free to also follow along with my step-by-step guide on Medium, where I detail each part of setting up your Airbnb scraper within Apify and, secondly, how to analyze your data with Python and ultimately visualize all the properties in one view.

So what is Apify? Apify is, in the company's own words, the most powerful web scraping and automation platform, with ready-to-use tools to get the job done fast and accurately. I've actually used Apify on several occasions. Back when they had a Zillow scraper, I used it to get for-sale-by-owner listings, and there are so many other solutions they offer. Apify allows independent developers to create web scrapers and publish them on the Apify Store. That means there are people maintaining these scrapers on a regular basis, which is why I highly suggest not building your own custom scraper.
If there's already one out there, utilize it, since these are people who are actively tracking whether any changes occur on high-traffic sites like Airbnb. As you can see, Apify has a lot of different categories, including business, e-commerce, marketing, SEO tools, and social media. Let's take a look here: we can get information on Instagram profiles, and there's a TikTok scraper and a Reddit scraper, which can be useful if you're looking at WallStreetBets. There is also real estate data, so if you're looking to get data from realtor.com, Redfin, Trulia, etc., you can do that.

We're going to select the Airbnb Scraper; click the link below and it will direct you exactly there. The Airbnb Scraper is created by a developer named Tim. What this scraper does is extract data for a location from Airbnb. You can scrape all home listings, even for big cities, and get all the listing information. This can include calendar information, reviews, and host information, and you can ultimately download it in a consumable format like HTML, JSON, CSV, Excel, and so on. What's super useful here is that we can actually try this Airbnb Scraper for free, which is awesome. We can also see that the developer modified the script only 10 days ago, so he's actively making sure the script is up and running, which is why we want to use third-party web scraping tools when available.

If you're new to web scraping, I highly suggest you check out my previous video. But as a high-level overview, web scraping is taking elements from webpages automatically using a programming language. Here, the page gives more information on what the Airbnb scraper does and its cost per usage. Apify provides $5 of free usage per month, which is enough to get the data you require as long as it's not for, say, 10 major cities. In that case, I would say use AirDNA.
And there is a tutorial here as well if you'd like to follow it, but let's go straight into our input parameters.

Unknown Speaker 04:27

So if we go up top, what we're able to input to get our data back is a location. In this case, I'm going to focus on Siesta Key. Siesta Key, if you're not aware, is one of the best beaches in the US; it's actually ranked number two in the US and is on the Gulf Coast of Florida. It's super beautiful, with white sand beaches, since it sits right on the Gulf of Mexico.

In our use case, we're going to imagine that we are investors. We're keen to have a set of Airbnbs in a location that's in high demand, and we want to understand: What type of Airbnb property should we buy? What does the demand look like? What are people currently paying? What number of guests are currently staying? And is it mostly entire homes that people are looking for, or do people look for condos or private rooms? These are all questions we're going to be able to answer with the data we get from our scraper, as well as by analyzing it in the second video.

Now, to set up our scraper, what we're going to do is select Try for free. I already have my own account, but once you click the link below and hit Try for free, it's going to ask you to create a free account. You can do this with Google; it takes only a couple of seconds, and then you should land right here on your dashboard. On the dashboard, you can see on the left-hand side we have information on our actors. Actors are our web scrapers. Then we have tasks and runs. Apify also includes schedules, and schedules are a great way to routinely check up on our market. Say we want to see the general trend of whether prices are rising or falling in our area and make some charts around it; we could schedule this extraction to run on a weekly or monthly basis, or even daily if we'd like. Then we have proxies as well.
Proxies improve the performance of our web crawling bots by smartly rotating datacenter and residential IP addresses. They also improve the data by enabling access to websites from various geographies. So if you're looking to use this web scraping tool for many different cities, or on a very frequent basis, I highly suggest getting a dedicated proxy to run your web scraper.

Let's start now by actually setting up our web scraper. We have a few tabs here: Input, Info, Run, Build, Task, Issues, and Integrations. Just as a heads up, the Airbnb scraper, and Apify in general, can integrate with other systems like Slack, for notifications, as well as Zapier, if you want to automatically feed this data into a Google Sheet or whatever storage system you have. For Location, you'll type in the city of your interest; for my use case, it's going to be Siesta Key, Florida. Then for Max listings, you could leave this unset to have the scraper continue until it gets all listings. However, I'm going to set it to 500, as I'm aware Siesta Key is a small beach town, so there really shouldn't be any more than 500 listings. Then we have the option to extract main listing information only, or further detail; in this case, to be quick, we're only going to get the main listing information.
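To make the setup concrete, here is a minimal sketch of the task input described above expressed as a Python dict. The field names (`locationQuery`, `maxListings`, `simple`) are illustrative assumptions, not the scraper's confirmed schema; check the actor's Input tab for the exact keys.

```python
# Minimal sketch of the Airbnb scraper task input described above.
# Field names are illustrative assumptions, not the actor's exact schema.
scraper_input = {
    "locationQuery": "Siesta Key, Florida",  # city to search on Airbnb
    "maxListings": 500,   # cap results; Siesta Key is a small beach town
    "simple": True,       # main listing info only (skip calendar/reviews/host)
}

print(scraper_input)
```

On the Apify platform this JSON is what the no-code input form generates behind the scenes, so the same values could later be reused when running the actor via the API.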

Ariel Herrera 08:02

Now, if you're not able to find your city and have Apify locate it correctly, what you can do is set the URL you want to start at. For example, with Siesta Beach here, if I wouldn't want to type in "Siesta Beach" because maybe there are multiple around the world, I can copy this URL up top and then paste it so the Apify scraper knows where to start.

The next piece is whether to include reviews, calendar, and host information. In this case, I'm not including any of it, for simplicity. But if you'd like to see the actual text of reviews and what people are saying, as well as how far in advance these properties are booked, which could feed a forecasting machine learning model right there, then I highly suggest you use these inputs and refer back to the original guide so you can see all the detailed options available. Next, we have filters. If you want a different currency, you can select whichever currency that is. You can also choose to view only properties within a certain price range, or with a certain check-in date. Again, for simplicity, I'm not going to fill any of this out. And lastly, under the developer options we select our proxy. Because I'm not running this on a recurring basis yet, I'm just going to select the residential proxy, and we can leave everything else at the defaults.

Our next step is to run the task, so we're going to click Start. Once you click Start, you're going to see a log pop up, and it's going to keep running until your status is Succeeded. What this means is that the web scraping bot is actually live, trying to retrieve the data, and you'll get a message as each home's details are saved. If there are any issues, they'll also come up here, which is great for troubleshooting. But eventually we have our status of Succeeded.
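The start-URL, filter, and proxy options just described can be sketched the same way. Again, the key names, the URL, and the dates below are placeholders I'm assuming for illustration, so verify them against the actor's input form before running:

```python
# Hypothetical extended input: start from a copied Airbnb search URL
# instead of a location query, plus optional filters and a proxy.
# Keys and values are placeholders, not the scraper's confirmed schema.
scraper_input = {
    "startUrls": [{"url": "https://www.airbnb.com/s/Siesta-Beach/homes"}],
    "currency": "USD",        # report prices in this currency
    "minPrice": 100,          # nightly price floor filter
    "maxPrice": 500,          # nightly price ceiling filter
    "checkIn": "2022-07-01",  # optional check-in date filter
    "proxyConfiguration": {   # rotate residential IPs, per the video
        "useApifyProxy": True,
        "apifyProxyGroups": ["RESIDENTIAL"],
    },
}

print(len(scraper_input["startUrls"]))
```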
Since I wanted to get 500 results and I see 500 results here, I know that I'm ready to move on and download this data. We can see this also took two minutes and 44 seconds, which is awesome. If we had developed our own web scraping bot, this could have taken many hours or days, so it really shows that it's important to leverage tools that are already out there when possible.

Once this is done, we can click Storage, and here we can see we have several options: JSON, CSV, XML, Excel, HTML table, and RSS. If we wanted to just view this data right here, we could go to the HTML table and view it in another tab. Right away, we're able to see the columns as well as the rows of our dataset. We have information on the address, location name, nightly price, and number of guests, and if we go further down, we'll see information on the room type, stars, and the URL of the actual listing. Now, if you select more options, like getting calendar data as well as host data, you're going to see a lot more columns. This is why I highly suggest starting off very simply, with an easy use case, maybe a small city, and then expanding from there.

We have our data now, which is awesome. But what if we want to reference it in the future? Well, we can see that this is stored under runs. If we go to our dashboard, on the left-hand side we'll see Runs under Actors. We can see when we started a run, when we finished it, and the duration it took. We can go into the run itself and actually view what our original input was, what the log looked like, and then download the data. Please note that on the free plan Apify keeps the data for seven days, so if you need it for a longer period of time, I highly recommend upgrading. Also, if you want to have this automatically scheduled and then sent to a spreadsheet, for example, then I highly suggest using Zapier to connect these together.
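Since free-plan datasets expire after seven days, you may want to pull the export programmatically instead of clicking through Storage. A small sketch using Apify's v2 dataset-items endpoint; the dataset ID and token below are placeholders you would copy from the run's Storage tab and your account settings:

```python
def dataset_export_url(dataset_id: str, token: str, fmt: str = "csv") -> str:
    """Build an Apify v2 dataset export URL for the given format
    (csv, json, xlsx, ...)."""
    return (f"https://api.apify.com/v2/datasets/{dataset_id}/items"
            f"?format={fmt}&token={token}")

# Placeholders: copy the real values from your run's Storage tab.
url = dataset_export_url("MY_DATASET_ID", "MY_APIFY_TOKEN")
# Then download it, e.g.:
#   import urllib.request
#   urllib.request.urlretrieve(url, "siesta_key_listings.csv")
print(url)
```

A scheduled task plus this download step is one way to build the weekly price-trend dataset mentioned earlier without relying on Zapier.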
So as a recap, what we've been able to do is set up our web scraper with Apify using the Airbnb Scraper, input our own parameters, run it in under five minutes, and then view our data. Check out the next video to see how we actually analyze this information using Python. Thanks!
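The full analysis is covered in the next video, but as a quick taste, here is how the headline questions (average nightly rate, median guest count) could be answered once the export is loaded. The rows below are toy data, and the column names are assumptions; the real CSV headers may be nested (e.g. `pricing/rate/amount`-style keys):

```python
import statistics

# Toy rows mimicking the export's columns (names are assumptions; the
# real CSV from the scraper may use different, nested headers).
listings = [
    {"name": "Gulf-view condo", "price": 250, "guests": 4, "roomType": "Entire home/apt"},
    {"name": "Beach bungalow",  "price": 400, "guests": 6, "roomType": "Entire home/apt"},
    {"name": "Private room",    "price": 120, "guests": 2, "roomType": "Private room"},
]

avg_rate = statistics.mean(l["price"] for l in listings)
median_guests = statistics.median(l["guests"] for l in listings)
print(round(avg_rate, 2), median_guests)  # -> 256.67 4
```

The same two lines work unchanged on the full 500-row export once it is read into a list of dicts with `csv.DictReader` (after converting the price and guest fields to numbers).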

Previous: Analyze Short-Term Rental Airbnb Data in Python | Part 2

Next: 06-24-22 Tech in Real Estate News | Factors that Influence a Market