Todd Wilson

President at screen-scraper

At screen-scraper, we’ve been doing web data collection since 2002, so we’ve been around for a while. Within that time, we’ve worked in just about every industry you can imagine, from travel to e-commerce, and we do quite a bit with official public records.

We’re a relatively small company, mostly developers and engineers. That’s my background as well; I do a lot of engineering. We have our own products that handle data extraction, and we primarily utilize web data to power our screen-scraper product.

A main focus of ours is official records, because that’s a lot of what we do. There is a lot of value in these records: they could be court records, deeds, or local county records. They could even be public information that can be used for marketing purposes or background checks, for example.

When collecting public web data, experience and knowledge of how different sites behave, having seen different patterns, is key. So, in terms of what I think makes us appealing to potential clients, it’s our level of expertise and that we’ve been doing it for longer and better than just about anyone.

I feel like we’re very seasoned, and we’re very familiar with the tools that are available. Accessing and downloading these insights from a website is also the kind of thing that is a little bit more of an art than a science. If you’ve never done it before, it’s hard to come into it cold and just start doing it effectively.

Collecting web data and preparing it for analysis in-house is not easy. We do our best to gather and structure insights for our customers, but at times websites put measures in place to protect themselves from distributed denial-of-service (DDoS) attacks.

To respond to these countermeasures, we have to adjust parameters on our side, which involves writing millions of lines of code, all the while structuring the data we receive and staying vigilant for other data collection challenges that may arise. And it goes back and forth.

That’s where Bright Data comes in. If a website puts measures in place to discourage our access, we can utilize Bright Data’s platform to retrieve the public web data we need in the most efficient, reliable, and flexible manner, typically by way of Bright Data’s Residential, Datacenter, and Mobile proxy networks as well as the Web Unlocker. These allow us to research, monitor, and analyze the data and then pass along the valuable insights to our customers.
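In practice, proxy networks like these are typically used by routing each HTTP request through an authenticated proxy endpoint. The sketch below shows the general pattern with Python's `requests` library; the hostname, port, and credential format are placeholders, not verified Bright Data values, so you would substitute the details from your own account dashboard.

```python
def proxy_config(username: str, password: str,
                 host: str = "proxy.example.com", port: int = 22225) -> dict:
    """Build a requests-style `proxies` mapping for an authenticated proxy.

    The default host/port are placeholders; replace them with the
    endpoint and zone credentials from your proxy provider.
    """
    proxy_url = f"http://{username}:{password}@{host}:{port}"
    # requests uses the same proxy for both schemes here.
    return {"http": proxy_url, "https": proxy_url}


# Hypothetical usage (requires the third-party `requests` package):
#   import requests
#   resp = requests.get("https://example.com",
#                       proxies=proxy_config("my_user", "my_pass"),
#                       timeout=30)
#   print(resp.status_code)
```

Rotating credentials or endpoints between requests, rather than hard-coding one session, is what lets a scraper spread traffic across residential, datacenter, or mobile exit nodes.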

Bright Data is the premium service as far as web data collection and optimization services go. It’s like flipping an on-switch. They save us so much time. Instead of having to investigate, tweak, and troubleshoot, we have data that flows like water. It’s simply the best in terms of the extensive network it offers, the platform itself from a technical standpoint, the APIs, the flexibility that we have, and the superior customer service that we receive. It’s just the whole package.

As we move into 2022, it would surprise me if the need for data collection didn’t continue to grow, especially with more emphasis on incorporating big data strategies, which allow companies to analyze data, look for trends, and positively impact their bottom line.

More testimonials

Victor Bolu
CEO of Webautomation
As a market leader and with an award-winning infrastructure, the partnership with Bright Data allows us to extract public web data from millions of web pages a day without many challenges.
Shameel Abdulla
Co-founder and CEO of Clootrack
Bright Data’s infrastructure is as reliable as it gets. By relying on Bright Data for collecting and structuring public data for us, our customers get the best value for investing in our services.
Alejandro Lechuga
Co-founder and COO at DocFarma
Bright Data has proven to be the perfect partner thanks to its adaptability and efficiency in collecting large volumes of data. For us, it was very important to get day-to-day data in a short time and with the best quality.