AdRetreaver is a software development firm that builds tools for marketers.
Our products help our customers verify advertisement placements across multiple geo-locations, protect their copyrighted content against unauthorized use, and boost their original content, music, or business to appear and rank higher in search, giving them a market advantage over the competition.
Our main tool is the Ad Verifier. With this tool, our customers provide us with their advertisement copy, along with the location and other geo-specific details surrounding the content. From there, we scan to verify that the advertisements are displayed in the correct format for the geo-location of the materials.
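At its core, once a page can be retrieved the way a local user would see it, the verification step comes down to comparing the approved ad copy against what actually renders for that market. The sketch below is only an illustration of that idea, not our production code: the `Placement` structure, its field names, and the `fetch_page_as_seen_from` placeholder are hypothetical (one possible way to implement that placeholder is sketched later in this piece).

```python
# Illustrative sketch of a geo-aware ad verification pass (hypothetical
# structures and names; not AdRetreaver's actual implementation).
from dataclasses import dataclass


@dataclass
class Placement:
    page_url: str        # page where the advertisement should appear
    country: str         # ISO country code the creative targets
    expected_copy: str   # exact ad copy approved for that market


def fetch_page_as_seen_from(url: str, country: str) -> str:
    """Placeholder: return the page HTML as a user in `country` would see it.
    A possible proxy-based implementation is sketched later."""
    raise NotImplementedError


def verify_placements(placements: list[Placement]) -> list[Placement]:
    """Return the placements whose approved copy was NOT found on the page
    as a local user in the target country would see it."""
    failures = []
    for p in placements:
        html = fetch_page_as_seen_from(p.page_url, p.country)
        if p.expected_copy not in html:
            failures.append(p)
    return failures
```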
We are also moving into the data analytics space, where we will offer our customers tools for business intelligence, market analytics, and more. For example, we are going to build tools that help clients gather information on their competitors or place their ads more effectively, and provide them with actionable market intelligence they can use to make better-informed decisions.
Ad placement recognition is one of our market differentiators. Our main customers are marketers, such as ad agencies. These companies have multiple advertisement campaigns running worldwide simultaneously, and they need to verify that their advertisements are being displayed in the correct format and in the correct locations.
With our Ad Verifier, we’re actually able to go in and verify the advertisement right down to the city or location our customers ask for. We can make sure that they’re not saying anything that’s out of compliance and that they’re not promising something the client we’re representing can’t deliver.
This is especially important because advertising laws vary from country to country, meaning the content needs to be exact. The more ads an advertiser has running at one time, the harder this task becomes.
So for us, it’s a system of checks and balances for our customers, to make sure that the people they advertise with are doing exactly what was requested and that it falls within compliance. Because at the end of the day, if a violation occurs, the agencies are the ones who are going to pay.
In order to perform ad verification for our clients, we gather and leverage public web data from the public websites where our customers’ advertisements appear around the globe.
The ability to gather and analyze public web data at scale is vital to our operation, as web data gathering allows us to automatically verify that our customers’ advertisements are reaching their target destinations and that budget is not being wasted on placements that miss the mark.
Since this task would be impossible to perform manually, public web data provides the only practical avenue for verifying advertisements at scale, from location to location.
The internet is sectioned off from location to location. If we click on the same link, but I’m in London and you are in New York, we are going to be shown different content. Public web data helps us see what a real user in that location would see when arriving on a website, rather than a sectioned-off portion of it, which allows us to verify the content with 100% certainty.
But data has further importance to our organization. I’ll refer to a quote attributed to W. Edwards Deming: “In G-d we trust. All others must bring data,” which speaks to the importance of data measurement and analysis in doing business.
Data is what this world is based upon. The more data you have on a certain topic, the more valuable that topic becomes. So, we’re moving toward becoming a more data-driven company.
I think that within the next six months or so, the way we do business with agencies and advertisers is going to change, because we’re now able to take the data we already have, using the products we’ve built out, and help our customers target their audience in a more focused manner, based on variables such as demographics, age, and preferences.
In order to collect public web data, we use Bright Data’s extensive Residential proxy network, which gives us the option to gather data from as many different locations as possible.
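For context, a geo-targeted request through a residential proxy network typically looks something like the sketch below, which is one possible way to fill in the `fetch_page_as_seen_from` placeholder from the earlier sketch. The host, port, and the username convention for selecting an exit country are illustrative assumptions rather than confirmed values from our setup; the provider’s documentation for the specific zone defines the exact format.

```python
# Minimal sketch of fetching a page "as a local user" through a
# country-targeted residential proxy. Host, port, and the credential
# convention for choosing the exit country are assumptions/placeholders;
# consult the proxy provider's docs for the exact values of your zone.
import requests

PROXY_HOST = "brd.superproxy.io"    # placeholder proxy endpoint
PROXY_PORT = 22225                  # placeholder port
CUSTOMER_ID = "your_customer_id"    # placeholder credentials
ZONE = "your_residential_zone"
PASSWORD = "your_zone_password"


def fetch_page_as_seen_from(url: str, country: str) -> str:
    """Fetch `url` through a residential exit node in `country` (ISO code)."""
    # Country selection is commonly encoded in the proxy username
    # (an assumption here; the exact scheme is provider-specific).
    username = f"brd-customer-{CUSTOMER_ID}-zone-{ZONE}-country-{country.lower()}"
    proxy = f"http://{username}:{PASSWORD}@{PROXY_HOST}:{PROXY_PORT}"
    resp = requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=30)
    resp.raise_for_status()
    return resp.text


# Example: compare what London-area and New York-area visitors are served.
if __name__ == "__main__":
    for cc in ("gb", "us"):
        html = fetch_page_as_seen_from("https://example.com", cc)
        print(cc, len(html))
```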
We also recently began experimenting with the Bright Data Web Scraper IDE, which helps us automatically collect public web data from our target websites – reducing the time we spend on coding, adapting to website changes, and combatting the different blocking techniques those websites use.
Overall, we are very pleased with the partnership with Bright Data.
Everything’s been good: the network has been very stable, we’re happy with the customer service, and the support staff is second to none in our book.