Continuous High Performance With Bright Data’s Static Networks

Our free upgrade for Datacenter and Static Residential networks
Aviv Besinsky | Product Manager
26-Feb-2020

In this post I’ll describe some of the typical challenges that providers of Datacenter and Static Residential (‘static’) proxy networks face, and explain how a great new feature we developed – 100% uptime – solves them and delivers immediate value to our customers, free of charge and without requiring any changes to your code or workflow.

Providing and maintaining high-quality static proxy networks involves two types of challenges:

  • Internal – maintaining top-notch performance through good architecture and internal processes such as thorough monitoring, fast problem detection and reporting, and high-quality hardware, software, and resources across the networks.
  • External – any event outside the provider’s direct control that affects the performance of the service.

It’s fair to say we are pretty good at managing the internal challenges listed above; this is what allowed us to become a market leader in our field. In this article I’ll address a significant breakthrough in how we manage the external challenges: the new 100% uptime feature, which solves the two problems described below.

Connectivity

“Connectivity” is one word, but it covers the many players and processes that keep the internet running, chiefly data centers, connectivity providers (‘upstream’), and network carriers. Like any other tech service, they need to perform periodic maintenance and occasionally run into unexpected problems. Any such event, at any link in the chain, affects every downstream consumer of that connectivity.

GEO quality

High GEO quality means that all the main geolocation databases show the correct geolocation for an IP. Now you might be asking yourself – isn’t that always the case? Well, it’s not. It’s also an aspect most providers prefer to keep in the shadows, so that users never even hear about the concept of GEO quality. We take several actions to make sure all IPs in our network are of the highest GEO quality, including curating IPs before initially adding them and monitoring them continuously so problems are fixed quickly. But geolocation databases still make mistakes, whether through human error or poorly designed automated testing that relies on outdated sources, lowering the GEO quality of the affected IPs and potentially hurting your performance.
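To make the concept concrete, here is a minimal Python sketch of the kind of check involved: comparing what one public geolocation database reports for an IP against the country that IP was sold as. The ipinfo.io endpoint, the sample IP, and the expected country are illustrative assumptions, not part of Bright Data’s actual monitoring, and a production check would consult several databases rather than one.

```python
import requests

def check_geo_quality(ip: str, expected_country: str) -> bool:
    """Compare one geolocation database's answer for `ip` against
    the country the IP is supposed to resolve to. A real check
    would query several databases, since they can disagree or lag
    behind reassignments."""
    # ipinfo.io is one public geolocation source; others
    # (e.g. MaxMind, IP2Location) may report differently.
    resp = requests.get(f"https://ipinfo.io/{ip}/json", timeout=10)
    resp.raise_for_status()
    reported = resp.json().get("country")  # ISO code, e.g. "US"
    return reported == expected_country

if __name__ == "__main__":
    # Hypothetical IP and target country, for illustration only.
    if not check_geo_quality("203.0.113.7", "US"):
        print("GEO mismatch: a database disagrees with the sold GEO")
```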

The solution

The 100% uptime feature was built to prevent ‘external’ events from affecting the user. The idea is simple and works the same way for both problems described above: if our system detects a problem, such as a connectivity issue or an IP whose GEO is no longer exactly what you asked for when you bought it, we automatically route your requests through other IPs with exactly the same characteristics as the originals.
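The routing decision can be pictured with a short conceptual sketch. To be clear, this is not Bright Data’s actual implementation: the data structure, health flags, and function below are invented purely to illustrate the idea of transparently substituting an equivalent IP when the original has a connectivity or GEO problem.

```python
from dataclasses import dataclass

@dataclass
class ProxyIP:
    address: str
    country: str    # the GEO the IP was sold as
    healthy: bool   # connectivity checks are passing
    geo_ok: bool    # geolocation databases agree on the GEO

def pick_ip(requested: ProxyIP, equivalents: list[ProxyIP]) -> ProxyIP:
    """Serve the requested IP when it is fine; otherwise fall back
    to an IP with the same characteristics, so the user sees no
    difference in behavior."""
    if requested.healthy and requested.geo_ok:
        return requested
    for candidate in equivalents:
        if (candidate.country == requested.country
                and candidate.healthy and candidate.geo_ok):
            return candidate
    # No equivalent available: keep the original rather than fail.
    return requested
```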

At the same time, we wanted to make sure that customers who must use specific IPs are not affected, so we made two exceptions (a usage sketch follows the list):

  • If a specific IP is targeted in your request, we will not assign a fallback IP to it.
  • 100% uptime will not interrupt a live connection. If a fallback is needed, it kicks in once the next connection is established.
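As an example of the first exception, pinning a request to one specific IP is typically done through the proxy username. The sketch below assumes the `-ip-<address>` username suffix and the `brd.superproxy.io:22225` endpoint used by Bright Data zones; the customer ID, zone name, password, and target IP are placeholders, so check your own zone settings for the exact format.

```python
import requests

# Placeholder credentials, for illustration only.
CUSTOMER = "CUSTOMER_ID"
ZONE = "static_zone"
PASSWORD = "ZONE_PASSWORD"
TARGET_IP = "203.0.113.7"  # the specific IP to pin

# Appending `-ip-<address>` to the username targets that exact IP,
# so (per the exception above) no fallback IP will be assigned.
proxy = (f"http://brd-customer-{CUSTOMER}-zone-{ZONE}-ip-{TARGET_IP}"
         f":{PASSWORD}@brd.superproxy.io:22225")

resp = requests.get(
    "https://example.com",
    proxies={"http": proxy, "https": proxy},
    timeout=30,
)
print(resp.status_code)
```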

The 100% uptime feature brings immediate value by providing 100% connectivity and a continuously high level of performance, free of charge and without requiring any changes to your code or workflow.

Aviv Besinsky | Product Manager

Aviv is a lead product manager at Bright Data. He has been a driving force in taking data collection technology to the next level - developing technological solutions in the realms of data unblocking, static proxy networks, and more. Sharing his data crawling know-how is one of his many passions.
