Go Proxy Servers – Set Proxy Servers with Go Guide

In this article, you’ll learn how to set up a proxy server for web scraping in Go.

All internet interactions involve an IP address. Websites use this address to identify you and to determine your location and other metadata, such as your ISP, time zone, or device type. Web servers use this information to tailor or restrict content and resources. This means that when you're web scraping, websites can block requests originating from your IP address if they consider the traffic pattern or behavior unusual, bot-like, or malicious. Thankfully, proxy servers can help.

A proxy server is an intermediary server that acts as a gateway between a user and the internet. It receives requests from users, forwards them to the web resources, and then returns the fetched data to the users. A proxy server helps you browse and scrape discreetly by hiding your real IP address, enhancing security, privacy, and anonymity.

Proxy servers also help circumvent IP bans by changing your IP address, making it appear as though the requests are coming from different users. Proxy servers located in different regions enable you to access geospecific content, such as movies or news, bypassing geoblocking.

In this article, you’ll learn how to set up a proxy server for web scraping in Go. You’ll also learn about the Bright Data proxy servers and how they can help you simplify this process.

Set Up a Proxy Server

In this tutorial, you’ll learn how to modify a web scraper application written in Go to interact with the It’s FOSS website via a local or self-hosted proxy server. This tutorial assumes you already have your Go development environment set up.

To begin, you need to set up your proxy server using Squid, an open source proxy server software. If you're familiar with other proxy server software, you can use that instead. This tutorial uses Squid on a Fedora 39 Linux box. For most Linux distributions, Squid is included in the default repositories; you can also check the documentation to download the necessary packages for your operating system.

From your terminal, execute the following command to install Squid:

sudo dnf install squid -y

Once completed, start the service by executing the following command:

sudo systemctl enable --now squid

Check the status of the service using this command:

sudo systemctl status squid

The output should indicate that the service is active and running. By default, Squid listens for requests on port 3128. Use the following curl command to test communication via the proxy server:

curl --proxy 127.0.0.1:3128 "http://lumtest.com/myip.json"

Your response should look like this:

$ curl --proxy 127.0.0.1:3128 "http://lumtest.com/myip.json"
{"ip":"196.43.196.126","country":"GH","asn":{"asnum":327695,"org_name":"AITI"},"geo":{"city":"","region":"","region_name":"","postal_code":"","latitude":8.1,"longitude":-1.2,"tz":"Africa/Accra"}}

The response should include your public IP address, along with the country and the organization that owns it. This confirms that you have successfully installed a working proxy server.

Set Up the Demo Scraper

To make it easier for you to follow along, a simple Go web scraper application is available in this GitHub repository. The scraper captures the title, excerpt, and categories of the latest blog posts on It’s FOSS, a popular blog for discussing open-source software products. The scraper then visits Lumtest to obtain information about the IP address used by the scraper’s HTTP client to interact with the web. The same logic is implemented using three different Go packages: Colly, goquery, and Selenium. In the next section, you’ll learn how to modify each implementation to use a proxy server.

Start by cloning the repository by executing the following command in your favorite terminal/shell:

$ git clone https://github.com/rexfordnyrk/go_scrap_proxy.git

This repository consists of two branches: the main branch, which contains the completed code, and the basic branch, which contains the initial code you're going to modify. Use the following command to check out the basic branch:

$ git checkout basic

This branch contains three .go files, one for each library implementation of the scraper, without a proxy configured. It also contains an executable file, chromedriver, which is required by the Selenium implementation of the scraper:

.
├── chromedriver
├── colly.go
├── go.mod
├── goquery.go
├── go.sum
├── LICENSE
├── README.md
└── selenium.go

1 directory, 8 files

You can run any of them individually using the go run command with the specific file name. For instance, the following command runs the scraper with Colly:

go run ./colly.go 

Your output should look like this:

$ go run ./colly.go

Article 0: {"category":"Newsletter ✉️","excerpt":"Unwind your new year celebration with new open-source projects, and keep an eye on interesting distro updates.","title":"FOSS Weekly #24.02: Mixing AI With Linux, Vanilla OS 2, and More"}
Article 1: {"category":"Tutorial","excerpt":"Wondering how to use tiling windows on GNOME? Try the tiling assistant. Here's how it works.","title":"How to Use Tiling Assistant on GNOME Desktop?"}
Article 2: {"category":"Linux Commands","excerpt":"The free command in Linux helps you gain insights on system memory usage (RAM), and more. Here's how to make good use of it.","title":"Free Command Examples"}
Article 3: {"category":"Gaming 🎮","excerpt":"Here are the best tips to make your Linux gaming experience enjoyable.","title":"7 Tips and Tools to Improve Your Gaming Experience on Linux"}
Article 4: {"category":"Newsletter ✉️","excerpt":"The first edition of FOSS Weekly in the year 2024 is here. See, what's new in the new year.","title":"FOSS Weekly #24.01: Linux in 2024, GDM Customization, Distros You Missed Last Year"}
Article 5: {"category":"Tutorial","excerpt":"Wondering which init service your Linux system uses? Here's how to find it out.","title":"How to Check if Your Linux System Uses systemd"}
Article 6: {"category":"Ubuntu","excerpt":"Learn the logic behind each step you have to follow for adding an external repository in Ubuntu and installing packages from it.","title":"Installing Packages From External Repositories in Ubuntu [Explained]"}
Article 7: {"category":"Troubleshoot 🔬","excerpt":"Getting a warning that the boot partition has no space left? Here are some ways you can free up space on the boot partition in Ubuntu Linux.","title":"How to Free Up Space in /boot Partition on Ubuntu Linux?"}
Article 8: {"category":"Ubuntu","excerpt":"Wondering which Ubuntu version you're using? Here's how to check your Ubuntu version, desktop environment and other relevant system information.","title":"How to Check Ubuntu Version Details and Other System Information"}

Check Proxy IP map[asn:map[asnum:29614 org_name:VODAFONE GHANA AS INTERNATIONAL TRANSIT] country:GH geo:map[city:Accra latitude:5.5486 longitude:-0.2012 lum_city:accra lum_region:aa postal_code: region:AA region_name:Greater Accra Region tz:Africa/Accra] ip:197.251.144.148]

This output contains all the scraped article information from It’s FOSS. At the bottom of the output, you’ll find the returned IP information from Lumtest telling you about the current connection used by the scraper. Executing all three implementations should give you a similar response. Once you’ve tested all three, you’re ready to start scraping with a local proxy.

Implementing Scrapers with a Local Proxy

In this section, you'll learn about all three implementations of the scraper and modify them to use your proxy server. Each .go file consists of the main() function, where the application starts, and a scraping function (ScrapeWithColly(), ScrapeWithGoquery(), or ScrapeWithSelenium()) containing the instructions for scraping.

Using goquery with a Local Proxy

goquery is a library for Go that provides a set of methods and functionalities to parse and manipulate HTML documents, similar to how jQuery works for JavaScript. It’s particularly useful for web scraping as it allows you to traverse, query, and manipulate the structure of HTML pages. However, this library does not handle network requests or operations of any sort, which means you have to obtain and provide the HTML page to it.

If you navigate to the goquery.go file, you’ll find the goquery implementation of the web scraper. Open it in your favorite IDE or text editor.

Inside the ScrapeWithGoquery() function, you need to modify the HTTP client’s transport with your HTTP proxy server’s URL, which is a combination of the hostname or IP and port in the format http://HOST:PORT.

Be sure to import the net/url package in this file. Then replace the HTTP client definition with the following snippet:

...

func ScrapeWithGoquery() {
    // Define the URL of the proxy server
    proxyStr := "http://127.0.0.1:3128"

    // Parse the proxy URL
    proxyURL, err := url.Parse(proxyStr)
    if err != nil {
        fmt.Println("Error parsing proxy URL:", err)
        return
    }

    //Create an http.Transport that uses the proxy
    transport := &http.Transport{
        Proxy: http.ProxyURL(proxyURL),
    }

    // Create an HTTP client with the transport
    client := &http.Client{
        Transport: transport,
    }
    
... 

This snippet gives the HTTP client a transport configured to use the local proxy server. Make sure you replace the address with that of your own proxy server.

Now, run this implementation using the following command from the project directory:

go run ./goquery.go

Your output should look like this:

$ go run ./goquery.go

Article 0: {"category":"Newsletter ✉️","excerpt":"Unwind your new year celebration with new open-source projects, and keep an eye on interesting distro updates.","title":"FOSS Weekly #24.02: Mixing AI With Linux, Vanilla OS 2, and More"}
Article 1: {"category":"Tutorial","excerpt":"Wondering how to use tiling windows on GNOME? Try the tiling assistant. Here's how it works.","title":"How to Use Tiling Assistant on GNOME Desktop?"}
Article 2: {"category":"Linux Commands","excerpt":"The free command in Linux helps you gain insights on system memory usage (RAM), and more. Here's how to make good use of it.","title":"Free Command Examples"}
Article 3: {"category":"Gaming 🎮","excerpt":"Here are the best tips to make your Linux gaming experience enjoyable.","title":"7 Tips and Tools to Improve Your Gaming Experience on Linux"}
Article 4: {"category":"Newsletter ✉️","excerpt":"The first edition of FOSS Weekly in the year 2024 is here. See, what's new in the new year.","title":"FOSS Weekly #24.01: Linux in 2024, GDM Customization, Distros You Missed Last Year"}
Article 5: {"category":"Tutorial","excerpt":"Wondering which init service your Linux system uses? Here's how to find it out.","title":"How to Check if Your Linux System Uses systemd"}
Article 6: {"category":"Ubuntu","excerpt":"Learn the logic behind each step you have to follow for adding an external repository in Ubuntu and installing packages from it.","title":"Installing Packages From External Repositories in Ubuntu [Explained]"}
Article 7: {"category":"Troubleshoot 🔬","excerpt":"Getting a warning that the boot partition has no space left? Here are some ways you can free up space on the boot partition in Ubuntu Linux.","title":"How to Free Up Space in /boot Partition on Ubuntu Linux?"}
Article 8: {"category":"Ubuntu","excerpt":"Wondering which Ubuntu version you're using? Here's how to check your Ubuntu version, desktop environment and other relevant system information.","title":"How to Check Ubuntu Version Details and Other System Information"}

Check Proxy IP map[asn:map[asnum:29614 org_name:VODAFONE GHANA AS INTERNATIONAL TRANSIT] country:GH geo:map[city:Accra latitude:5.5486 longitude:-0.2012 lum_city:accra lum_region:aa postal_code: region:AA region_name:Greater Accra Region tz:Africa/Accra] ip:197.251.144.148]

Using Colly with a Local Proxy

Colly is a versatile and efficient web scraping framework for Go, known for its user-friendly API and seamless integration with HTML parsing libraries like goquery. Unlike goquery, it handles network operations itself and provides APIs for various network-related behaviors: asynchronous requests for high-speed scraping, local caching, rate limiting for responsible use of web resources, automatic handling of cookies and sessions, customizable user agents, and comprehensive error handling. It also supports proxy usage with proxy switching or rotation, and it can be extended for tasks like scraping JavaScript-generated content by integrating with headless browsers.

Open the colly.go file in your editor or IDE and paste the following lines of code right after initializing a new collector inside the ScrapeWithColly() function:

...
    // Define the URL of the proxy server
    proxyStr := "http://127.0.0.1:3128"
    // SetProxy sets a proxy for the collector
    if err := c.SetProxy(proxyStr); err != nil {
        log.Fatalf("Error setting proxy configuration: %v", err)
    }
    
...

This snippet uses Colly’s SetProxy() method to define the proxy server to be used by this collector instance for network requests.

Now, run this implementation using the following command from the project directory:

go run ./colly.go

Your output should look like this:

$ go run ./colly.go

Article 0: {"category":"Newsletter ✉️","excerpt":"Unwind your new year celebration with new open-source projects, and keep an eye on interesting distro updates.","title":"FOSS Weekly #24.02: Mixing AI With Linux, Vanilla OS 2, and More"}
Article 1: {"category":"Tutorial","excerpt":"Wondering how to use tiling windows on GNOME? Try the tiling assistant. Here's how it works.","title":"How to Use Tiling Assistant on GNOME Desktop?"}
Article 2: {"category":"Linux Commands","excerpt":"The free command in Linux helps you gain insights on system memory usage (RAM), and more. Here's how to make good use of it.","title":"Free Command Examples"}
Article 3: {"category":"Gaming 🎮","excerpt":"Here are the best tips to make your Linux gaming experience enjoyable.","title":"7 Tips and Tools to Improve Your Gaming Experience on Linux"}
Article 4: {"category":"Newsletter ✉️","excerpt":"The first edition of FOSS Weekly in the year 2024 is here. See, what's new in the new year.","title":"FOSS Weekly #24.01: Linux in 2024, GDM Customization, Distros You Missed Last Year"}
Article 5: {"category":"Tutorial","excerpt":"Wondering which init service your Linux system uses? Here's how to find it out.","title":"How to Check if Your Linux System Uses systemd"}
Article 6: {"category":"Ubuntu","excerpt":"Learn the logic behind each step you have to follow for adding an external repository in Ubuntu and installing packages from it.","title":"Installing Packages From External Repositories in Ubuntu [Explained]"}
Article 7: {"category":"Troubleshoot 🔬","excerpt":"Getting a warning that the boot partition has no space left? Here are some ways you can free up space on the boot partition in Ubuntu Linux.","title":"How to Free Up Space in /boot Partition on Ubuntu Linux?"}
Article 8: {"category":"Ubuntu","excerpt":"Wondering which Ubuntu version you're using? Here's how to check your Ubuntu version, desktop environment and other relevant system information.","title":"How to Check Ubuntu Version Details and Other System Information"}

Check Proxy IP map[asn:map[asnum:29614 org_name:VODAFONE GHANA AS INTERNATIONAL TRANSIT] country:GH geo:map[city:Accra latitude:5.5486 longitude:-0.2012 lum_city:accra lum_region:aa postal_code: region:AA region_name:Greater Accra Region tz:Africa/Accra] ip:197.251.144.148]

Using Selenium with a Local Proxy

Selenium is a tool that’s primarily used for automating web browser interactions in web application testing. It’s capable of performing tasks like clicking buttons, entering text, and extracting data from web pages, making it ideal for scraping web content with automated interactions. The mimicking of real user interactions is made possible via WebDriver, which Selenium uses to control browsers. While this example uses Chrome, Selenium also supports other browsers, including Firefox, Safari, and Internet Explorer.

The Selenium WebDriver service lets you provide a proxy and other configurations that influence the behavior of the underlying browser when interacting with the web, just like an actual browser. Programmatically, this is configured via the selenium.Capabilities{} definition.

To use Selenium with a local proxy, edit the selenium.go file inside ScrapeWithSelenium() and replace the selenium.Capabilities{} definition with the following snippet:

...

    // Define proxy settings
    proxy := selenium.Proxy{
        Type: selenium.Manual,
        HTTP: "127.0.0.1:3128", // Replace with your proxy settings
        SSL:  "127.0.0.1:3128", // Replace with your proxy settings
    }

    // Configuring the WebDriver instance with the proxy
    caps := selenium.Capabilities{
        "browserName": "chrome",
        "proxy":       proxy,
    }
    
...

This snippet defines the proxy settings for Selenium and uses them when configuring the WebDriver's capabilities. On the next execution, requests will go through the proxy connection.

Now, run the implementation using the following command from the project directory:

go run ./selenium.go

Your output should look like this:

$ go run ./selenium.go

Article 0: {"category":"Newsletter ✉️","excerpt":"Unwind your new year celebration with new open-source projects, and keep an eye on interesting distro updates.","title":"FOSS Weekly #24.02: Mixing AI With Linux, Vanilla OS 2, and More"}
Article 1: {"category":"Tutorial","excerpt":"Wondering how to use tiling windows on GNOME? Try the tiling assistant. Here's how it works.","title":"How to Use Tiling Assistant on GNOME Desktop?"}
Article 2: {"category":"Linux Commands","excerpt":"The free command in Linux helps you gain insights on system memory usage (RAM), and more. Here's how to make good use of it.","title":"Free Command Examples"}
Article 3: {"category":"Gaming 🎮","excerpt":"Here are the best tips to make your Linux gaming experience enjoyable.","title":"7 Tips and Tools to Improve Your Gaming Experience on Linux"}
Article 4: {"category":"Newsletter ✉️","excerpt":"The first edition of FOSS Weekly in the year 2024 is here. See, what's new in the new year.","title":"FOSS Weekly #24.01: Linux in 2024, GDM Customization, Distros You Missed Last Year"}
Article 5: {"category":"Tutorial","excerpt":"Wondering which init service your Linux system uses? Here's how to find it out.","title":"How to Check if Your Linux System Uses systemd"}
Article 6: {"category":"Ubuntu","excerpt":"Learn the logic behind each step you have to follow for adding an external repository in Ubuntu and installing packages from it.","title":"Installing Packages From External Repositories in Ubuntu [Explained]"}
Article 7: {"category":"Troubleshoot 🔬","excerpt":"Getting a warning that the boot partition has no space left? Here are some ways you can free up space on the boot partition in Ubuntu Linux.","title":"How to Free Up Space in /boot Partition on Ubuntu Linux?"}
Article 8: {"category":"Ubuntu","excerpt":"Wondering which Ubuntu version you're using? Here's how to check your Ubuntu version, desktop environment and other relevant system information.","title":"How to Check Ubuntu Version Details and Other System Information"}

Check Proxy IP {"ip":"197.251.144.148","country":"GH","asn":{"asnum":29614,"org_name":"VODAFONE GHANA AS INTERNATIONAL TRANSIT"},"geo":{"city":"Accra","region":"AA","region_name":"Greater Accra Region","postal_code":"","latitude":5.5486,"longitude":-0.2012,"tz":"Africa/Accra","lum_city":"accra","lum_region":"aa"}}

While you can maintain a proxy server yourself, doing so has limits: you'd need to set up a new server for each region you want to appear from, and you'd be responsible for ongoing maintenance and security.

Bright Data Proxy Servers

Bright Data offers an award-winning global proxy network infrastructure with a comprehensive set of proxy servers and services that can be used for various web data-gathering purposes.

With the extensive global network of Bright Data proxy servers, you can easily access and collect data from various international locations. Bright Data also provides a range of proxy types, including over 350 million unique residential, ISP, datacenter, and mobile proxies, each offering unique benefits, like legitimacy, speed, and reliability, for specific web data-gathering tasks.

Additionally, the Bright Data proxy rotation system ensures high anonymity and minimizes detection, making it ideal for continuous and large-scale web data collection.

Setting Up a Residential Proxy with Bright Data

It's easy to obtain a residential proxy with Bright Data. All you have to do is sign up for a free trial. Once you've signed up, open your dashboard and click the Get started button for Residential Proxies.

You'll be prompted to fill in a configuration form.

Provide a name for this instance; here, it's my_go_demo_proxy. Specify the IP type to be provisioned (select Shared if you want to use shared proxies), then choose the geolocation level you'd like to mimic when accessing web content; by default, this is the Country level. Finally, specify whether you want the web pages you request cached. For now, turn caching off.

After filling out this information, click Add to create and make provisions for your residential proxy.

Next, you need to activate your residential proxy. However, as a new user, you'll first be asked to provide your billing information. Once you've completed that step, navigate to your dashboard, click on the residential proxy you just created, and make sure the Access parameters tab is selected.

Here, you’ll find the various parameters needed to use the residential proxy, such as the host, port, and authentication credentials. You’ll need this information soon.

Now, it's time to integrate your Bright Data residential proxy with all three implementations of the scraper. The process is similar to what you did for the local server, except you'll also include authentication. Additionally, since you're interacting with the web programmatically, you can't review and accept SSL certificates from the proxy server the way you would in a browser with a graphical user interface. You therefore need to disable SSL certificate verification on your web client programmatically so that your requests aren't interrupted.

Begin by creating a directory called brightdata in the project directory and copy the three .go files into the brightdata directory. Your directory structure should look like this:

.
├── brightdata
│   ├── colly.go
│   ├── goquery.go
│   └── selenium.go
├── chromedriver
├── colly.go
├── go.mod
├── goquery.go
├── go.sum
├── LICENSE
├── README.md
└── selenium.go

2 directories, 11 files

Going forward, you’ll be modifying the files in the brightdata directory.

Using goquery with a Bright Data Residential Proxy

Inside the ScrapeWithGoquery() function, you need to modify the proxyStr variable to include the authentication credentials in the proxy URL in the format http://USERNAME:PASSWORD@HOST:PORT. Replace the current definition with the following snippet:

...

func ScrapeWithGoquery() {
    // Define the proxy server with username and password
    proxyUsername := "username" //Your residential proxy username 
    proxyPassword := "your_password" //Your Residential Proxy password here
    proxyHost := "server_host" //Your Residential Proxy Host
    proxyPort := "server_port"  //Your Port here
    
    proxyStr := fmt.Sprintf("http://%s:%s@%s:%s", url.QueryEscape(proxyUsername), url.QueryEscape(proxyPassword), proxyHost, proxyPort)
    
    // Parse the proxy URL
...

Then you need to modify the HTTP client’s transport with a configuration to ignore verifying the SSL/TLS certificate of the proxy server. Start by adding the crypto/tls package to your imports. Then replace the http.Transport definition with the following snippet after parsing the proxy URL:

...

func ScrapeWithGoquery() {
    
    // Parse the proxy URL
...

    //Create an http.Transport that uses the proxy
    transport := &http.Transport{
        Proxy: http.ProxyURL(proxyURL),
        TLSClientConfig: &tls.Config{
            InsecureSkipVerify: true, // Disable SSL certificate verification
        },
    }

    // Create an HTTP client with the transport
... 

This snippet configures the transport to skip verification of the proxy server's SSL/TLS certificate so that requests through the Bright Data proxy aren't interrupted.

Then run this implementation using the following command from the project directory:

go run brightdata/goquery.go 

Your output should look like this:

$ go run brightdata/goquery.go 

Article 0: {"category":"Newsletter ✉️","excerpt":"Open source rival to Twitter, a hyped new terminal and a cool new Brave/Chrome feature among many other things.","title":"FOSS Weekly #24.07: Fedora Atomic Distro, Android FOSS Apps, Mozilla Monitor Plus and More"}
Article 1: {"category":"Explain","excerpt":"Intel makes things confusing, I guess. Let's try making the processor naming changes simpler.","title":"Intel Processor Naming Changes: All You Need to Know"}
Article 2: {"category":"Linux Commands","excerpt":"The Cut command lets you extract a part of the file to print without affecting the original file. Learn more here.","title":"Cut Command Examples"}
Article 3: {"category":"Raspberry Pi","excerpt":"A UART attached to your Raspberry Pi can help you troubleshoot issues with your Raspberry Pi. Here's what you need to know.","title":"Using a USB Serial Adapter (UART) to Help Debug Your Raspberry Pi"}
Article 4: {"category":"Newsletter ✉️","excerpt":"Damn Small Linux resumes development after 16 years.","title":"FOSS Weekly #24.06: Ollama AI, Zorin OS Upgrade, Damn Small Linux, Sudo on Windows and More"}
Article 5: {"category":"Tutorial","excerpt":"Zorin OS now provides a way to upgrade to a newer major version. Here's how to do that.","title":"How to upgrade to Zorin OS 17"}
Article 6: {"category":"Ubuntu","excerpt":"Learn the logic behind each step you have to follow for adding an external repository in Ubuntu and installing packages from it.","title":"Installing Packages From External Repositories in Ubuntu [Explained]"}
Article 7: {"category":"Troubleshoot 🔬","excerpt":"Getting a warning that the boot partition has no space left? Here are some ways you can free up space on the boot partition in Ubuntu Linux.","title":"How to Free Up Space in /boot Partition on Ubuntu Linux?"}
Article 8: {"category":"Ubuntu","excerpt":"Wondering which Ubuntu version you’re using? Here’s how to check your Ubuntu version, desktop environment and other relevant system information.","title":"How to Check Ubuntu Version Details and Other System Information"}

Check Proxy IP map[asn:map[asnum:7922 org_name:COMCAST-7922] country:US geo:map[city:Crown Point latitude:41.4253 longitude:-87.3565 lum_city:crownpoint lum_region:in postal_code:46307 region:IN region_name:Indiana tz:America/Chicago] ip:73.36.77.244]

You'll notice that even though you're scraping the same articles, the proxy IP check returned different information, indicating that the requests now appear to come from a different location or country.

Using Colly with a Bright Data Residential Proxy

While Colly doesn't provide a method for programmatically disabling SSL/TLS verification, it does let you supply your own transport to be used by its HTTP client.

With the colly.go file opened in your editor or IDE, paste the following lines of code after initializing a new collector inside the ScrapeWithColly() function (don’t forget to add the net/url and net/http imports):

...
func ScrapeWithColly() {
    ...
    
    //Create an http.Transport that uses the proxy
    transport := &http.Transport{
        TLSClientConfig: &tls.Config{
            InsecureSkipVerify: true, // Disable SSL certificate verification
        },
    }
    
    // Set the collector instance to use the configured transport
    c.WithTransport(transport)
    
    
...

This snippet defines an HTTP transport with SSL verification disabled and uses the Colly WithTransport() method to set the collector’s transport for network requests.

Modify the proxyStr variable to contain the residential proxy credentials (just like you did for goquery). Replace the proxyStr line with the following snippet:

...

    // Define the proxy server with username and password
    proxyUsername := "username" //Your residential proxy username 
    proxyPassword := "your_password" //Your Residential Proxy password here
    proxyHost := "server_host" //Your Residential Proxy Host
    proxyPort := "server_port"  //Your Port here

    proxyStr := fmt.Sprintf("http://%s:%s@%s:%s", url.QueryEscape(proxyUsername), url.QueryEscape(proxyPassword), proxyHost, proxyPort)

...

Don’t forget to replace the string values with the ones from the Access parameters page of your residential proxy.

Next, run this implementation using the following command from the project directory:

go run brightdata/colly.go

Your output should look like this:

$ go run brightdata/colly.go
…

Check Proxy IP map[asn:map[asnum:2856 org_name:British Telecommunications PLC] country:GB geo:map[city:Turriff latitude:57.5324 longitude:-2.3883 lum_city:turriff lum_region:sct postal_code:AB53 region:SCT region_name:Scotland tz:Europe/London] ip:86.180.236.254]

In the “Check Proxy IP” part of the output, you’ll notice the change of country even though the same credentials are being used.

Using Selenium with a Bright Data Residential Proxy

When working with Selenium, you have to modify the selenium.Proxy{} definition to use the proxy URL string with the credentials. Replace the current proxy definition with the following:

...

    // Define the proxy server with username and password
    proxyUsername := "username"      //Your residential proxy username
    proxyPassword := "your_password" //Your Residential Proxy password here
    proxyHost := "server_host"       //Your Residential Proxy Host
    proxyPort := "server_port"       //Your Port here

    proxyStr := fmt.Sprintf("http://%s:%s@%s:%s", url.QueryEscape(proxyUsername), url.QueryEscape(proxyPassword), proxyHost, proxyPort)

    // Define proxy settings
    proxy := selenium.Proxy{
        Type: selenium.Manual,
        HTTP: proxyStr,
        SSL:  proxyStr,
    }
    
...

Don’t forget to import the net/url package.

This snippet defines the individual proxy parameters and merges them into the proxy URL used in the proxy configuration.

Now, the Chrome WebDriver needs to be configured with options to disable SSL verification while using the residential proxy, as was done for the previous implementations. To do so, modify the chromeCaps definition arguments to include the --ignore-certificate-errors option like this:

... 
    caps.AddChrome(chrome.Capabilities{Args: []string{
        "--headless=new", // Start browser without UI as a background process
        "--ignore-certificate-errors", // Disable SSL certificate verification
    }})
...

By default, Selenium does not support authenticated proxy configuration. However, you can get around this using a small package to build a Chrome extension for an authenticated proxy connection.

First, add the package to your project using this go get command:

go get github.com/rexfordnyrk/proxyauth

Then, import the package into the brightdata/selenium.go file by adding the line "github.com/rexfordnyrk/proxyauth" into the import block at the top of the file.

Next, you need to build the Chrome extension using the BuildExtention() function from the proxyauth package, passing it your Bright Data residential proxy credentials. To do so, paste the following code snippet after the chromeCaps definition but before the caps.AddChrome(chromeCaps) line:

…
    //Building proxy auth extension using BrightData Proxy credentials
    extension, err := proxyauth.BuildExtention(proxyHost, proxyPort, proxyUsername, proxyPassword)
    if err != nil {
        log.Fatal("BuildProxyExtension Error:", err)
    }

    //including the extension to allow proxy authentication in chrome
    if err := chromeCaps.AddExtension(extension); err != nil {
        log.Fatal("Error adding Extension:", err)
    }

…

This snippet creates a Chrome extension and adds it to the Chrome WebDriver to enable authenticated web requests through the provided proxy credentials.

You can run this implementation using the following command from the project directory:

go run brightdata/selenium.go

Your output should look like this:

$ go run brightdata/selenium.go 

Article 0: {"categoryText":"Newsletter ✉️","excerpt":"Check out the promising new features in Ubuntu 24.04 LTS and a new immutable distro.","title":"FOSS Weekly #24.08: Ubuntu 24.04 Features, Arkane Linux, grep, Fedora COSMIC and More"}
…
Article 8: {"categoryText":"Ubuntu","excerpt":"Wondering which Ubuntu version you’re using? Here’s how to check your Ubuntu version, desktop environment and other relevant system information.","title":"How to Check Ubuntu Version Details and Other System Information"}

Check Proxy IP {"ip":"176.45.169.166","country":"SA","asn":{"asnum":25019,"org_name":"Saudi Telecom Company JSC"},"geo":{"city":"Riyadh","region":"01","region_name":"Riyadh Region","postal_code":"","latitude":24.6869,"longitude":46.7224,"tz":"Asia/Riyadh","lum_city":"riyadh","lum_region":"01"}}

Once again, if you look at the IP information at the bottom of the output, you'll notice the request was sent from yet another country. This is the Bright Data proxy rotation system in action.

As you can see, using Bright Data in your Go application is easy. First, you create the residential proxy on the Bright Data platform and obtain your credentials. Then you use that information to configure your code to route its web requests through the proxy.

Conclusion

Proxy servers are a crucial tool for shaping how your traffic appears on the internet. In this article, you learned what proxy servers are and how to set up a self-hosted proxy server using Squid. You also learned how to integrate a local proxy server into your Go applications, a web scraper in this case.

If you’re interested in working with proxy servers, you should consider using Bright Data. Its state-of-the-art proxy network can help you quickly gather data without worrying about any additional infrastructure or maintenance.