cURL
Command-line cURL examples for quickly testing and debugging Residential Proxy configurations.
1. Basic Request
The simplest way to test Residential Proxy connectivity. This command routes your request through a US proxy and returns the proxy's IP address.
# Test proxy connection
curl -x "user-country-us:pass123@network.mrproxy.com:10000" https://api.ipify.orgUse Case: Quick connectivity testing and verifying your proxy credentials are working correctly.
2. Static Session Request
Demonstrates session-based proxying where the same session ID (sessid-test1) maintains the same IP address across multiple requests.
# Same IP across multiple requests
curl -x "user-country-us-sessid-test1:pass123@network.mrproxy.com:10000" https://example.com
# Verify same IP
curl -x "user-country-us-sessid-test1:pass123@network.mrproxy.com:10000" https://api.ipify.orgUse Case: Testing scenarios where you need to maintain session state, like logging into websites or simulating a single user's browsing session.
3. Rotating Proxy Test
This loop demonstrates IP rotation by making multiple requests without a session ID. Each request will come from a different IP address.
# Test IP rotation - each request gets different IP
for i in {1..5}; do
  echo "Request $i:"
  curl -x "user-country-uk:pass123@network.mrproxy.com:10000" https://api.ipify.org
  echo ""
done

Use Case: Verifying that IP rotation is working correctly and understanding the different IPs available in your proxy pool.
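To quantify the rotation, pipe the loop's output through sort -u; if rotation is working, the count of distinct IPs should approach the number of requests. A sketch building on the loop above:

# Count distinct IPs across five rotating requests
for i in {1..5}; do
  curl -s -x "user-country-uk:pass123@network.mrproxy.com:10000" https://api.ipify.org
  echo ""
done | sort -u | wc -l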
4. Custom Headers
Shows how to combine proxy usage with custom HTTP headers to appear more like a real browser from a specific region.
# Add custom user agent and headers
curl -x "user-country-de:pass123@network.mrproxy.com:10000" \
-H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64)" \
-H "Accept-Language: de-DE" \
https://example.comUse Case: Scraping geo-specific content or avoiding detection by matching your headers to your proxy's location.
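Note that headers added with -H are sent to the target site, not the proxy. cURL has a separate --proxy-header option for headers aimed at the proxy itself during the CONNECT request; whether the proxy acts on them depends on the provider. A sketch:

# -H goes to the target site; --proxy-header goes to the proxy during CONNECT
curl -x "user-country-de:pass123@network.mrproxy.com:10000" \
  --proxy-header "Proxy-Connection: Keep-Alive" \
  -H "Accept-Language: de-DE" \
  https://example.com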
5. Save Response
Captures the complete HTTP response body and saves it to a local file for later analysis or processing.
# Save scraped content to file
curl -x "user-country-jp-sessid-save1:pass123@network.mrproxy.com:10000" \
  -o output.html \
  https://example.com

Use Case: Batch downloading of pages, creating local copies of websites, or saving responses for offline analysis.
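It can also be worth keeping the response headers for later inspection; cURL's -D/--dump-header flag writes them to a separate file alongside the body:

# Save the body and the response headers to separate files
curl -x "user-country-jp-sessid-save1:pass123@network.mrproxy.com:10000" \
  -D headers.txt \
  -o output.html \
  https://example.com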
6. Verbose Output
Enables detailed logging to troubleshoot connection issues, see SSL handshakes, and understand the complete request/response flow.
# Debug proxy connection with verbose output
curl -v -x "user-country-fr:pass123@network.mrproxy.com:10000" \
  https://api.ipify.org

Use Case: Debugging proxy connection problems, understanding SSL certificate issues, or verifying that requests are properly routed through the proxy.
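When the full verbose trace is too noisy, cURL's -w/--write-out flag can print just the timing figures, which is often enough to spot a slow proxy hop. A sketch:

# Print a compact timing breakdown instead of the full trace
curl -x "user-country-fr:pass123@network.mrproxy.com:10000" \
  -s -o /dev/null \
  -w "connect: %{time_connect}s  TLS: %{time_appconnect}s  total: %{time_total}s\n" \
  https://api.ipify.org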
7. POST Request
Demonstrates sending POST requests with JSON data through a proxy, useful for API interactions and form submissions.
# POST request through proxy
curl -x "user-country-ca-sessid-post1:pass123@network.mrproxy.com:10000" \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{"key":"value"}' \
  https://api.example.com/endpoint

Use Case: Testing APIs through proxies, submitting forms, or sending data to web services while maintaining anonymity or bypassing geo-restrictions.
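For larger payloads, -d can read the body from a file using the @ prefix. A sketch assuming a hypothetical local payload.json:

# Read the JSON body from a file instead of inlining it
curl -x "user-country-ca-sessid-post1:pass123@network.mrproxy.com:10000" \
  -X POST \
  -H "Content-Type: application/json" \
  -d @payload.json \
  https://api.example.com/endpoint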
8. Custom Session Duration
Controls how long your proxy session remains active using the sesstime parameter, extending beyond the default 10-minute session time.
# 30-minute session
curl -x "user-country-au-sessid-long1-sesstime-30:pass123@network.mrproxy.com:10000" \
  https://example.com

Use Case: Long-running scripts that need to maintain the same IP for extended periods, such as multi-step authentication flows or comprehensive site crawling.
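The inverse is also useful: generating a fresh session ID forces a new IP on demand. A bash sketch, assuming session IDs can be any short alphanumeric label like the test1 and long1 examples above:

# Generate a disposable session ID to force a fresh IP
SESSION="job$RANDOM"
curl -x "user-country-au-sessid-$SESSION-sesstime-30:pass123@network.mrproxy.com:10000" \
  https://example.com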
9. Multiple Requests Script
A practical bash script that demonstrates bulk scraping with status monitoring and rate limiting. Each URL gets a different IP due to proxy rotation.
#!/bin/bash
# Script to scrape multiple URLs with rotating proxy
PROXY="user-country-us:pass123@network.mrproxy.com:10000"
urls=(
  "https://example.com/page1"
  "https://example.com/page2"
  "https://example.com/page3"
  "https://example.com/page4"
  "https://example.com/page5"
)

for url in "${urls[@]}"; do
  echo "Scraping: $url"
  curl -x "$PROXY" -s "$url" -o /dev/null -w "Status: %{http_code}\n"
  sleep 1 # Delay between requests
done

Use Case: Batch processing multiple URLs with different IPs, monitoring success rates, and implementing respectful scraping practices with delays.
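For larger batches, the same pattern scales by reading URLs from a file and letting xargs run several transfers in parallel (the -P flag is available in GNU and BSD xargs, though not strictly POSIX). A sketch assuming a hypothetical urls.txt with one URL per line:

# Read URLs from a file and run three transfers in parallel
PROXY="user-country-us:pass123@network.mrproxy.com:10000"
xargs -P 3 -I{} \
  curl -x "$PROXY" -s -o /dev/null -w "{} -> %{http_code}\n" {} \
  < urls.txt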
10. Error Handling
Production-ready error handling with timeouts and status code checking. Essential for reliable automated scraping operations.
# Check connection and handle errors
if curl -x "user-country-nl:pass123@network.mrproxy.com:10000" \
  --connect-timeout 30 \
  --max-time 60 \
  -f -s -o /dev/null https://example.com; then
  echo "Success"
else
  echo "Failed with exit code: $?"
fi

Use Case: Building robust automation scripts that can detect and handle network failures, timeouts, and HTTP errors gracefully.
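cURL can also retry on its own before your script ever sees an error; --retry covers transient failures such as timeouts and HTTP 429/5xx responses, not every error. A sketch combining retries with the checks above:

# Let cURL retry transient failures before giving up
curl -x "user-country-nl:pass123@network.mrproxy.com:10000" \
  --connect-timeout 30 \
  --retry 3 --retry-delay 2 \
  -f -s -o /dev/null https://example.com \
  && echo "Success" || echo "Failed with exit code: $?"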
Useful cURL Options
| Option | Description |
|---|---|
| -x, --proxy | Specify proxy server |
| -H, --header | Add custom header |
| -o, --output | Save output to file |
| -v, --verbose | Show detailed connection info |
| -s, --silent | Silent mode (no progress bar) |
| -f, --fail | Fail silently on HTTP errors |
| --connect-timeout | Maximum time for connection |
| --max-time | Maximum time for entire operation |
| -L, --location | Follow redirects |
| -A, --user-agent | Set user agent string |
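These options combine freely. A typical production-style request, using the same example credentials as above, might look like:

# Follow redirects, set a browser UA, enforce timeouts, fail quietly
curl -x "user-country-us:pass123@network.mrproxy.com:10000" \
  -L \
  -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" \
  --connect-timeout 30 \
  --max-time 60 \
  -f -s \
  -o page.html \
  https://example.com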