Use Cases & Examples
Practical examples and code snippets for common proxy use cases.
Common Use Cases
E-commerce & Price Monitoring
Track competitor prices, monitor product availability, and aggregate market data.
Benefits:
- Real-time price tracking across multiple retailers
- Avoid IP blocks with residential IPs
- Access region-specific pricing
- Monitor stock levels and availability
Market Research & Data Collection
Collect market intelligence, analyze trends, and gather competitive insights.
Benefits:
- Scrape review sites and forums
- Collect social media sentiment data
- Monitor brand mentions across the web
- Analyze competitor strategies
SEO & SERP Monitoring
Track search engine rankings, monitor SERP features, and verify ad placements.
Benefits:
- Check rankings from different locations
- Monitor local search results
- Track featured snippets and SERP features
- Verify ad placements and costs
Ad Verification
Verify ad placements, detect fraud, and ensure campaign compliance.
Benefits:
- Verify ads display correctly in different regions
- Detect click fraud and bot traffic
- Monitor competitor ad strategies
- Ensure brand safety
Social Media Management
Manage multiple accounts, automate engagement, and gather social insights.
Benefits:
- Manage accounts from different locations
- Avoid platform detection and bans
- Maintain consistent IP per account
- Automate posting and engagement
Example 1: Web Scraping
Collect product data from e-commerce sites with rotating proxies to avoid detection:
```python
import requests
from bs4 import BeautifulSoup
import time

# Your PepeProxy credentials
PROXY = 'http://username:password@us-01.pepeproxy.com:2333'

proxies = {
    'http': PROXY,
    'https': PROXY
}

def scrape_product(url):
    """Scrape product information from an e-commerce site."""
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'
    }
    try:
        response = requests.get(
            url,
            proxies=proxies,
            headers=headers,
            timeout=30
        )
        response.raise_for_status()

        soup = BeautifulSoup(response.content, 'html.parser')
        product = {
            'title': soup.find('h1', class_='product-title').text.strip(),
            'price': soup.find('span', class_='price').text.strip(),
            'availability': soup.find('div', class_='stock').text.strip()
        }
        return product
    except requests.exceptions.RequestException as e:
        print(f"Error scraping {url}: {e}")
        return None

# Scrape multiple products with delays
product_urls = [
    'https://example.com/product/1',
    'https://example.com/product/2',
    'https://example.com/product/3'
]

for url in product_urls:
    product = scrape_product(url)
    if product:
        print(f"Found: {product['title']} - {product['price']}")
    time.sleep(2)  # Be respectful with delays
```

Best Practices for Web Scraping
- Use rotating proxies to distribute requests across many IPs
- Add delays between requests (2-5 seconds minimum)
- Use realistic User-Agent headers
- Respect robots.txt and rate limits
- Handle errors gracefully with retry logic
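The first recommendation, rotating proxies, can be sketched with a simple round-robin pool. The hostnames below are illustrative placeholders in the style of the example above; substitute your own endpoints and credentials:

```python
import itertools

# Hypothetical pool of proxy endpoints -- replace with your own credentials/hosts
PROXY_POOL = [
    'http://username:password@us-01.pepeproxy.com:2333',
    'http://username:password@us-02.pepeproxy.com:2333',
    'http://username:password@de-01.pepeproxy.com:2333',
]

# Cycle through the pool so consecutive requests go out via different IPs
proxy_cycle = itertools.cycle(PROXY_POOL)

def next_proxies():
    """Return a requests-style proxies dict for the next proxy in rotation."""
    proxy = next(proxy_cycle)
    return {'http': proxy, 'https': proxy}
```

Pass `proxies=next_proxies()` into each `requests.get()` call so successive requests are distributed across the pool.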
Example 2: Price Monitoring
Monitor competitor prices and get alerted when prices drop below your target:
```typescript
import axios from 'axios';
import { HttpsProxyAgent } from 'https-proxy-agent';

const PROXY = 'http://username:password@us-01.pepeproxy.com:2333';
const agent = new HttpsProxyAgent(PROXY);

interface Product {
  url: string;
  name: string;
  targetPrice: number;
}

async function checkPrice(product: Product): Promise<void> {
  try {
    const response = await axios.get(product.url, {
      httpsAgent: agent,
      headers: {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
      },
      timeout: 30000,
    });

    // Parse price from response (the dollar sign must be escaped in the regex)
    const priceMatch = response.data.match(/\$([\d,]+\.\d{2})/);
    if (priceMatch) {
      // Strip every thousands separator, not just the first one
      const currentPrice = parseFloat(priceMatch[1].replace(/,/g, ''));
      console.log(`${product.name}: $${currentPrice}`);

      if (currentPrice <= product.targetPrice) {
        console.log(`Price alert! ${product.name} is now $${currentPrice}`);
        // Send notification (email, Telegram, etc.)
      }
    }
  } catch (error) {
    console.error(`Error checking ${product.name}:`, (error as Error).message);
  }
}

// Monitor multiple products
const products: Product[] = [
  {
    url: 'https://example.com/product/laptop',
    name: 'Gaming Laptop',
    targetPrice: 999,
  },
  {
    url: 'https://example.com/product/phone',
    name: 'Smartphone',
    targetPrice: 699,
  },
];

// Check prices every hour
setInterval(async () => {
  console.log('Checking prices...');
  for (const product of products) {
    await checkPrice(product);
    await new Promise((resolve) => setTimeout(resolve, 3000)); // 3s delay
  }
}, 3600000); // 1 hour
```

Best Practices for Price Monitoring
- Use rotating proxies to check from different IPs
- Set appropriate check intervals (hourly or daily)
- Store historical price data for trend analysis
- Implement notification systems (email, SMS, Telegram)
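Storing historical price data, as recommended above, needs nothing more than a small database. This is a minimal sketch using Python's standard-library `sqlite3`; the table name and schema are illustrative choices, not part of any PepeProxy API:

```python
import sqlite3
import time

def record_price(db, name, price):
    """Append one price observation with a timestamp."""
    db.execute(
        'CREATE TABLE IF NOT EXISTS price_history '
        '(name TEXT, price REAL, checked_at REAL)'
    )
    db.execute(
        'INSERT INTO price_history VALUES (?, ?, ?)',
        (name, price, time.time()),
    )
    db.commit()

def price_trend(db, name):
    """Return all recorded prices for a product, oldest first."""
    rows = db.execute(
        'SELECT price FROM price_history WHERE name = ? '
        'ORDER BY checked_at, rowid',
        (name,),
    )
    return [price for (price,) in rows]

# In-memory database for illustration; use a file path in practice
db = sqlite3.connect(':memory:')
record_price(db, 'Gaming Laptop', 1099.00)
record_price(db, 'Gaming Laptop', 999.00)
```

With history in place, `price_trend()` gives you the series to analyze for trends or to trigger alerts on drops.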
Proxy Configuration Recommendations
For Web Scraping
- Protocol: HTTP/HTTPS
- Session: Rotating
- Location: Country or City based on target
- Quantity: 5-20 proxies in rotation
For Account Management
- Protocol: HTTP/HTTPS
- Session: Sticky (10-20 min)
- Location: City-level matching account location
- Quantity: 1 proxy per account
For SEO Monitoring
- Protocol: HTTP/HTTPS
- Session: Rotating
- Location: City-level for local SEO
- Quantity: Multiple proxies for different locations
For Ad Verification
- Protocol: HTTP/HTTPS
- Session: Sticky (5-10 min)
- Location: City-level with ISP targeting
- Quantity: Multiple proxies per location/ISP combo
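The rotating-vs-sticky distinction above usually comes down to how the proxy username is formed. Many residential providers pin a sticky session by appending a session ID to the username; the `username-session-<id>` convention below is an assumption for illustration, so check your PepeProxy dashboard for the exact format:

```python
import uuid

HOST = 'us-01.pepeproxy.com:2333'  # endpoint from the examples above

def proxy_url(username, password, session_id=None):
    """Build a proxy URL; a session_id pins requests to one IP (sticky).

    NOTE: the 'username-session-<id>' suffix is an assumed convention --
    verify the real sticky-session format in your provider's docs.
    """
    user = f'{username}-session-{session_id}' if session_id else username
    return f'http://{user}:{password}@{HOST}'

rotating = proxy_url('username', 'password')  # new IP per request
sticky = proxy_url('username', 'password', uuid.uuid4().hex[:8])  # pinned IP
```

For account management you would generate one sticky `session_id` per account and reuse it, which keeps each account on a consistent IP as recommended above.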
Error Handling & Retries
Always implement robust error handling when working with proxies:
- Timeouts: Set connection and read timeouts (30-60 seconds)
- Retries: Retry failed requests 3-5 times with exponential backoff
- Proxy Rotation: Switch to a different proxy on repeated failures
- Status Codes: Handle 407 (auth), 429 (rate limit), 403/503 (blocked)
- Logging: Log all errors with timestamps for debugging
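The retry and rotation advice above can be combined in one helper. This is a sketch: `do_request` stands in for whatever actually performs the request (for example, a wrapper around `requests.get(url, proxies=..., timeout=30)`), so the helper itself stays independent of any HTTP library:

```python
import time

def fetch_with_retries(do_request, proxy_pool, max_attempts=4, base_delay=1.0):
    """Call do_request(proxy) until it succeeds, rotating to the next proxy
    after each failure and backing off exponentially (1s, 2s, 4s, ...)."""
    last_error = None
    for attempt in range(max_attempts):
        # Switch to a different proxy on each retry
        proxy = proxy_pool[attempt % len(proxy_pool)]
        try:
            return do_request(proxy)
        except Exception as e:  # in real code, catch requests.RequestException
            last_error = e
            if attempt < max_attempts - 1:
                time.sleep(base_delay * 2 ** attempt)
    raise last_error
```

On a 429 or 503 the backoff gives the target time to cool down, while the rotation means the retry arrives from a fresh IP.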
Rate Limiting & Politeness
Even with proxies, it's important to be respectful:
- Add delays between requests (2-5 seconds minimum)
- Respect robots.txt and crawl-delay directives
- Don't hammer servers; spread requests over time
- Use caching to avoid redundant requests
- Identify your bot with a descriptive User-Agent (if appropriate)
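Checking robots.txt before crawling is straightforward with Python's standard-library `urllib.robotparser`. The robots.txt body and user-agent string below are made up for illustration; in practice you would point the parser at the target site with `set_url()` and `read()`:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt body; normally fetched from https://<site>/robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Crawl-delay: 5
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

def allowed(url, user_agent='my-scraper'):
    """Check a URL against the parsed robots.txt rules."""
    return rp.can_fetch(user_agent, url)

# Honor the site's requested delay between requests
delay = rp.crawl_delay('my-scraper')
```

Skipping disallowed paths and sleeping `delay` seconds between requests keeps your crawler on the polite side of the rules above.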
Ready to start? Generate proxies →