Agent Spice 🌶️

A web traffic generator designed to add entropy to your network metadata.

Overview

Agent Spice intelligently browses the internet, following realistic link patterns and generating network noise to obscure metadata analysis. It can:

  • Randomly visit websites.
  • Follow relevant links intelligently.
  • Perform periodic DNS queries to add entropy.
  • Use multiple user-agent profiles to simulate multiple users in a household.
  • Run within set hours to mimic realistic human browsing/sleep patterns.
  • Auto-update website lists from dynamic sources.

Why Use Agent Spice?

In an era of increasing surveillance, metadata analysis has become a powerful tool for tracking individuals. While most privacy-conscious users focus on encrypting their communications (VPNs, Tor, HTTPS), metadata—the details about who, when, and where someone communicates—remains highly revealing.

Agent Spice was designed to pollute metadata by generating realistic, yet unpredictable, internet traffic. By doing so, it obscures patterns that might otherwise expose a user's identity, behavior, or even personal habits.

How Agent Spice Helps Enhance Privacy

  1. Prevents Behavioral Fingerprinting: Many analytics platforms track online behavior based on the websites you visit, how frequently you visit them, and what times of day you are active. Agent Spice disrupts this tracking by browsing a wide range of sites, creating noise that dilutes behavioral profiles.

  2. Helps Protect Against Network Traffic Analysis: Even with encrypted traffic (VPN, Tor, HTTPS), network observers (ISPs, employers, governments) can still infer what you're doing online based on traffic patterns. Agent Spice introduces randomness, making it harder for observers to distinguish between real and artificial traffic.

  3. Obfuscates DNS Queries: Every time you visit a website, your device sends a DNS query to resolve the domain name into an IP address. DNS logs are often used to build activity profiles on individuals. Agent Spice injects decoy DNS queries to common resolvers (Google, Cloudflare, OpenDNS), making metadata-based tracking less reliable.

  4. Defends Against IP-Based Profiling: Many advertisers, security tools, and intelligence agencies analyze the browsing habits of devices connected to a single IP address (e.g., a home router). Even without cookies or JavaScript tracking, they can infer user habits based on when and where traffic originates. Agent Spice makes your home network appear more dynamic, making it difficult to distinguish individual users.

Alternative Uses of Agent Spice

  1. Simulating Network Activity (Red Teaming & Cybersecurity): Cybersecurity professionals can use Agent Spice to test IDS/IPS systems by simulating realistic user behavior. Red Team operations can use it to generate background noise during reconnaissance and exploitation phases.
  2. Manipulating Targeted Advertising: By directing traffic to competitor sites, political content, or irrelevant product pages, users can pollute their advertising profile. This disrupts ad networks that attempt to profile and manipulate users based on browsing habits.
  3. Misdirection for OSINT Investigations: Intelligence agencies, private investigators, and cybercriminals often rely on OSINT (Open-Source Intelligence) to track individuals via public browsing patterns. Agent Spice creates misleading trails, making it harder to build accurate profiles.
  4. Creating Plausible Deniability: If someone is wrongfully accused of visiting a site, the presence of Agent Spice logs can provide plausible deniability. Because Agent Spice follows real links dynamically, it is hard to prove whether a given visit was intentional or part of automated traffic.

Final Thoughts

While no tool alone can guarantee perfect anonymity, Agent Spice adds a valuable layer of obfuscation to your digital footprint. When combined with Tor, VPNs, encrypted DNS, and other privacy measures, it makes metadata tracking significantly more difficult.

Installation

1. Install Required Dependencies

Agent Spice requires Python 3 and a few additional packages:

sudo apt update && sudo apt install -y python3 python3-pip
pip install requests beautifulsoup4 pyyaml
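
The requests and beautifulsoup4 packages are used for fetching pages and extracting links to follow. As a rough illustration of that kind of step (the function below is a sketch, not the actual Agent Spice code):

import random
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

# Sketch of a single link-following step; not the actual Agent Spice implementation.
def pick_next_link(url, user_agent):
    """Fetch a page and return a randomly chosen absolute link from it, or None."""
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    links = [urljoin(url, a["href"]) for a in soup.find_all("a", href=True)]
    links = [link for link in links if link.startswith("http")]
    return random.choice(links) if links else None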

2. Choose Installation Directory

As a best practice, install Agent Spice in /opt/agent_spice/:

sudo mkdir -p /opt/agent_spice/
sudo chown $USER:$USER /opt/agent_spice/
cd /opt/agent_spice/

3. Download and Extract

Extract the Agent Spice package into /opt/agent_spice/:

unzip ~/Downloads/agent_spice_package.zip -d /opt/agent_spice/
cd /opt/agent_spice/

4. Set Up the Systemd Service

Move the provided systemd service file into the correct location:

sudo cp agent_spice.service /etc/systemd/system/
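
For reference, a typical unit file for this setup might look like the sketch below; the entry-point script name (agent_spice.py) is an assumption, so adjust ExecStart to match the script shipped in the package:

[Unit]
Description=Agent Spice web traffic generator
After=network-online.target
Wants=network-online.target

[Service]
WorkingDirectory=/opt/agent_spice/
ExecStart=/usr/bin/python3 /opt/agent_spice/agent_spice.py
Restart=on-failure

[Install]
WantedBy=multi-user.target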

Then, reload systemd and enable the service:

sudo systemctl daemon-reload
sudo systemctl enable agent_spice.service
sudo systemctl start agent_spice.service

To check the status:

sudo systemctl status agent_spice.service

Configuration (config.yaml)

All customization is handled via config.yaml. Edit it using:

nano /opt/agent_spice/config.yaml
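
Since pyyaml is one of the installed dependencies, the settings in this file can be loaded with a standard safe_load call. A quick, illustrative way to sanity-check your edits (not part of Agent Spice itself):

import yaml

# Illustrative helper: parse config.yaml and print the result to confirm the syntax is valid.
with open("/opt/agent_spice/config.yaml") as f:
    config = yaml.safe_load(f)

print(config)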

1. Multi-User-Agent Profiles

Agent Spice now supports multiple user-agent profiles, mimicking different users (e.g., household members).

Example:

user_agent_platform_1: "mac"
user_agent_browser_1: "firefox"

user_agent_platform_2: "ios"
user_agent_browser_2: "safari"

user_agent_platform_3: "windows"
user_agent_browser_3: "edge"

Agent Spice will randomly select one of these complete profiles before making requests.
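
A sketch of how that selection could work, assuming the numbered platform/browser keys above are paired up first (illustrative only; the real selection logic may differ):

import random

# Illustrative: collect user_agent_platform_N / user_agent_browser_N pairs and pick one.
def pick_profile(config):
    profiles = []
    n = 1
    while f"user_agent_platform_{n}" in config:
        profiles.append((config[f"user_agent_platform_{n}"],
                         config[f"user_agent_browser_{n}"]))
        n += 1
    return random.choice(profiles)  # e.g. ("mac", "firefox")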


2. Sleep Mode

To avoid browsing during certain hours (e.g., night), define a sleep window:

sleep_start_hour: 2  # 2 AM
sleep_end_hour: 7    # 7 AM

During these hours, Agent Spice will pause all activity.
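
The check behind this is simple hour arithmetic. A sketch that also handles a window crossing midnight (e.g. 22 to 6), in case you configure one (illustrative, not the exact implementation):

from datetime import datetime

# Illustrative sleep-window check; not the exact Agent Spice implementation.
def is_sleep_time(sleep_start_hour, sleep_end_hour):
    hour = datetime.now().hour
    if sleep_start_hour <= sleep_end_hour:                      # e.g. 2 -> 7
        return sleep_start_hour <= hour < sleep_end_hour
    return hour >= sleep_start_hour or hour < sleep_end_hour    # window crosses midnight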


3. Website Sources

You can combine static and dynamic site lists:

websites: true  # Use manually defined sites (set to true or list sites below)

fetch_dynamic_sites: true  # Fetch top sites dynamically
dynamic_sites_source: "https://tranco-list.eu/api/lists/daily_Cj7BZ/top-1000"

To manually define sites:

websites:
  - "https://www.wikipedia.org/"
  - "https://www.reddit.com/"
  - "https://www.nytimes.com/"

4. Interval Between Requests

interval_min: 10  # Minimum wait time (seconds)
interval_max: 60  # Maximum wait time (seconds)
randomize: true   # If false, uses interval_min
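
In practice this means the delay before each request is drawn from the range above when randomize is on, and fixed at interval_min otherwise. A sketch of that logic (illustrative):

import random
import time

# Illustrative inter-request delay, following the config semantics described above.
def wait_between_requests(config):
    if config.get("randomize", True):
        delay = random.uniform(config["interval_min"], config["interval_max"])
    else:
        delay = config["interval_min"]
    time.sleep(delay)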

5. DNS Query Noise

enable_dns_queries: true
dns_servers:
  - "8.8.8.8"
  - "1.1.1.1"
  - "9.9.9.9"
  - "208.67.222.222"

6. Automatic Site List Updates

site_update_interval_hours: 24  # Refresh website list every 24 hours
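
One straightforward way to implement this refresh is to compare the time of the last update against the configured interval (illustrative sketch):

import time

# Illustrative staleness check for the cached site list.
def site_list_is_stale(last_update_ts, update_interval_hours):
    return (time.time() - last_update_ts) > update_interval_hours * 3600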

Managing Agent Spice

  • Start:
    sudo systemctl start agent_spice.service
    
  • Stop:
    sudo systemctl stop agent_spice.service
    
  • Restart:
    sudo systemctl restart agent_spice.service
    
  • Check Logs:
    journalctl -u agent_spice.service -f
    

Uninstalling Agent Spice

To remove Agent Spice:

sudo systemctl stop agent_spice.service
sudo systemctl disable agent_spice.service
sudo rm /etc/systemd/system/agent_spice.service
sudo rm -rf /opt/agent_spice/
sudo systemctl daemon-reload

Future Enhancements

  • Expanding behavior to include app-like interactions (e.g., scrolling, clicking).
  • Implementing optional machine learning to detect real-world browsing trends. Maybe - scope creep sets in.

Agent Spice is now fully configurable, runs on systemd, and mimics real-world browsing behavior!
