Production-grade MCP server for universal reverse engineering automation.
Connect Claude Desktop, MCP-compatible IDEs, or custom tooling to a complete reverse engineering backend. One server, every RE tool, orchestrated through the Model Context Protocol.
- Features
- Quick Start
- IDE & Client Setup
- Configuration
- Tool Availability
- Architecture
- Security Model
- Testing
- Scripts & Automation
- Usage Examples
- Performance & Limitations
- Troubleshooting
- Contributing
- License
- Binary Parsing: PE/ELF/Mach-O via LIEF with hash computation and suspicious indicator detection
- Disassembly: Multi-backend support including Capstone (always available), radare2, and objdump for x86/x64/ARM/MIPS/RISC-V
- String Extraction: FLOSS integration, regex fallback, 17 classifier patterns (URLs, IPs, crypto, registry keys)
- Entropy Analysis: Shannon entropy with sliding window, per-section analysis, and packing detection
- Symbol Extraction: DWARF, PDB, LIEF universal; function prologue scanning for stripped binaries
- YARA Scanning: Inline rules, file/directory rules, and community rules support
- Capa Integration: ATT&CK mapping, MBC behaviors, capability enumeration
- Decompilation: Ghidra (headless), RetDec, Binary Ninja with caching
- GDB Adapter: Full GDB/MI protocol with breakpoints, stepping, registers, memory, backtrace, and heap inspection
- LLDB Adapter: Native SB API integration for macOS/Linux debugging
- Frida Adapter: Spawn/attach, script injection, function interception, memory scan/dump, and RPC exports
- Code Coverage: DynamoRIO drcov, Frida Stalker block tracing, and coverage analysis
- APK Parsing: Manifest extraction, permission analysis, component enumeration, and resource inspection
- DEX Analysis: Class/method listing, bytecode stats, and string extraction
- Decompilation: jadx/apktool integration, smali disassembly/assembly/patching
- Native Binary Analysis: ARM/AArch64 .so analysis with JNI detection
- Device Interaction: ADB bridge with 12 actions (logcat, install, shell, dumpsys, screenshot)
- Frida for Android: Root bypass, crypto hooking, SSL pinning bypass, API tracing, and memory dump
- Traffic Interception: tcpdump/mitmproxy integration with SSL key extraction
- Repack and Sign: APK rebuild with smali patches, zipalign + apksigner
- Security Scanners: MobSF, Quark-Engine, Semgrep, and manifest vulnerability detection
- Rizin/r2: Automated analysis with 13 actions and binary diffing
- GDB Enhanced: Heap analysis, ROP gadget finding, exploit helpers (pattern create/find, checksec)
- QEMU: User-mode emulation (4 actions) and full system emulation (5 actions)
- ROP Chain Builder: Multi-architecture gadget finding (x86/x64/ARM/ARM64) with semantic classification, automatic chain generation for execve/mprotect/syscalls, bad-char avoidance, and pwntools script generation
- Heap Exploitation: Malloc chunk analysis, bin classification (tcache/fastbin/smallbin/largebin), fake chunk generation, safe-linking encode/decode for glibc 2.32+, and technique templates (House of Force, Tcache Poisoning, Fastbin Dup, Unsafe Unlink)
- Libc Database: Symbol/offset extraction, libc identification from leaked addresses, ASLR defeat helpers (base calculation, GOT-to-libc, PLT-to-GOT), and one-gadget RCE finder
- Shellcode: Generation, encoding, bad-char analysis, extraction, and emulation testing
- Format String: Offset calculation, write payload generation, GOT overwrite, and address leaking
- Detection: Scan for anti-debug, anti-VM, anti-tamper, and packing indicators
- Bypass Generation: Frida/GDB/patch/LD_PRELOAD scripts for ptrace, IsDebuggerPresent, timing, and VM checks
- Triage: Multi-hash, IoC extraction, suspicious import scoring, and risk assessment
- Sandbox Queries: VirusTotal, Hybrid Analysis, and MalwareBazaar API integration
- YARA Generation: Auto-generate YARA rules from binary artifacts
- Config Extraction: C2 URLs, IPs, domains, encryption keys, and mutexes
- Extraction: binwalk scan/extract, entropy analysis, and filesystem identification
- Vulnerability Scanning: Hardcoded credentials, known CVEs, unsafe functions, and weak crypto
- Base Address Detection: String reference analysis for firmware base address recovery
- PCAP Analysis: tshark-based with 8 actions (summary, flows, DNS, HTTP, TLS, filter, export, IoC)
- Protocol Dissection: Binary structure inference, field boundary detection, and pattern analysis
- Protocol Fuzzing: Mutation-based, boundary testing, field-specific, and template fuzzing
- Packer Detection: UPX, Themida, VMProtect, ASPack, PECompact, MPRESS, and more
- UPX Unpacking: Static unpacking with automatic backup
- Dynamic Unpacking: Frida-based memory dump with OEP detection
- PE Rebuild: Fix section alignments, imports, and entry point after memory dump
- String Deobfuscation: XOR brute force, ROT variants, Base64, RC4, and stack string reconstruction
- Control Flow Flattening Detection: OLLVM-style CFF pattern identification
- Opaque Predicate Detection: Always-true/false branch identification
- angr Integration: Path exploration, constraint solving, CFG generation, and vulnerability scanning
- Triton DSE: Dynamic symbolic execution with concrete and symbolic state
- APK/DEX: Android analysis including manifest, permissions, native libs, and DEX parsing
- .NET IL: Assembly metadata, type/method listing, and IL disassembly
- Java Class: Class file parsing, javap integration, and bytecode disassembly
- WebAssembly: WASM section parsing, import/export extraction, and disassembly
- Hex Tools: Hexdump, pattern search (IDA-style wildcards), and binary diff
- Crypto: Hashing (MD5/SHA/TLSH/ssdeep), XOR analysis, and crypto constant scanning
- Patching: Binary patching with backup and NOP-sled support
- Network: PCAP analysis with protocol stats, DNS extraction, and C2 beacon detection
- Server Status: Version, tool count, cache stats, rate limit stats, and available tools
- Cache Management: View stats, clear cache, and invalidate specific entries
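The sliding-window Shannon entropy technique listed under Entropy Analysis can be sketched in a few lines. This is an illustrative implementation (function names are my own, not revula's API); sustained windows near 8.0 bits/byte are the packing/encryption signal the feature list refers to:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte, in the range 0.0..8.0."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    # 0.0 - sum(...) avoids returning -0.0 for single-symbol inputs
    return 0.0 - sum((c / total) * math.log2(c / total) for c in counts.values())

def sliding_entropy(data: bytes, window: int = 256, step: int = 128):
    """Per-window entropy; long runs near 8.0 suggest packed or encrypted regions."""
    return [
        (off, shannon_entropy(data[off:off + window]))
        for off in range(0, max(len(data) - window + 1, 1), step)
    ]

print(shannon_entropy(bytes(range(256))))  # 8.0 (uniform distribution)
print(shannon_entropy(b"A" * 256))         # 0.0 (single symbol)
```

Per-section analysis is the same computation applied to each section's raw bytes instead of a sliding window.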
- Python 3.11 or later
- Linux recommended (macOS and WSL2 supported)
- pip (or uv/pipx for isolated installs)
# Clone
git clone https://github.com/president-xd/revula.git
cd revula
# Option 1: Automated install (recommended)
bash scripts/install/install_all.sh
# Option 2: Manual install
pip install -e .
# Option 3: Install with all optional dependencies
pip install -e ".[full]"
# Verify installation
python scripts/test/validate_install.py

The automated installer handles Python version checks, virtual environment creation, dependency installation, external tool detection, and configuration file generation.
python -c "from revula.config import get_config, format_availability_report; print(format_availability_report(get_config()))"

This prints a table showing which external tools and Python modules are detected on your system.
Revula can be run in Docker for isolated environments with all dependencies pre-configured:
# Build the Docker image
docker build -t revula:latest .
# Quick test
docker run --rm revula:latest --version
docker run --rm revula:latest --list-tools
# Run in stdio mode (for local MCP clients)
docker run -i --rm -v $(pwd)/workspace:/workspace revula:latest
# Revula transport is stdio-only (no HTTP/SSE mode)
# Run it attached to your MCP client process
# (for Docker usage, run your MCP client inside the same container/environment)

What's included in the Docker image:
- All core Python dependencies (capstone, LIEF, pefile, yara)
- angr symbolic execution engine
- Frida dynamic instrumentation
- Ghidra headless analyzer
- GDB, radare2, rizin, binutils
- ADB and Android tools (apktool, jadx)
- FLARE tools (FLOSS, capa)
- Network analysis tools (tcpdump, tshark)
Testing the Docker build:
./scripts/docker/test.sh

For complete Docker documentation including configuration, volumes, security, and troubleshooting, see DOCKER.md.
Note on Docker vs Native:
- Docker provides complete isolated environment with all tools pre-installed
- Native installation offers better performance and direct system access
- Choose based on your security and portability requirements
Revula uses stdio transport only. The server reads JSON-RPC from stdin and writes to stdout. Every MCP client listed below launches revula as a local subprocess. There is no HTTP server, no SSE endpoint, and no remote connection.
What this means for you:
- Revula must be installed on the same machine where your IDE/client runs.
- If you use a remote server or Docker, you must run both the client and revula inside the same environment (or use SSH piping; see Custom / Other Clients).
- Every client below uses the same `revula` command. The only difference is where you put the config.
Make sure revula is installed and the command works:
# Should print the MCP protocol handshake (Ctrl+C to exit)
revula
# If you installed in a venv, activate it first:
source /path/to/venv/bin/activate
revula
# Or use the full path:
/path/to/venv/bin/revula

If revula is not in your PATH, use the full path in every config below.
Status: Fully supported. This is the primary client.
Config file locations:
| Platform | Path |
|---|---|
| Linux | ~/.config/Claude/claude_desktop_config.json |
| macOS | ~/Library/Application Support/Claude/claude_desktop_config.json |
| Windows | %APPDATA%\Claude\claude_desktop_config.json |
| WSL2 | /mnt/c/Users/<YOU>/AppData/Roaming/Claude/claude_desktop_config.json |
Option A: Automatic setup (recommended)
python scripts/setup/setup_claude_desktop.py

This auto-detects your OS, finds the config file, and merges the revula entry. It creates a backup first.
Option B: Manual setup
Add to your claude_desktop_config.json:
{
"mcpServers": {
"revula": {
"command": "revula",
"args": []
}
}
}

If revula is in a virtualenv:
{
"mcpServers": {
"revula": {
"command": "/home/you/venvs/revula/bin/revula",
"args": []
}
}
}

If using uvx (zero-install):
{
"mcpServers": {
"revula": {
"command": "uvx",
"args": ["revula"]
}
}
}

After editing: Quit and reopen Claude Desktop. Check the MCP tools icon to confirm 116 tools are available.
Status: Fully supported.
Option A: CLI command (recommended)
claude mcp add revula -- revula

Claude Code will start revula as a subprocess when needed.
Option B: Manual config
Edit ~/.claude.json (or ~/.claude/settings.json depending on version):
{
"mcpServers": {
"revula": {
"command": "revula",
"args": []
}
}
}

Status: Supported. Requires GitHub Copilot extension with MCP support (VS Code 1.99+).
Important: MCP support in VS Code is available through the GitHub Copilot Chat extension. Make sure you have:
- VS Code 1.99 or later
- GitHub Copilot extension installed and active
- MCP enabled in settings:
"chat.mcp.enabled": true
Option A: Workspace config (already included in this repo)
This repo ships with .vscode/mcp.json:
{
"servers": {
"revula": {
"command": "revula",
"args": [],
"env": {}
}
}
}

Just open this project in VS Code and Copilot will discover the MCP server automatically.
Option B: User-level config (global, all projects)
Open VS Code settings (Ctrl+,) → search "mcp" → edit settings.json:
{
"chat.mcp.enabled": true,
"mcp": {
"servers": {
"revula": {
"command": "revula",
"args": [],
"env": {}
}
}
}
}

Option C: Create .vscode/mcp.json in any project
Copy the file from this repo or create it manually:
mkdir -p .vscode
cat > .vscode/mcp.json << 'EOF'
{
"servers": {
"revula": {
"command": "revula",
"args": [],
"env": {}
}
}
}
EOF

After editing: Reload VS Code window (Ctrl+Shift+P → "Developer: Reload Window"). The MCP tools should appear in Copilot Chat.
Status: Supported. Cursor has built-in MCP support.
Config file: ~/.cursor/mcp.json (global) or .cursor/mcp.json (per-project).
This repo ships with .cursor/mcp.json for per-project use.
Option A: Per-project (already included)
The .cursor/mcp.json in this repo:
{
"mcpServers": {
"revula": {
"command": "revula",
"args": []
}
}
}

Option B: Global config
mkdir -p ~/.cursor
cat > ~/.cursor/mcp.json << 'EOF'
{
"mcpServers": {
"revula": {
"command": "revula",
"args": []
}
}
}
EOF

After editing: Restart Cursor. Check Settings → MCP to verify revula appears.
Status: Supported. Windsurf Cascade supports MCP servers.
Config file: ~/.codeium/windsurf/mcp_config.json
mkdir -p ~/.codeium/windsurf
cat > ~/.codeium/windsurf/mcp_config.json << 'EOF'
{
"mcpServers": {
"revula": {
"command": "revula",
"args": []
}
}
}
EOF

After editing: Restart Windsurf. The Cascade panel should show revula tools.
Status: Supported. Continue has MCP support in recent versions.
Config file: ~/.continue/config.json
Add to your existing config.json:
{
"mcpServers": [
{
"name": "revula",
"command": "revula",
"args": []
}
]
}If you use config.yaml:
mcpServers:
- name: revula
command: revula
    args: []

After editing: Restart your IDE. Continue should detect the MCP server.
Status: Supported. Zed has native MCP support via context servers.
Config file: ~/.config/zed/settings.json (Linux/macOS)
Add to your settings.json:
{
"context_servers": {
"revula": {
"command": "revula",
"args": []
}
}
}

After editing: Restart Zed. The context server should appear in the Assistant panel.
Any MCP client that supports stdio transport will work with revula. The protocol is standard JSON-RPC over stdin/stdout.
Direct invocation:
# Start the server (reads from stdin, writes to stdout, logs to stderr)
revula

Over SSH (remote machine):
# Run revula on a remote machine with stdio piped through SSH
ssh user@remote-host revula

In Docker:
FROM python:3.11-slim
RUN pip install revula
# The entrypoint speaks stdio MCP
ENTRYPOINT ["revula"]

docker build -t revula .
# Use docker as the command in your client config:
# "command": "docker", "args": ["run", "-i", "--rm", "revula"]

Python client (programmatic):
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
async def main():
server_params = StdioServerParameters(command="revula", args=[])
async with stdio_client(server_params) as (read, write):
async with ClientSession(read, write) as session:
await session.initialize()
tools = await session.list_tools()
print(f"Connected: {len(tools.tools)} tools available")
# Call a tool
result = await session.call_tool("re_entropy", {"binary_path": "/bin/ls"})
print(result)
asyncio.run(main())

Configure any client with one command:
# Interactive: pick a client from the menu
python scripts/setup/setup_ide.py
# Configure a specific client
python scripts/setup/setup_ide.py --client vscode
python scripts/setup/setup_ide.py --client cursor
python scripts/setup/setup_ide.py --client claude-desktop
python scripts/setup/setup_ide.py --client windsurf
python scripts/setup/setup_ide.py --client zed
# Configure all detected clients at once
python scripts/setup/setup_ide.py --all
# Print all configs without writing files (review first)
python scripts/setup/setup_ide.py --print-only
# Override the command (e.g., full path to venv)
python scripts/setup/setup_ide.py --client cursor --command "/home/you/venv/bin/revula"

The script auto-detects how to run revula (PATH, uvx, or python -m), creates backups before writing, and merges into existing configs.
Create ~/.revula/config.toml (or use the interactive generator):
python scripts/setup/setup_config_toml.py

Example configuration:
[tools.ghidra_headless]
path = "/opt/ghidra/support/analyzeHeadless"
[tools.radare2]
path = "/usr/bin/radare2"
[tools.jadx]
path = "/usr/local/bin/jadx"
[tools.retdec_decompiler]
path = "/usr/local/bin/retdec-decompiler"
[security]
max_memory_mb = 512
default_timeout = 60
max_timeout = 600
allowed_dirs = ["/home/user/samples", "/tmp/analysis"]

Environment variables override config file values:
export GHIDRA_HEADLESS=/opt/ghidra/support/analyzeHeadless # Ghidra headless binary
export RETDEC_PATH=/usr/local/bin/retdec-decompiler # RetDec decompiler binary
export REVULA_DEFAULT_TIMEOUT=120 # Subprocess timeout (seconds)
export REVULA_MAX_MEMORY_MB=1024            # Memory limit (MB)

revula degrades gracefully. Tools that depend on missing backends return clear error messages instead of crashing. Here is what each category needs:
| Category | Always Available | Needs External Tool | Needs Python Module |
|---|---|---|---|
| Static | PE/ELF parsing, entropy, strings | objdump, radare2, ghidra, retdec, floss, capa | capstone ✓, lief ✓, pefile ✓, yara ✓ |
| Dynamic | | gdb, lldb | frida |
| Android | APK manifest/DEX parsing (via zipfile) | jadx, apktool, adb, zipalign, apksigner, tcpdump | frida, quark-engine |
| Platform | | rizin, radare2, gdb, qemu-user, qemu-system-* | r2pipe, binaryninja |
| Exploit | ROP chain builder, heap analysis, libc database, format string helpers | | capstone ✓, pwntools, keystone-engine |
| Anti-Analysis | Pattern scanning (via lief + capstone) | | |
| Malware | File hashing, IoC extraction, risk scoring | | yara ✓, ssdeep, tlsh |
| Firmware | | binwalk, sasquatch | |
| Protocol | Binary protocol dissection, fuzzing | tshark | scapy |
| Unpacking | Packer signature detection | upx | frida |
| Deobfuscation | XOR/ROT/Base64 deobfuscation | | capstone ✓ |
| Symbolic | | | angr, triton |
| Binary Formats | | aapt, javap, monodis, wasm2wat | |
| Utilities | Hex dump, binary diff, patching | tshark | scapy, ssdeep, tlsh |

✓ = included in core dependencies (always installed).
# Frida (dynamic instrumentation)
pip install frida frida-tools
# angr (symbolic execution, large install ~2 GB)
pip install angr
# radare2 bindings
pip install r2pipe
# Fuzzy hashing
pip install ssdeep tlsh
# Network analysis
pip install scapy
# Everything at once
pip install -e ".[full]"

# Core analysis
sudo apt install gdb binutils radare2 binwalk upx-ucl
# Android RE
sudo apt install apktool jadx android-sdk adb zipalign apksigner
# Network
sudo apt install tshark
# Ghidra: download from https://ghidra-sre.org/
export GHIDRA_INSTALL=/opt/ghidra

src/revula/                     # 19,400+ LOC across 63 Python files
├── __init__.py             # Version (__version__ = "0.1.0")
├── config.py               # Tool detection, TOML config, env var loading
├── sandbox.py              # Secure subprocess execution, path validation
├── session.py              # Session lifecycle manager (debuggers, Frida)
├── server.py               # MCP server entrypoint (stdio transport)
├── cache.py                # LRU result cache with TTL
├── rate_limit.py           # Token-bucket rate limiter
└── tools/
    ├── __init__.py         # Tool registry + @register_tool decorator
    ├── static/             # 8 files: PE/ELF, disasm, strings, entropy, symbols, YARA, capa, decompile
    ├── dynamic/            # 4 files: GDB, LLDB, Frida, coverage
    ├── android/            # 9 files: APK, DEX, decompile, native, device, frida, traffic, repack, scanners
    ├── platform/           # 3 files: Rizin, GDB-enhanced, QEMU
    ├── exploit/            # 5 files: ROP builder, heap exploitation, libc database, shellcode, format strings
    ├── antianalysis/       # 1 file: anti-debug/VM detection and bypass generation
    ├── malware/            # 1 file: triage, sandbox queries, YARA gen, config extraction
    ├── firmware/           # 1 file: extraction, vuln scanning, base address detection
    ├── protocol/           # 1 file: PCAP analysis, protocol dissection, fuzzing
    ├── deobfuscation/      # 1 file: string deobfuscation, CFF, opaque predicates
    ├── unpacking/          # 1 file: packer detection, UPX, dynamic unpack, PE rebuild
    ├── symbolic/           # 1 file: angr + Triton
    ├── binary_formats/     # 1 file: .NET, Java, WASM
    ├── utils/              # 4 files: hex, crypto, patching, network
    └── admin/              # 1 file: server status, cache management
- Startup. `server.py` loads `config.py`, which probes the system for external tools (via `shutil.which`) and Python modules (via `importlib.util.find_spec`). Results are cached in a `ServerConfig` singleton.
- Tool Registration. Each tool file uses `@TOOL_REGISTRY.register()` to declare its name, description, JSON Schema, and async handler. Tools self-register on import.
- Request Dispatch. When a `tools/call` request arrives, the server looks up the handler in `TOOL_REGISTRY`, validates arguments against the JSON Schema, checks rate limits, checks the result cache, and dispatches to the handler.
- Subprocess Execution. All external tool invocations go through `sandbox.safe_subprocess()`, which enforces `shell=False`, sets `RLIMIT_AS` and `RLIMIT_CPU`, validates paths, and captures stdout/stderr.
- Result Caching. Deterministic operations (disassembly, parsing) are cached with a configurable TTL. Mutating operations (patching, Frida injection) bypass the cache automatically.
- Session Management. Long-lived debugger and Frida sessions are tracked by `SessionManager`, with automatic cleanup after 30 minutes of idle time.
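The register-on-import pattern described above can be sketched as follows. This is an illustrative minimal registry, not revula's actual implementation; the decorator signature and entry layout are assumptions:

```python
import asyncio

class ToolRegistry:
    """Minimal decorator-based tool registry (illustrative, not revula's API)."""

    def __init__(self):
        self._tools = {}

    def register(self, name: str, description: str, schema: dict):
        def decorator(handler):
            # Store metadata and handler; importing the module is enough to register
            self._tools[name] = {
                "description": description,
                "schema": schema,
                "handler": handler,
            }
            return handler
        return decorator

    async def dispatch(self, name: str, args: dict):
        entry = self._tools.get(name)
        if entry is None:
            return {"error": f"unknown tool: {name}"}
        # A real server would validate `args` against entry["schema"],
        # check rate limits, and consult the result cache here.
        return await entry["handler"](args)

TOOL_REGISTRY = ToolRegistry()

@TOOL_REGISTRY.register(
    name="re_entropy",
    description="Shannon entropy of a file",
    schema={"type": "object", "properties": {"binary_path": {"type": "string"}}},
)
async def entropy_tool(args: dict):
    # Dummy handler standing in for a real analysis tool
    return {"path": args["binary_path"], "entropy": 7.99}

print(asyncio.run(TOOL_REGISTRY.dispatch("re_entropy", {"binary_path": "/bin/ls"})))
```

Because registration happens at decoration time, importing a tool module is all it takes to make its tools dispatchable.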
| Component | Purpose | Key Detail |
|---|---|---|
| ResultCache | Avoid redundant subprocess calls | LRU, 256 entries, 10-minute TTL |
| RateLimiter | Prevent resource exhaustion | Token-bucket, 120 global / 30 per-tool RPM |
| ToolRegistry | Decorator-based tool dispatch | JSON Schema validation before handler call |
| SessionManager | Debugger/Frida persistence | Auto-cleanup after 30 min idle |
| sandbox.py | Secure execution layer | shell=False, RLIMIT enforcement, path validation |
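The token-bucket scheme behind `RateLimiter` (120 global / 30 per-tool requests per minute) works like this sketch; class and parameter names are illustrative, not the actual implementation:

```python
import time

class TokenBucket:
    """Token bucket: refills at `rate_per_minute`, allows bursts up to `capacity`."""

    def __init__(self, rate_per_minute: float, capacity: float):
        self.rate = rate_per_minute / 60.0   # tokens per second
        self.capacity = capacity
        self.tokens = capacity               # start full
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill proportionally to elapsed time, capped at capacity
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate_per_minute=120, capacity=5)
# A burst of 5 calls is allowed; the 6th is rejected until tokens refill
print([bucket.allow() for _ in range(6)])
```

A server would keep one global bucket plus one bucket per tool name, rejecting a call unless both allow it.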
revula operates on the principle that user-supplied arguments are untrusted. The following hardening measures are applied:
- No `shell=True`: Every subprocess call uses `shell=False` with explicit argument lists. This is enforced by a CI test (`test_no_shell_true`) that scans every source file.
- No `eval()`/`exec()`: No dynamic code evaluation of user input.
- No f-string injection: User-supplied values are never interpolated into `python3 -c` code strings. Values are passed via `sys.argv`, stdin, or environment variables. Enforced by `test_no_fstring_in_subprocess_python_code`.
- JavaScript escaping: All user-controlled values interpolated into Frida JavaScript strings pass through `_js_escape()`, which escapes backslashes, quotes, newlines, and other injection vectors.
- Resource limits: Every subprocess gets `RLIMIT_AS` (512 MB default) and `RLIMIT_CPU` (60 s default) via `resource.setrlimit()`.
- Timeout enforcement: `asyncio.wait_for()` wraps all subprocess calls.
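The `shell=False`, rlimit, and timeout rules above combine into a wrapper along these lines. This is a simplified POSIX-only sketch, not revula's actual `sandbox.safe_subprocess()`:

```python
import asyncio
import resource

def _limit_resources(mem_mb: int = 512, cpu_s: int = 60):
    """Runs in the child between fork and exec: cap address space and CPU time."""
    resource.setrlimit(resource.RLIMIT_AS, (mem_mb * 1024 * 1024,) * 2)
    resource.setrlimit(resource.RLIMIT_CPU, (cpu_s,) * 2)

async def safe_subprocess(argv: list[str], timeout: float = 60.0):
    """Run an explicit argv (no shell) with rlimits and a hard timeout."""
    proc = await asyncio.create_subprocess_exec(   # exec-style: shell=False semantics
        *argv,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
        preexec_fn=_limit_resources,               # POSIX only
    )
    try:
        out, err = await asyncio.wait_for(proc.communicate(), timeout)
    except asyncio.TimeoutError:
        proc.kill()                                # don't leave orphans behind
        await proc.wait()
        raise
    return proc.returncode, out, err

rc, out, err = asyncio.run(safe_subprocess(["echo", "hello"]))
print(rc, out.decode().strip())  # 0 hello
```

Because the command is an argument list rather than a string, attacker-controlled filenames cannot inject shell metacharacters.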
- Fail-closed: `validate_path()` rejects all paths when no `allowed_dirs` are configured (falls back to `get_config().security.allowed_dirs`). It does not silently pass.
- Traversal blocked: `..` components are rejected after `os.path.realpath()` resolution.
- Absolute paths required: Relative paths are rejected.
- Validated everywhere: All file-accepting tool handlers call `validate_path()` before any file I/O.
- Script size limit: Frida scripts are capped at 1 MB to prevent memory exhaustion.
- Memory dump limit: Memory dumps are capped at 100 MB.
- JS injection prevention: Class names, method names, module names, and other user-supplied values are escaped before interpolation into JavaScript templates.
- No `tempfile.mktemp()`: All temporary files use `tempfile.NamedTemporaryFile()` or `tempfile.mkdtemp()` to prevent TOCTOU race conditions.
- No hardcoded `/tmp` paths: All temporary paths use the `tempfile` module.
Release History
| Version | Changes | Urgency | Date |
|---|---|---|---|
| main@2026-04-17 | Latest activity on main branch | High | 4/17/2026 |
| 0.0.0 | No release found (using repo HEAD) | High | 4/10/2026 |
