Compare commits


91 Commits
v0.5 ... v0.6

Author SHA1 Message Date
Aldo Cortesi
b9f8645258 Update trove classifiers in setup.py 2011-08-07 11:18:26 +12:00
Aldo Cortesi
2346a6d553 We now have a mailing list and an IRC channel
Come join us in the #mitmproxy channel on the OFTC IRC network
(irc://irc.oftc.net:6667).

We also have a mailing list, hosted here:

    http://groups.google.com/group/mitmproxy
2011-08-07 09:52:13 +12:00
Aldo Cortesi
f8719f13df 0.6 changelog 2011-08-07 09:41:39 +12:00
Aldo Cortesi
8309ab0ec8 Prep for 0.6 release.
- Update contributors file.
- Bump version number
- Include version number in docs
2011-08-06 21:19:22 +12:00
Aldo Cortesi
2e72b310d9 Remove script examples for now.
I need to come up with some uncontrived examples.
2011-08-06 21:10:45 +12:00
Aldo Cortesi
f0122f1403 API docs. 2011-08-06 13:19:46 +12:00
Aldo Cortesi
f23818ceea Add a "done" event for scripts.
Called exactly once after all other events.
2011-08-05 14:08:03 +12:00
Aldo Cortesi
cd0e2f18e6 First draft of scripting docs. 2011-08-05 13:26:39 +12:00
Aldo Cortesi
89a58d7e30 Start on scripting documentation and examples. 2011-08-05 10:47:43 +12:00
Aldo Cortesi
98a7aaca18 Documentation. 2011-08-05 10:23:32 +12:00
Aldo Cortesi
ce48cb4deb Make scripted rewriting of saved traffic work in mitmdump. 2011-08-05 09:41:29 +12:00
Aldo Cortesi
be133e7a0b More documentation stubbing. 2011-08-05 08:07:40 +12:00
Aldo Cortesi
ffd7043ee7 Update examples/stickycookies.py 2011-08-04 12:57:01 +12:00
Aldo Cortesi
87623a8d75 Rip out autodecode
We simplify things as follows:

    - If we're in "pretty" view mode, we autodecode.
    - Otherwise, we display raw data, and the user can manually encode/decode
    with the z shortcut.
2011-08-04 10:54:42 +12:00
Aldo Cortesi
b51aac8a86 Code cleanliness - appease pychecker. 2011-08-04 10:34:34 +12:00
Aldo Cortesi
730c78ac53 Move script.Context to flow.ScriptContext 2011-08-04 10:14:44 +12:00
Aldo Cortesi
1662b8505b Clean pydoc profile for flow.Flow 2011-08-04 09:56:44 +12:00
Aldo Cortesi
8ef208a9e2 Clean pydoc profile for flow.Response, flow.Error 2011-08-04 09:44:48 +12:00
Aldo Cortesi
7a3b871b33 Request class now has a clean pydoc profile. 2011-08-04 09:26:26 +12:00
Aldo Cortesi
0760607a7d Further interface cleaning. 2011-08-03 23:02:33 +12:00
Aldo Cortesi
9042d3f3b9 Clean up interfaces by making some methods pseudo-private. 2011-08-03 22:48:57 +12:00
Aldo Cortesi
57c653be5f Move all HTTP objects to flow.py
That's Request, Response, ClientConnect, ClientDisconnect, Error, and Headers.
2011-08-03 22:41:38 +12:00
Aldo Cortesi
cbd8d09849 Miscellaneous doc updates.
- Make typography less hideous.
    - Stub out the docs I'm about to write.
2011-08-03 20:28:52 +12:00
Aldo Cortesi
9d0e3c8d61 Doc and help adjustments. 2011-08-03 19:15:01 +12:00
Aldo Cortesi
028d5bacc5 Make "C" clear eventlog when it has focus. 2011-08-03 17:41:13 +12:00
Aldo Cortesi
e337682d8e Enable "|" command to run a oneshot script on a single flow. 2011-08-03 17:35:18 +12:00
Aldo Cortesi
cfc6e8777e Add script set/unset shortcut for mitmproxy. 2011-08-03 17:14:11 +12:00
Aldo Cortesi
e3196dac4d Move commands around to make space for "s" script shortcut.
New commands are:

"w" - save all flows
"W" - save this flow
"S" - server replay
2011-08-03 16:52:41 +12:00
Aldo Cortesi
179cf75862 Add script hooks, enable new engine for mitmdump. 2011-08-03 16:36:20 +12:00
Aldo Cortesi
f7e4e89b12 Move the event notification mechanism into flow.py 2011-08-03 13:33:18 +12:00
Aldo Cortesi
12d2b1f926 Rip out old script interface, start replacing with new stubs.
Scripts are broken for now.
2011-08-03 13:20:36 +12:00
Aldo Cortesi
62088a6661 Start stubbing out a much more powerful script architecture. 2011-08-03 11:06:29 +12:00
Aldo Cortesi
a817db5bd6 Refresh current connection when toggling autodecode.
Also fix the unit tests I forgot to commit...
2011-08-02 20:47:53 +12:00
Aldo Cortesi
8cc0469ee7 Tweak encoding behaviour
- Don't fall back to identity encoding when an unknown encoding is specified.
- Don't constrain encodings. I want to try to modify traffic as little as
possible by default.
- When decoding, delete content-encoding header rather than set it to "identity"
- Refuse to decode/encode when there is an existing but unknown
content-encoding header.
2011-08-02 20:42:46 +12:00
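The decode rules this commit describes can be sketched as follows (an illustrative sketch, not the actual libmproxy code; the header-dict representation is an assumption based on the commit message):

```python
import zlib

def decode_content(headers, content):
    """Decode a response body per the rules above: refuse unknown
    encodings, tolerate corrupted data, and delete the
    content-encoding header on success rather than setting it to
    "identity". Returns (headers, body) or None on refusal/failure."""
    ce = headers.get("content-encoding", "identity")
    if ce == "identity":
        return headers, content
    if ce not in ("gzip", "deflate"):
        # Existing but unknown content-encoding: refuse to decode.
        return None
    try:
        if ce == "gzip":
            # wbits = 16 + MAX_WBITS tells zlib to expect a gzip wrapper.
            decoded = zlib.decompress(content, 16 + zlib.MAX_WBITS)
        else:
            decoded = zlib.decompress(content)
    except zlib.error:
        # Corrupted data: cope gracefully instead of crashing.
        return None
    new_headers = {k: v for k, v in headers.items()
                   if k != "content-encoding"}
    return new_headers, decoded
```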
Aldo Cortesi
bb6ec29b18 Fix encoding import crash. 2011-08-02 16:55:54 +12:00
Aldo Cortesi
1ff6a767d0 Unit test++ 2011-08-02 16:52:47 +12:00
Aldo Cortesi
357502fe03 General cleanup.
Cut out unused variables and code, generally shut up pychecker as much as is
reasonable.
2011-08-02 16:14:33 +12:00
Aldo Cortesi
17835b9b78 Fix a rare undefined variable crash in proxy.py. 2011-08-02 15:43:35 +12:00
Aldo Cortesi
a1456742a8 Make ConnectionList key bindings work even if there are no entries. 2011-08-02 15:35:54 +12:00
Aldo Cortesi
f3742f29da We no longer need to track clientconnections. 2011-08-02 14:56:09 +12:00
Aldo Cortesi
f3f8462ddc Make the mitmproxy eventlog display useful information. 2011-08-02 14:17:15 +12:00
Aldo Cortesi
73a7d893e3 Give a visual indication that the eventlog has focus. 2011-08-02 11:02:23 +12:00
Aldo Cortesi
759f5d71a6 Initial key bindings and event handlers for event log. 2011-08-02 10:48:29 +12:00
Aldo Cortesi
af92153974 Start stubbing out a UI for the eventlog in mitmproxy. 2011-08-02 09:17:54 +12:00
Aldo Cortesi
9b398c03ab Exit with error if mitmproxy can't load a file specified on cmdline. 2011-08-01 13:27:46 +12:00
Aldo Cortesi
675b3133b4 Improve performance of loading flows from a file hugely.
Fell into the "expensive __eq__ method" trap. Oh, Python, you little scamp.
2011-08-01 11:26:09 +12:00
Aldo Cortesi
43f1c72511 Refactor the way we calculate views of the flow list.
The naive approach we used before recalculated the view on every access, and
consequently had serious performance problems.
2011-08-01 11:17:01 +12:00
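The caching idea behind this refactor might look like the following sketch (hypothetical `FlowList` class, not the actual libmproxy implementation):

```python
class FlowList:
    """Cache the filtered view and invalidate it on mutation, instead
    of recomputing the view on every access as the naive approach did."""
    def __init__(self, filt=None):
        self._flows = []
        self._filt = filt
        self._view = None            # cached view; None means stale

    def add(self, flow):
        self._flows.append(flow)
        self._view = None            # any mutation invalidates the cache

    def set_filter(self, filt):
        self._filt = filt
        self._view = None

    @property
    def view(self):
        if self._view is None:       # recompute only when stale
            if self._filt is None:
                self._view = list(self._flows)
            else:
                self._view = [f for f in self._flows if self._filt(f)]
        return self._view
```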
Aldo Cortesi
ddb5748a76 Add decoding/encoding for requests. 2011-08-01 10:43:01 +12:00
Aldo Cortesi
c89c4361c3 Merge remote-tracking branch 'alts/encoding' 2011-07-28 11:19:07 +12:00
Stephen Altamirano
78049abac1 Changes replace logic to function in both Python 2.6.x and 2.7.x
Tests now only assume Python 2.6.x rather than requiring 2.7.x. This does not preclude the use of flags as a kwarg in replace.
2011-07-26 22:47:08 -07:00
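The compatibility point here is that `re.sub()`/`re.subn()` only accept a `flags` argument from Python 2.7 on; compiling the pattern first works on both versions. A minimal sketch (hypothetical helper name):

```python
import re

def replace_count(text, pattern, repl, flags=0):
    # re.sub()/re.subn() only grew a flags argument in Python 2.7, so
    # compile the pattern with the flags first; this also runs on 2.6.
    rex = re.compile(pattern, flags)
    return rex.subn(repl, text)
```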
Stephen Altamirano
acd511f676 Adds test method for Response encoding and decoding 2011-07-26 22:38:42 -07:00
Stephen Altamirano
c1eaa9f74c Adds encode and decode methods to Response objects 2011-07-26 22:03:41 -07:00
Aldo Cortesi
e6288e2d07 Fix crash when sticky cookies are read from file.
Cookielib expects strings, not unicode.
2011-07-24 16:08:27 +12:00
Aldo Cortesi
0f4ae61e7d Fix a crash in mitmdump event display. 2011-07-23 16:59:48 +12:00
Aldo Cortesi
6cd32bf96f Unbreak mitmproxy. Oops. 2011-07-23 13:39:17 +12:00
Aldo Cortesi
3648c7953a Extend eventlog information.
Also, squash an SSL-related bug revealed by the extended logging.
2011-07-23 13:37:06 +12:00
Aldo Cortesi
4043829cf2 Add an eventlog option to mitmdump
This shows client connections, disconnections and requests (before a complete
flow is assembled). We need to add an analogous display to mitmproxy.
2011-07-23 12:57:54 +12:00
Aldo Cortesi
689f5f0d1f Don't turn off output if -v flag is passed more than twice. 2011-07-23 11:50:30 +12:00
Aldo Cortesi
47e1695512 Also replace strings path for requests. 2011-07-22 20:52:13 +12:00
Aldo Cortesi
6ce8b49e05 Make script pipe globally available. 2011-07-22 19:09:32 +12:00
Aldo Cortesi
1b961fc4ad Add utility functions to search and replace strings in flows
This is a common task in pentesting scenarios. This commit adds the following
functions:

utils.Headers.replace
proxy.Request.replace
proxy.Response.replace
flow.Flow.replace
2011-07-22 17:48:42 +12:00
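The replace helpers added here could be sketched along these lines (a toy `Flow` stand-in with hypothetical fields, not the real libmproxy API):

```python
import re

class Flow:
    """Toy stand-in for the idea behind flow.Flow.replace: run a regex
    replacement over every textual part of the transaction and return
    the total number of substitutions made."""
    def __init__(self, request_body, response_body, headers):
        self.request_body = request_body
        self.response_body = response_body
        self.headers = headers          # list of [name, value] pairs

    def replace(self, pattern, repl):
        count = 0
        self.request_body, n = re.subn(pattern, repl, self.request_body)
        count += n
        self.response_body, n = re.subn(pattern, repl, self.response_body)
        count += n
        for pair in self.headers:
            pair[1], n = re.subn(pattern, repl, pair[1])
            count += n
        return count
```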
Stephen Altamirano
9c24401b18 Removes last_encoding attribute from Response. Prompts for encoding on identity responses 2011-07-21 22:09:48 -07:00
Stephen Altamirano
74d8b18408 Removes should_autodecode attribute from Response. Adds commandline option 'd' to toggle autodecode, adds togglable option 'd' to do the same 2011-07-21 20:22:13 -07:00
Aldo Cortesi
5936a48e59 Drop cert expiry time to avoid a bug in some OpenSSL versions. 2011-07-22 11:11:45 +12:00
Stephen Altamirano
aa7f8ac90b Switches hotkeys. En/decode is now bound to 'z', kill connection now 'X' 2011-07-20 00:14:24 -07:00
Stephen Altamirano
ebfa9b2a5d Fixes issue #10 regarding broken json printing 2011-07-19 10:51:49 -07:00
Aldo Cortesi
5d6f855387 Merge pull request #9 from alts/toggle_encodings
Toggle encodings
2011-07-18 22:26:14 -07:00
Stephen Altamirano
25b0631190 Switches hotkey to unused 'g', adds help message 2011-07-18 22:04:23 -07:00
Stephen Altamirano
1c5434d72c Adds ability to toggle between encodings in the response view 2011-07-18 21:52:40 -07:00
Stephen Altamirano
ecd4645988 Adds encode counterparts to decode functions 2011-07-17 20:16:47 -07:00
Aldo Cortesi
b0849387b7 Add explicit notice when data has been auto-decoded. 2011-07-18 14:18:47 +12:00
Aldo Cortesi
669ce8ee7c Correctly detect urlencoded data
This broke when we introduced case preservation for headers.
2011-07-17 15:31:58 +12:00
Aldo Cortesi
6df4be93e3 Fix error in anticomp commandline specification. 2011-07-17 14:36:38 +12:00
Aldo Cortesi
f756d3bec1 Make help display for options nicer. 2011-07-17 11:14:18 +12:00
Aldo Cortesi
1559ded009 Expose the anticompression flag as an option through the "o" key in mitmproxy. 2011-07-17 10:34:43 +12:00
Aldo Cortesi
ce41046786 Refine encoding support
- Push decoding down into the LRU cache
- Cope gracefully with corrupted data and incorrect encoding headers
2011-07-17 10:25:25 +12:00
alts
7ec03e45a5 Removes item from todo list 2011-07-16 02:48:04 -07:00
alts
6dc0f105cc Adds support for content encoding, namely gzip and deflate 2011-07-16 02:47:06 -07:00
Aldo Cortesi
94ae720a22 Add a pretty-printing mode for urlencoded form data. 2011-07-15 16:46:54 +12:00
Aldo Cortesi
76b4c6ba82 Introduce an anti-compression command-line argument.
This is on by default, which means we avoid compressed content unless the -z
flag is specified.
2011-07-15 15:24:56 +12:00
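One plausible way to implement anti-compression is to constrain the request's accept-encoding header so the server replies uncompressed (a sketch; the actual mitmproxy behaviour may differ in detail):

```python
def anticomp(request_headers):
    """Force an uncompressed response by telling the server we only
    accept the identity encoding. Returns a new header dict."""
    headers = dict(request_headers)
    headers["accept-encoding"] = "identity"
    return headers
```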
Aldo Cortesi
1a963b91bb Don't sort headers when displaying them in mitmproxy. 2011-07-14 17:20:32 +12:00
Aldo Cortesi
7e21ac0eb8 Refine path completion somewhat.
Make it match the behaviour of vim and mutt more closely
2011-07-14 16:54:04 +12:00
Aldo Cortesi
1c9e7b982a Rewrite Headers object to preserve order and case. 2011-07-14 16:01:54 +12:00
Aldo Cortesi
b6e1bf63c3 Merge branch 'master' of github.com:cortesi/mitmproxy 2011-07-11 16:14:32 +12:00
Aldo Cortesi
76f83d7763 Merge pull request #4 from fwolfst/master
cert-wait-time argument type specification fix.
2011-07-10 21:14:18 -07:00
Felix Wolfsteller
1a5b157c8f Specify certificate creation waiting time argument to fix type error in proxy.py. 2011-07-09 08:07:22 +00:00
Aldo Cortesi
65fbb7bd0d Bail out if no command was specified after | shortcut. 2011-07-08 21:41:00 +12:00
Aldo Cortesi
8e176c2086 Cast some data read from dump files to str, to prevent unicode promotion.
This fixes a bug that caused a traceback when de-serialized requests were
replayed. Also adds unit tests for the problem.
2011-07-01 14:20:42 +12:00
Aldo Cortesi
2a90ea69fd Show view mode in statusbar.
Also make "m" keyboard shortcut available globally.
2011-06-30 14:49:11 +12:00
Aldo Cortesi
37c8d3425d Fix edit prompt display, return code editing. 2011-06-30 14:30:48 +12:00
Aldo Cortesi
18d4c3a9e9 JSON pretty-printing.
Also rename the display modes ("pretty" instead of "indent"), and expand the
built-in documentation.
2011-06-30 13:27:27 +12:00
58 changed files with 2733 additions and 1639 deletions


@@ -1,3 +1,36 @@
7 August 2011: mitmproxy 0.6:
* New scripting API that allows much more flexible and fine-grained
rewriting of traffic. See the docs for more info.
* Support for gzip and deflate content encodings. A new "z"
keybinding in mitmproxy lets you quickly encode and decode content, and
the "pretty" view mode decodes automatically.
* An event log, viewable with the "v" shortcut in mitmproxy, and the
"-e" command-line flag in mitmdump.
* Huge performance improvements, both in the mitmproxy interface and when
loading large numbers of flows from file.
* A new "replace" convenience method on all flow objects that performs a
universal regex-based string replacement.
* Header management has been rewritten to maintain both case and order.
* Improved stability for SSL interception.
* Default expiry time on generated SSL certs has been dropped to avoid an
OpenSSL overflow bug that caused certificates to expire in the distant
past on some systems.
* A "pretty" view mode for JSON and form submission data.
* Expanded documentation and examples.
* Countless other small improvements and bugfixes.
27 June 2011: mitmproxy 0.5:
* An -n option to start the tools without binding to a proxy port.


@@ -1,5 +1,8 @@
203 Aldo Cortesi
278 Aldo Cortesi
18 Henrik Nordstrom
13 Thomas Roth
10 Stephen Altamirano
2 alts
1 Yuangxuan Wang
1 Henrik Nordström
1 Felix Wolfsteller


@@ -1,12 +1,13 @@
__mitmproxy__ is an SSL-capable, intercepting HTTP proxy. It provides a console
interface that allows traffic flows to be inspected and edited on the fly.
__mitmproxy__ is an SSL-capable man-in-the-middle proxy for HTTP. It provides a
console interface that allows traffic flows to be inspected and edited on the
fly.
__mitmdump__ is the command-line version of mitmproxy, with the same
functionality but without the frills. Think tcpdump for HTTP.
Both tools are fully documented in the commandline __--help__ flag, and, in
the case of __mitmproxy__, a built-in help page accessible through the __?__
Both tools are fully documented in the commandline _--help_ flag, and, in
the case of __mitmproxy__, a built-in help page accessible through the _?_
keyboard shortcut.
@@ -33,11 +34,22 @@ Source is hosted on github:
[github.com/cortesi/mitmproxy](http://github.com/cortesi/mitmproxy)
Community
---------
Come join us in the #mitmproxy channel on the OFTC IRC network
(irc://irc.oftc.net:6667).
We also have a mailing list, hosted here:
http://groups.google.com/group/mitmproxy
Requirements
------------
* [Python](http://www.python.org) 2.6.x or 2.7.x.
* [openssl](http://www.openssl.org/). Installed by default on most systems.
* [openssl](http://www.openssl.org/) - installed by default on most systems.
* [urwid](http://excess.org/urwid/) version 0.9.8 or newer.
* The test suite uses the [pry](http://github.com/cortesi/pry) unit testing
library.


@@ -8,14 +8,14 @@ a {
#hd.doc {
-x-system-font:none;
font-family: Helvetica,Arial,Tahoma,Verdana,Sans-Serif;
color: #555555;
color: #444444;
margin: 0;
margin-bottom: 1em;
}
#hd.doc h1 {
letter-spacing: 3px;
font-size: 2.5em;
line-height: 100%;
margin: 0.3em 0;
font-weight: normal;
}
@@ -26,26 +26,46 @@ a {
#bd {
-x-system-font:none;
font-family: Helvetica,Arial,Tahoma,Verdana,Sans-Serif;
font-size: 1.6em;
color: #555555;
font: 16px/21px "HelveticaNeue","Helvetica Neue",Helvetica,Arial,sans-serif;
line-height: 1.5;
color: #444444;
}
#bd h1, #bd h2, #bd h3 {
font-family: "Georgia","Times New Roman",Helvetica,Arial,sans-serif;
font-weight: normal;
color: #181818;
}
#bd h1 {
font-size: 1.4em;
border-bottom: 5px solid #ff7033;
font-size: 1.9em;
border-bottom: 2px solid #ff7033;
margin-top: 5px;
margin-bottom: 5px;
color: #000000;
}
#bd h2 {
font-size: 1.1em;
font-size: 1.4em;
border-bottom: 1px solid #cccccc;
margin-top: 5px;
margin-bottom: 5px;
color: #000000;
}
#bd h3 {
margin-bottom: 0px;
}
#bd p {
margin: 1em 0;
margin-top: 0.5em;
}
/* Keyboard shortcuts */
#bd em {
font-weight: bold;
color: #00A700;
font-style: normal;
}
#ft.doc {
@@ -63,18 +83,22 @@ a {
}
pre {
font-size: 0.9em;
line-height: 1.4;
padding: 10px;
background-color: #dddddd;
margin: 1em 0;
}
.terminal {
color: #ffffff;
color: #c0c0c0;
font-size: 1em;
background: #000000;
}
.docindex, .docindex ul {
.docindex {
font-size: 1.3em;
line-height: 1.3;
margin-top: 0.1em;
margin-bottom: 0;
margin-left: 0px;
@@ -87,6 +111,8 @@ pre {
.docindex ul {
margin-left: 2em;
margin-top: 0.1em;
margin-bottom: 0.1em;
}
li a {
@@ -96,3 +122,16 @@ li a {
.highlight {
font-size: 14px;
}
.example_legend{
line-height: 1;
font-size: 12px;
}
.example pre {
margin: 0;
}
.kvtable th {
text-align: left;
white-space: nowrap;
}


@@ -14,7 +14,7 @@
</div>
<!--(end)-->
$!nav if this.title!="docs" else ""!$
<h1><a href="@!urlTo("/index.html")!@">mitmproxy docs</a></h1>
<h1><a href="@!urlTo("/index.html")!@">mitmproxy 0.6 docs</a></h1>
</div>
<div id="bd">
<div id="yui-main">


@@ -10,7 +10,7 @@
</ul>
</div>
<br>
<p>an SSL-capable intercepting proxy</p>
<p>an SSL-capable man-in-the-middle proxy</p>
</div>
<div id="bd">
<div id="yui-main">
@@ -29,7 +29,7 @@
</div>
<!--(end)-->
$!nav if this.title!="docs" else ""!$
$!title if this.title!="docs" else "<h1>mitmproxy docs</h1>"!$
$!title if this.title!="docs" else "<h1>mitmproxy 0.6 docs</h1>"!$
$!body!$
</div>
</div>


@@ -1,4 +1,7 @@
- command-line: _--anticache_
- mitmproxy shortcut: _o_, then _a_
When the __anticache__ option is passed to mitmproxy, it removes headers
(__if-none-match__ and __if-modified-since__) that might elicit a
304-not-modified response from the server. This is useful when you want to make


@@ -1,11 +1,14 @@
- command-line: _-c path_
- mitmproxy shortcut: _c_
Client-side replay does what it says on the tin: you provide a previously saved
HTTP conversation, and mitmproxy replays the client requests one by one. Note
that mitmproxy serializes the requests, waiting for a response from the server
before starting the next request. This might differ from the recorded
conversation, where requests may have been made concurrently.
You may want to use client-side replay in conjunction with the __anticache__
option. This will modify requests to remove headers (e.g. if-modified-since)
that might cause a server to reply with a 304-not-modified.
You may want to use client-side replay in conjunction with the
[anticache](@!urlTo("anticache.html")!@) option.


@@ -1,14 +1,18 @@
<ul class="docindex">
<li><a href="@!urlTo("intro.html")!@">Introduction</a></li>
<li><a href="@!urlTo("interception.html")!@">Interception</a></li>
<li><a href="@!urlTo("clientreplay.html")!@">Client-side replay</a></li>
<li><a href="@!urlTo("serverreplay.html")!@">Server-side replay</a></li>
<li><a href="@!urlTo("sticky.html")!@">Sticky cookies and auth</a></li>
<li><a href="@!urlTo("anticache.html")!@">Anticache</a></li>
<li><a href="@!urlTo("filters.html")!@">Filter expressions</a></li>
<li><a href="@!urlTo("scripts.html")!@">Scripting API</a></li>
<li><a href="@!urlTo("ssl.html")!@">SSL</a></li>
<li><a href="@!urlTo("mitmproxy.html")!@">mitmproxy</a></li>
<li><a href="@!urlTo("mitmdump.html")!@">mitmdump</a></li>
<li>Concepts</li>
<ul>
<li><a href="@!urlTo("clientreplay.html")!@">Client-side replay</a></li>
<li><a href="@!urlTo("serverreplay.html")!@">Server-side replay</a></li>
<li><a href="@!urlTo("sticky.html")!@">Sticky cookies and auth</a></li>
<li><a href="@!urlTo("anticache.html")!@">Anticache</a></li>
<li><a href="@!urlTo("filters.html")!@">Filter expressions</a></li>
</ul>
<li><a href="@!urlTo("scripts.html")!@">Scripts</a></li>
<li><a href="@!urlTo("ssl.html")!@">SSL interception</a></li>
<ul>
<li><a href="@!urlTo("certinstall/firefox.html")!@">Firefox</a></li>
<li><a href="@!urlTo("certinstall/osx.html")!@">OSX</a></li>


@@ -13,7 +13,7 @@ if ns.options.website:
else:
ns.title = countershape.template.Template(None, "<h1>@!this.title!@</h1>")
this.layout = countershape.Layout("_layout.html")
ns.docTitle = "mitmproxy"
this.markup = markup.Markdown()
ns.docMaintainer = "Aldo Cortesi"
ns.docMaintainerEmail = "aldo@corte.si"
@@ -33,7 +33,8 @@ ns.index_contents = file(mpath("README.mkd")).read()
top = os.path.abspath(os.getcwd())
def example(s):
d = file(mpath(s)).read()
return countershape.template.Syntax("py")(d)
extemp = """<div class="example">%s<div class="example_legend">(%s)</div></div>"""
return extemp%(countershape.template.Syntax("py")(d), s)
ns.example = example
@@ -67,14 +68,15 @@ ns.filt_help = filt_help
pages = [
Page("index.html", "docs"),
Page("intro.html", "Introduction"),
Page("interception.html", "Interception"),
Page("mitmproxy.html", "mitmproxy"),
Page("mitmdump.html", "mitmdump"),
Page("clientreplay.html", "Client-side replay"),
Page("serverreplay.html", "Server-side replay"),
Page("sticky.html", "Sticky cookies and auth"),
Page("anticache.html", "Anticache"),
Page("filters.html", "Filter expressions"),
Page("scripts.html", "External scripts"),
Page("ssl.html", "SSL"),
Page("scripts.html", "Scripts"),
Page("ssl.html", "SSL interception"),
Directory("certinstall"),
Page("library.html", "libmproxy: mitmproxy as a library"),
Directory("tutorials"),


@@ -1,38 +0,0 @@
__mitmproxy__'s interception functionality lets you pause an HTTP request or
response, inspect and modify it, and then accept it to send it on to the server
or client. Because this is an interactive function, it's only present in
__mitmproxy__, not in __mitmdump__.
### 1: Set an interception pattern
<img src="@!urlTo('intercept-filt.png')!@"/>
We press __i__ to set an interception pattern. In this case, the __~q__ filter
pattern tells __mitmproxy__ to intercept all requests. For complete filter
syntax, see the [Filter expressions](@!urlTo("filters.html")!@) section of this
document, or the built-in help function in __mitmproxy__.
### 2: Intercepted connections are indicated with a red exclamation mark:
<img src="@!urlTo('intercept-mid.png')!@"/>
### 3: You can now view and modify the request:
<img src="@!urlTo('intercept-options.png')!@"/>
In this case, we viewed the request by selecting it, pressed __e__ for "edit"
and __m__ for "method" to change the HTTP request method.
### 4: Accept the intercept to continue
<img src="@!urlTo('intercept-result.png')!@"/>
Finally, we press __a__ to accept the modified request, which is then sent on
to the server. In this case, we changed the request from an HTTP GET to an
OPTIONS, and Google's server has responded with a 405 "Method not allowed".


@@ -8,5 +8,5 @@ this lets you log in to a site using your browser, and then make subsequent
requests using a tool like __curl__, which will then seem to be part of the
authenticated session.
$!example("examples/stickycookies.py")!$
$!example("examples/stickycookies")!$

doc-src/mitmdump.html (new file, 54 lines)

@@ -0,0 +1,54 @@
__mitmdump__ is the command-line companion to mitmproxy. It provides
tcpdump-like functionality to let you view, record, and programmatically
transform HTTP traffic. See the _--help_ flag output for complete
documentation.
## Example: saving traffic
<pre class="terminal">
> mitmdump -w outfile
</pre>
Start up mitmdump in proxy mode, and write all traffic to __outfile__.
## Example: client replay
<pre class="terminal">
> mitmdump -nc outfile
</pre>
Start mitmdump without binding to the proxy port (_-n_), then replay all
requests from outfile (_-c filename_). Flags combine in the obvious way, so
you can replay requests from one file, and write the resulting flows to
another:
<pre class="terminal">
> mitmdump -nc srcfile -w dstfile
</pre>
See the [Client-side Replay](@!urlTo("clientreplay.html")!@) section for more information.
## Example: running a script
<pre class="terminal">
> mitmdump -s examples/add_header.py
</pre>
This runs the __add_header.py__ example script, which simply adds a new header
to all responses.
## Example: scripted data transformation
<pre class="terminal">
> mitmdump -ns examples/add_header.py -r srcfile -w dstfile
</pre>
This command loads flows from __srcfile__, transforms them according to the
specified script, then writes them to __dstfile__.

doc-src/mitmproxy.html (new file, 58 lines)

@@ -0,0 +1,58 @@
__mitmproxy__ is a console tool that allows interactive examination and
modification of HTTP traffic. The _?_ shortcut key shows complete documentation
on __mitmproxy__'s functionality.
## The interface: connection list
<img src="@!urlTo("screenshots/mitmproxy.png")!@"/>
The connection list shows an index of captured flows in chronological order.
So, in this case, we can see that we visited __gmail.com__, which then
returned a 301 redirect to mail.google.com.
The statusbar at the bottom tells us that there are 11 flows in the view, that
we are using the "pretty" view mode (more on that below), and that the proxy is
bound to port 8080 of all interfaces.
Also visible is the __Event log__, which can be toggled on and off with the _v_
keyboard shortcut. This displays events like client connection information,
errors, and script output.
## Example: Interception
__mitmproxy__'s interception functionality lets you pause an HTTP request or
response, inspect and modify it, and then accept it to send it on to the server
or client.
### 1: Set an interception pattern
<img src="@!urlTo('intercept-filt.png')!@"/>
We press _i_ to set an interception pattern. In this case, the __~q__ filter
pattern tells __mitmproxy__ to intercept all requests. For complete filter
syntax, see the [Filter expressions](@!urlTo("filters.html")!@) section of this
document, or the built-in help function in __mitmproxy__.
### 2: Intercepted connections are indicated with a red exclamation mark:
<img src="@!urlTo('intercept-mid.png')!@"/>
### 3: You can now view and modify the request:
<img src="@!urlTo('intercept-options.png')!@"/>
In this case, we viewed the request by selecting it, pressed _e_ for "edit"
and _m_ for "method" to change the HTTP request method.
### 4: Accept the intercept to continue
<img src="@!urlTo('intercept-result.png')!@"/>
Finally, we press _a_ to accept the modified request, which is then sent on to
the server. In this case, we changed the request from an HTTP GET to an
OPTIONS, and Google's server has responded with a 405 "Method not allowed".

Binary file not shown (screenshot added, 171 KiB).


@@ -1,15 +1,121 @@
Both __mitmproxy__ and __mitmdump__ allow you to modify requests and responses
with external scripts. This is often done through the __--reqscript__ and
__--respscript__ options
__mitmproxy__ has a powerful scripting API that allows you to modify flows
on-the-fly or rewrite previously saved flows locally.
The mitmproxy scripting API is event driven - a script is simply a Python
module that exposes a set of event methods. Here's a complete mitmproxy script
that adds a new header to every HTTP response before it is returned to the
client:
$!example("examples/add_header.py")!$
The first argument to each event method is an instance of ScriptContext that
lets the script interact with the global mitmproxy state. The __response__
event also gets an instance of Flow, which we can use to manipulate the
response itself.
The script interface is simple - scripts simply read,
modify and return a single __libmproxy.flow.Flow__ object, using the methods
defined in the __libmproxy.script__ module. Scripts must be executable.
## Events
$!example("examples/simple_script")!$
### start(ScriptContext)
Called once on startup, before any other events.
### clientconnect(ScriptContext, ClientConnect)
Called when a client initiates a connection to the proxy. Note that
a connection can correspond to multiple HTTP requests.
### request(ScriptContext, Flow)
Called when a client request has been received. The __Flow__ object is
guaranteed to have a non-None __request__ attribute.
### response(ScriptContext, Flow)
Called when a server response has been received. The __Flow__ object is
guaranteed to have non-None __request__ and __response__ attributes.
### error(ScriptContext, Flow)
Called when a flow error has occurred, e.g. invalid server responses, or
interrupted connections. This is distinct from a valid server HTTP error
response, which is simply a response with an HTTP error code. The __Flow__
object is guaranteed to have non-None __request__ and __error__ attributes.
### clientdisconnect(ScriptContext, ClientDisconnect)
Called when a client disconnects from the proxy.
### done(ScriptContext)
Called once on script shutdown, after any other events.
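A script stub with a no-op handler for each event above might look like this (a sketch; the context and flow arguments are whatever mitmproxy passes in):

```python
def start(ctx):
    """Called once on startup, before any other events."""
    pass

def clientconnect(ctx, client_connect):
    """Called on client connection; one connection may carry many requests."""
    pass

def request(ctx, flow):
    """flow.request is guaranteed to be non-None here."""
    pass

def response(ctx, flow):
    """flow.request and flow.response are guaranteed to be non-None here."""
    pass

def error(ctx, flow):
    """flow.request and flow.error are guaranteed to be non-None here."""
    pass

def clientdisconnect(ctx, client_disconnect):
    """Called when a client disconnects from the proxy."""
    pass

def done(ctx):
    """Called once on shutdown, after all other events."""
    pass
```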
## API
The main classes you will deal with in writing mitmproxy scripts are:
<table class="kvtable">
<tr>
<th>libmproxy.flow.ScriptContext</th>
<td>A handle for interacting with mitmproxy's global state.</td>
</tr>
<tr>
<th>libmproxy.flow.Flow</th>
<td>A collection of objects representing a single HTTP transaction.</td>
</tr>
<tr>
<th>libmproxy.flow.Request</th>
<td>An HTTP request.</td>
</tr>
<tr>
<th>libmproxy.flow.Response</th>
<td>An HTTP response.</td>
</tr>
<tr>
<th>libmproxy.flow.Error</th>
<td>A communications error.</td>
</tr>
<tr>
<th>libmproxy.flow.ClientConnection</th>
<td>Describes a client connection.</td>
</tr>
<tr>
<th>libmproxy.flow.ClientDisconnection</th>
<td>Describes a client disconnection.</td>
</tr>
<tr>
<th>libmproxy.flow.Headers</th>
<td>HTTP headers for a request or response.</td>
</tr>
</table>
The canonical API documentation is the code. You can view the API documentation
using pydoc (which is installed with Python by default), like this:
<pre class="terminal">
> pydoc libmproxy.flow.Request
</pre>
## Running scripts on saved flows
Sometimes, we want to run a script on __Flow__ objects that are already
complete. This happens when you start a script, and then load a saved set of
flows from a file (see the "scripted data transformation" example on the
[mitmdump](@!urlTo("mitmdump.html")!@) page). It also happens when you run a
one-shot script on a single flow through the _|_ (pipe) shortcut in mitmproxy.
In this case, there are no client connections, and the events are run in the
following order: __start__, __request__, __response__, __error__, __done__. If
the flow doesn't have a __response__ or __error__ associated with it, the
matching event will be skipped.


@@ -1,4 +1,7 @@
- command-line: _-S path_
- mitmproxy shortcut: _S_
Server-side replay lets us replay server responses from a saved HTTP
conversation.
@@ -8,10 +11,9 @@ Matching requests with responses
By default, __mitmproxy__ excludes request headers when matching incoming
requests with responses from the replay file. This works in most circumstances,
and makes it possible to replay server responses in situations where request
headers would naturally vary, e.g. using a different user agent. The
__--rheader__ option to both __mitmproxy__ and __mitmdump__ allows you to
override this behaviour by specifying individual headers that should be
included in matching.
headers would naturally vary, e.g. using a different user agent. The _--rheader
headername_ command-line option allows you to override this behaviour by
specifying individual headers that should be included in matching.
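The matching behaviour described above can be sketched as a hash over the non-header parts of a request, plus any headers explicitly named via _--rheader_ (hypothetical request representation, not the real libmproxy code):

```python
import hashlib

def replay_key(req, rheaders=()):
    """Key used to match a live request against a recorded response:
    headers are ignored by default, except those listed in rheaders."""
    parts = [req["method"], req["host"], str(req["port"]), req["path"],
             req.get("content", "")]
    for name in rheaders:
        parts.append(name + "=" + req["headers"].get(name, ""))
    return hashlib.sha1("|".join(parts).encode()).hexdigest()
```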
Response refreshing
@@ -27,8 +29,7 @@ recording. So, if they were in the past at the time of recording, they will be
in the past at the time of replay, and vice versa. Cookie expiry times are
updated in a similar way.
You can turn off response refreshing using the __norefresh__ option, available
both on the command-line and using the "options" keyboard shortcut within
__mitmproxy__.
You can turn off response refreshing using the _--norefresh_ argument, or using
the _o_ options shortcut within __mitmproxy__.
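The refresh idea, shifting recorded date headers by the time elapsed since recording so relative freshness is preserved, can be sketched like this (illustrative helper; cookie expiry would be adjusted similarly):

```python
from datetime import timedelta
from email.utils import parsedate_to_datetime, format_datetime

def refresh_date_header(value, delta_seconds):
    """Shift an HTTP date header value forward by delta_seconds,
    keeping the RFC 1123 GMT format."""
    when = parsedate_to_datetime(value)
    return format_datetime(when + timedelta(seconds=delta_seconds),
                           usegmt=True)
```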


@@ -1,5 +1,11 @@
When the __stickycookie__ option is set, __mitmproxy__ will add the cookie most
## Sticky cookies
- command-line: _-t_ (sticky cookies on all requests)
- command-line: _-T filt_ (sticky cookies on requests matching filt)
- mitmproxy shortcut: _t_
When the sticky cookie option is set, __mitmproxy__ will add the cookie most
recently set by the server to any cookie-less request. Consider a service that
sets a cookie to track the session after authentication. Using sticky cookies,
you can fire up mitmproxy, and authenticate to a service as you usually would
@@ -17,8 +23,12 @@ with the secured resources.
## Sticky auth
The __stickyauth__ option is analogous to the __stickycookie__ option, in that
HTTP __Authorization__ headers are simply replayed to the server once they have
been seen. This is enough to allow you to access a server resource using HTTP
Basic authentication through the proxy. Note that __mitmproxy__ doesn't (yet)
support replay of HTTP Digest authentication.
- command-line: _-u_ (sticky auth on all requests)
- command-line: _-U filt_ (sticky auth on requests matching filt)
- mitmproxy shortcut: _u_
The sticky auth option is analogous to the sticky cookie option, in that HTTP
__Authorization__ headers are simply replayed to the server once they have been
seen. This is enough to allow you to access a server resource using HTTP Basic
authentication through the proxy. Note that __mitmproxy__ doesn't (yet) support
replay of HTTP Digest authentication.
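The replay behaviour described above amounts to remembering the last __Authorization__ header per host. A minimal sketch, with hypothetical names, not mitmproxy's actual implementation:

```python
stickyauth = {}  # (host, port) -> last Authorization header value seen

def process_request(host, port, headers):
    # Remember the most recent Authorization header for a host, and replay
    # it on later requests to the same host that lack one.
    key = (host, port)
    if "authorization" in headers:
        stickyauth[key] = headers["authorization"]
    elif key in stickyauth:
        headers["authorization"] = stickyauth[key]
    return headers

h1 = process_request("example.com", 80, {"authorization": "Basic dXNlcjpwYXNz"})
h2 = process_request("example.com", 80, {})
assert h2["authorization"] == "Basic dXNlcjpwYXNz"
```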

5
examples/README Normal file

@@ -0,0 +1,5 @@
add_header.py Simple script that just adds a header to every request.
stub.py Script stub with a method definition for every event.
stickycookies An example of writing a custom proxy with libmproxy

2
examples/add_header.py Normal file

@@ -0,0 +1,2 @@
def response(context, flow):
    flow.response.headers["newheader"] = ["foo"]


@@ -1,6 +0,0 @@
#!/usr/bin/env python
from libmproxy import script
f = script.load_flow()
f.request.headers["newheader"] = ["foo"]
script.return_flow(f)

18
examples/stickycookies.py → examples/stickycookies Normal file → Executable file

@@ -1,9 +1,6 @@
#!/usr/bin/env python
from libmproxy import controller, proxy
proxy.config = proxy.Config(
"~/.mitmproxy/cert.pem"
)
class StickyMaster(controller.Master):
def __init__(self, server):
controller.Master.__init__(self, server)
@@ -17,19 +14,22 @@ class StickyMaster(controller.Master):
def handle_request(self, msg):
hid = (msg.host, msg.port)
if msg.headers.has_key("cookie"):
if msg.headers["cookie"]:
self.stickyhosts[hid] = msg.headers["cookie"]
elif hid in self.stickyhosts:
msg.headers["cookie"] = self.stickyhosts[hid]
msg.ack()
msg._ack()
def handle_response(self, msg):
hid = (msg.request.host, msg.request.port)
if msg.headers.has_key("set-cookie"):
if msg.headers["set-cookie"]:
self.stickyhosts[hid] = msg.headers["set-cookie"]
msg.ack()
msg._ack()
server = proxy.ProxyServer(8080)
ssl_config = proxy.SSLConfig(
"~/.mitmproxy/cert.pem"
)
server = proxy.ProxyServer(ssl_config, 8080)
m = StickyMaster(server)
m.run()

48
examples/stub.py Normal file

@@ -0,0 +1,48 @@
"""
This is a script stub, with definitions for all events.
"""
def start(ctx):
"""
Called once on script startup, before any other events.
"""
ctx.log("start")
def clientconnect(ctx, client_connect):
"""
Called when a client initiates a connection to the proxy. Note that a
connection can correspond to multiple HTTP requests
"""
ctx.log("clientconnect")
def request(ctx, flow):
"""
Called when a client request has been received.
"""
ctx.log("request")
def response(ctx, flow):
"""
Called when a server response has been received.
"""
ctx.log("response")
def error(ctx, flow):
"""
Called when a flow error has occured, e.g. invalid server responses, or
interrupted connections. This is distinct from a valid server HTTP error
response, which is simply a response with an HTTP error code.
"""
ctx.log("error")
def clientdisconnect(ctx, client_disconnect):
"""
Called when a client disconnects from the proxy.
"""
ctx.log("clientdisconnect")
def done(ctx):
"""
Called once on script shutdown, after any other events.
"""
ctx.log("done")
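The event interface above can be exercised outside mitmproxy with a minimal stand-in context. `FakeContext` and `fire` are purely illustrative; they only mimic how the proxy invokes hooks like start/request/response/done when they are defined.

```python
class FakeContext:
    # Minimal stand-in for the ctx object passed to script events.
    def __init__(self):
        self.messages = []
    def log(self, msg):
        self.messages.append(msg)

def fire(ns, name, ctx, *args):
    # Dispatch an event to a script namespace if a handler exists;
    # events without a handler are silently skipped.
    f = ns.get(name)
    if f:
        f(ctx, *args)

ctx = FakeContext()
ns = {"start": lambda ctx: ctx.log("start"),
      "done": lambda ctx: ctx.log("done")}
fire(ns, "start", ctx)
fire(ns, "request", ctx, object())   # no handler defined: skipped
fire(ns, "done", ctx)
assert ctx.messages == ["start", "done"]
```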


@@ -16,15 +16,16 @@ def get_common_options(options):
return dict(
anticache = options.anticache,
anticomp = options.anticomp,
client_replay = options.client_replay,
eventlog = options.eventlog,
kill = options.kill,
no_server = options.no_server,
refresh_server_playback = not options.norefresh,
rheaders = options.rheaders,
rfile = options.rfile,
request_script = options.request_script,
response_script = options.response_script,
server_replay = options.server_replay,
script = options.script,
stickycookie = stickycookie,
stickyauth = stickyauth,
wfile = options.wfile,
@@ -38,11 +39,21 @@ def common_options(parser):
action="store", type = "str", dest="addr", default='',
help = "Address to bind proxy to (defaults to all interfaces)"
)
parser.add_option(
"--anticache",
action="store_true", dest="anticache", default=False,
help="Strip out request headers that might cause the server to return 304-not-modified."
)
parser.add_option(
"--confdir",
action="store", type = "str", dest="confdir", default='~/.mitmproxy',
help = "Configuration directory. (~/.mitmproxy)"
)
parser.add_option(
"-e",
action="store_true", dest="eventlog",
help="Show event log."
)
parser.add_option(
"-n",
action="store_true", dest="no_server",
@@ -64,19 +75,9 @@ def common_options(parser):
help="Read flows from file."
)
parser.add_option(
"--anticache",
action="store_true", dest="anticache", default=False,
help="Strip out request headers that might cause the server to return 304-not-modified."
)
parser.add_option(
"--reqscript",
action="store", dest="request_script", default=None,
help="Script to run when a request is recieved."
)
parser.add_option(
"--respscript",
action="store", dest="response_script", default=None,
help="Script to run when a response is recieved."
"-s",
action="store", dest="script", default=None,
help="Run a script."
)
parser.add_option(
"-t",
@@ -108,6 +109,12 @@ def common_options(parser):
action="store", dest="wfile", default=None,
help="Write flows to file."
)
parser.add_option(
"-z",
action="store_true", dest="anticomp", default=False,
help="Try to convince servers to send us un-compressed data."
)
group = optparse.OptionGroup(parser, "Client Replay")
group.add_option(
"-c",
@@ -117,14 +124,14 @@ def common_options(parser):
parser.add_option_group(group)
parser.add_option(
"--cert-wait-time",
"--cert-wait-time", type="float",
action="store", dest="cert_wait_time", default=0,
help="Wait for specified number of seconds after a new cert is generated. This can smooth over small discrepancies between the client and server times."
)
group = optparse.OptionGroup(parser, "Server Replay")
group.add_option(
"-s",
"-S",
action="store", dest="server_replay", default=None, metavar="PATH",
help="Replay server responses from a saved file."
)

File diff suppressed because it is too large


@@ -16,7 +16,7 @@
import sys
import Queue, threading
exit = False
should_exit = False
#begin nocover
@@ -25,18 +25,18 @@ class Msg:
self.q = Queue.Queue()
self.acked = False
def ack(self, data=False):
def _ack(self, data=False):
self.acked = True
if data is None:
self.q.put(data)
else:
self.q.put(data or self)
def send(self, masterq):
def _send(self, masterq):
self.acked = False
try:
masterq.put(self, timeout=3)
while not exit:
while not should_exit:
try:
g = self.q.get(timeout=0.5)
except Queue.Empty:
@@ -84,7 +84,7 @@ class Master:
if self.server:
slave = Slave(self.masterq, self.server)
slave.start()
while not exit:
while not should_exit:
self.tick(self.masterq)
self.shutdown()
@@ -94,11 +94,11 @@ class Master:
if m:
m(msg)
else:
msg.ack()
msg._ack()
def shutdown(self):
global exit
if not exit:
exit = True
global should_exit
if not should_exit:
should_exit = True
if self.server:
self.server.shutdown()


@@ -1,5 +1,5 @@
import sys, os, traceback
import flow, filt, utils
import sys, os
import flow, filt, utils, script
class DumpError(Exception): pass
@@ -7,16 +7,17 @@ class DumpError(Exception): pass
class Options(object):
__slots__ = [
"anticache",
"anticomp",
"client_replay",
"eventlog",
"keepserving",
"kill",
"no_server",
"refresh_server_playback",
"request_script",
"response_script",
"rfile",
"rheaders",
"server_replay",
"script",
"stickycookie",
"stickyauth",
"verbosity",
@@ -42,7 +43,7 @@ def str_request(req):
c = req.client_conn.address[0]
else:
c = "[replay]"
r = "%s %s %s"%(c, req.method, req.url())
r = "%s %s %s"%(c, req.method, req.get_url())
if req.stickycookie:
r = "[stickycookie] " + r
return r
@@ -54,6 +55,8 @@ class DumpMaster(flow.FlowMaster):
self.outfile = outfile
self.o = options
self.anticache = options.anticache
self.anticomp = options.anticomp
self.eventlog = options.eventlog
self.refresh_server_playback = options.refresh_server_playback
if filtstr:
@@ -61,11 +64,6 @@ class DumpMaster(flow.FlowMaster):
else:
self.filt = None
if self.o.response_script:
self.set_response_script(self.o.response_script)
if self.o.request_script:
self.set_request_script(self.o.request_script)
if options.stickycookie:
self.set_stickycookie(options.stickycookie)
@@ -87,6 +85,17 @@ class DumpMaster(flow.FlowMaster):
not options.keepserving
)
if options.client_replay:
self.start_client_playback(
self._readflow(options.client_replay),
not options.keepserving
)
if options.script:
err = self.load_script(options.script)
if err:
raise DumpError(err)
if options.rfile:
path = os.path.expanduser(options.rfile)
try:
@@ -96,12 +105,6 @@ class DumpMaster(flow.FlowMaster):
raise DumpError(v.strerror)
self.load_flows(freader)
if options.client_replay:
self.start_client_playback(
self._readflow(options.client_replay),
not options.keepserving
)
def _readflow(self, path):
path = os.path.expanduser(path)
@@ -112,24 +115,15 @@ class DumpMaster(flow.FlowMaster):
raise DumpError(v.strerror)
return flows
def _runscript(self, f, script):
try:
ret = f.run_script(script)
if self.o.verbosity > 0:
print >> self.outfile, ret
except flow.RunException, e:
if e.errout:
eout = "Script output:\n" + self.indent(4, e.errout) + "\n"
else:
eout = ""
raise DumpError(
"%s: %s\n%s"%(script, e.args[0], eout)
)
def add_event(self, e, level="info"):
if self.eventlog:
print >> self.outfile, e
def handle_request(self, r):
f = flow.FlowMaster.handle_request(self, r)
self.add_event("Request: %s"%str_request(r))
if f:
r.ack()
r._ack()
return f
def indent(self, n, t):
@@ -138,7 +132,7 @@ class DumpMaster(flow.FlowMaster):
def _process_flow(self, f):
if self.filt and not f.match(self.filt):
return
return
if f.response:
sz = utils.pretty_size(len(f.response.content))
@@ -168,7 +162,7 @@ class DumpMaster(flow.FlowMaster):
print >> self.outfile
print >> self.outfile, result
print >> self.outfile, "\n"
elif self.o.verbosity == 3:
elif self.o.verbosity >= 3:
print >> self.outfile, str_request(f.request)
print >> self.outfile, self.indent(4, f.request.headers)
if utils.isBin(f.request.content):
@@ -178,6 +172,7 @@ class DumpMaster(flow.FlowMaster):
print >> self.outfile
print >> self.outfile, result
print >> self.outfile, "\n"
self.state.delete_flow(f)
if self.o.wfile:
self.fwriter.add(f)
@@ -185,24 +180,25 @@ class DumpMaster(flow.FlowMaster):
def handle_response(self, msg):
f = flow.FlowMaster.handle_response(self, msg)
if f:
msg.ack()
msg._ack()
self._process_flow(f)
return f
def handle_error(self, msg):
f = flow.FlowMaster.handle_error(self, msg)
if f:
msg.ack()
msg._ack()
self._process_flow(f)
return f
# begin nocover
def run(self):
if self.o.rfile and not self.o.keepserving:
if self.script:
self.load_script(None)
return
try:
return flow.FlowMaster.run(self)
except BaseException, v:
except BaseException:
self.shutdown()
raise

73
libmproxy/encoding.py Normal file

@@ -0,0 +1,73 @@
"""
Utility functions for decoding response bodies.
"""
import cStringIO
import gzip, zlib
__ALL__ = ["ENCODINGS"]
ENCODINGS = set(["identity", "gzip", "deflate"])
def decode(e, content):
encoding_map = {
"identity": identity,
"gzip": decode_gzip,
"deflate": decode_deflate,
}
if e not in encoding_map:
return None
return encoding_map[e](content)
def encode(e, content):
encoding_map = {
"identity": identity,
"gzip": encode_gzip,
"deflate": encode_deflate,
}
if e not in encoding_map:
return None
return encoding_map[e](content)
def identity(content):
"""
Returns content unchanged. Identity is the default value of
Accept-Encoding headers.
"""
return content
def decode_gzip(content):
gfile = gzip.GzipFile(fileobj=cStringIO.StringIO(content))
try:
return gfile.read()
except IOError:
return None
def encode_gzip(content):
s = cStringIO.StringIO()
gf = gzip.GzipFile(fileobj=s, mode='wb')
gf.write(content)
gf.close()
return s.getvalue()
def decode_deflate(content):
"""
Returns decompressed data for DEFLATE. Some servers may respond with
compressed data without a zlib header or checksum. An undocumented
feature of zlib permits the lenient decompression of data missing both
values.
http://bugs.python.org/issue5784
"""
try:
try:
return zlib.decompress(content)
except zlib.error:
return zlib.decompress(content, -15)
except zlib.error:
return None
def encode_deflate(content):
"""
Returns compressed content, always including zlib header and checksum.
"""
return zlib.compress(content)
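The lenient DEFLATE fallback can be demonstrated in isolation (the function is restated here so the snippet runs standalone, using Python 3 bytes):

```python
import zlib

def decode_deflate(content):
    # As in the module above: try a standard zlib stream first, then fall
    # back to a raw stream (wbits=-15) for servers that omit the zlib
    # header and checksum.
    try:
        return zlib.decompress(content)
    except zlib.error:
        try:
            return zlib.decompress(content, -15)
        except zlib.error:
            return None

data = b"hello world"
# A stream with a zlib header decodes normally:
assert decode_deflate(zlib.compress(data)) == data
# A raw DEFLATE stream (no header, no checksum) decodes via the fallback:
raw = zlib.compressobj(9, zlib.DEFLATED, -15)
assert decode_deflate(raw.compress(data) + raw.flush()) == data
```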


@@ -60,14 +60,14 @@ class FReq(_Action):
code = "q"
help = "Match request"
def __call__(self, conn):
return not conn.is_response()
return not conn._is_response()
class FResp(_Action):
code = "s"
help = "Match response"
def __call__(self, conn):
return conn.is_response()
return conn._is_response()
class _Rex(_Action):
@@ -79,7 +79,7 @@ class _Rex(_Action):
raise ValueError, "Cannot compile expression."
def _check_content_type(expr, o):
val = o.headers.get("content-type")
val = o.headers["content-type"]
if val and re.search(expr, val[0]):
return True
return False
@@ -91,7 +91,7 @@ class FContentType(_Rex):
def __call__(self, o):
if _check_content_type(self.expr, o):
return True
elif o.is_response() and _check_content_type(self.expr, o.request):
elif o._is_response() and _check_content_type(self.expr, o.request):
return True
else:
return False
@@ -101,7 +101,7 @@ class FRequestContentType(_Rex):
code = "tq"
help = "Request Content-Type header"
def __call__(self, o):
if o.is_response():
if o._is_response():
return _check_content_type(self.expr, o.request)
else:
return _check_content_type(self.expr, o)
@@ -111,7 +111,7 @@ class FResponseContentType(_Rex):
code = "ts"
help = "Request Content-Type header"
def __call__(self, o):
if o.is_response():
if o._is_response():
return _check_content_type(self.expr, o)
else:
return False
@@ -122,7 +122,7 @@ class FHead(_Rex):
help = "Header"
def __call__(self, o):
val = o.headers.match_re(self.expr)
if not val and o.is_response():
if not val and o._is_response():
val = o.request.headers.match_re(self.expr)
return val
@@ -131,7 +131,7 @@ class FHeadRequest(_Rex):
code = "hq"
help = "Request header"
def __call__(self, o):
if o.is_response():
if o._is_response():
h = o.request.headers
else:
h = o.headers
@@ -142,7 +142,7 @@ class FHeadResponse(_Rex):
code = "hs"
help = "Response header"
def __call__(self, o):
if not o.is_response():
if not o._is_response():
return False
return o.headers.match_re(self.expr)
@@ -153,7 +153,7 @@ class FBod(_Rex):
def __call__(self, o):
if o.content and re.search(self.expr, o.content):
return True
elif o.is_response() and o.request.content and re.search(self.expr, o.request.content):
elif o._is_response() and o.request.content and re.search(self.expr, o.request.content):
return True
return False
@@ -162,9 +162,9 @@ class FBodRequest(_Rex):
code = "bq"
help = "Request body"
def __call__(self, o):
if o.is_response() and o.request.content and re.search(self.expr, o.request.content):
if o._is_response() and o.request.content and re.search(self.expr, o.request.content):
return True
elif not o.is_response() and o.content and re.search(self.expr, o.content):
elif not o._is_response() and o.content and re.search(self.expr, o.content):
return True
return False
@@ -173,7 +173,7 @@ class FBodResponse(_Rex):
code = "bs"
help = "Response body"
def __call__(self, o):
if not o.is_response():
if not o._is_response():
return False
elif o.content and re.search(self.expr, o.content):
return True
@@ -191,11 +191,11 @@ class FUrl(_Rex):
return klass(*toks)
def __call__(self, o):
if o.is_response():
if o._is_response():
c = o.request
else:
c = o
return re.search(self.expr, c.url())
return re.search(self.expr, c.get_url())
class _Int(_Action):
@@ -207,7 +207,7 @@ class FCode(_Int):
code = "c"
help = "HTTP response code"
def __call__(self, o):
if o.is_response():
if o._is_response():
return o.code == self.num
return False
@@ -315,6 +315,6 @@ def parse(s):
return bnf.parseString(s, parseAll=True)[0]
except pp.ParseException:
return None
except ValueError, e:
except ValueError:
return None

File diff suppressed because it is too large


@@ -1,15 +1,14 @@
"""
A simple proxy server implementation, which always reads all of a server
response into memory, performs some transformation, and then writes it back
to the client.
to the client.
Development started from Neil Schemenauer's munchy.py
"""
import sys, os, string, socket, urlparse, re, select, copy, base64, time, Cookie
from email.utils import parsedate_tz, formatdate, mktime_tz
import shutil, tempfile
import sys, os, string, socket, select, time
import shutil, tempfile, threading
import optparse, SocketServer, ssl
import utils, controller
import utils, flow
NAME = "mitmproxy"
@@ -53,14 +52,14 @@ def read_chunked(fp):
if line == '\r\n' or line == '\n':
break
return content
def read_http_body(rfile, connection, headers, all):
if headers.has_key('transfer-encoding'):
if 'transfer-encoding' in headers:
if not ",".join(headers["transfer-encoding"]) == "chunked":
raise IOError('Invalid transfer-encoding')
content = read_chunked(rfile)
elif headers.has_key("content-length"):
elif "content-length" in headers:
content = rfile.read(int(headers["content-length"][0]))
elif all:
content = rfile.read()
@@ -70,28 +69,6 @@ def read_http_body(rfile, connection, headers, all):
return content
def parse_url(url):
"""
Returns a (scheme, host, port, path) tuple, or None on error.
"""
scheme, netloc, path, params, query, fragment = urlparse.urlparse(url)
if not scheme:
return None
if ':' in netloc:
host, port = string.split(netloc, ':')
port = int(port)
else:
host = netloc
if scheme == "https":
port = 443
else:
port = 80
path = urlparse.urlunparse(('', '', path, params, query, fragment))
if not path.startswith("/"):
path = "/" + path
return scheme, host, port, path
def parse_request_line(request):
"""
Parse a proxy request line. Return (method, scheme, host, port, path, minor).
@@ -113,7 +90,7 @@ def parse_request_line(request):
if url.startswith("/") or url == "*":
scheme, port, host, path = None, None, None, url
else:
parts = parse_url(url)
parts = utils.parse_url(url)
if not parts:
raise ProxyError(400, "Invalid url: %s"%url)
scheme, host, port, path = parts
@@ -125,339 +102,7 @@ def parse_request_line(request):
if major != 1:
raise ProxyError(400, "Unsupported protocol")
return method, scheme, host, port, path, minor
class Request(controller.Msg):
FMT = '%s %s HTTP/1.1\r\n%s\r\n%s'
FMT_PROXY = '%s %s://%s:%s%s HTTP/1.1\r\n%s\r\n%s'
def __init__(self, client_conn, host, port, scheme, method, path, headers, content, timestamp=None):
self.client_conn = client_conn
self.host, self.port, self.scheme = host, port, scheme
self.method, self.path, self.headers, self.content = method, path, headers, content
self.timestamp = timestamp or utils.timestamp()
self.close = False
controller.Msg.__init__(self)
# Have this request's cookies been modified by sticky cookies or auth?
self.stickycookie = False
self.stickyauth = False
def anticache(self):
"""
Modifies this request to remove headers that might produce a cached
response. That is, we remove ETags and If-Modified-Since headers.
"""
delheaders = [
"if-modified-since",
"if-none-match",
]
for i in delheaders:
if i in self.headers:
del self.headers[i]
def set_replay(self):
self.client_conn = None
def is_replay(self):
if self.client_conn:
return False
else:
return True
def load_state(self, state):
if state["client_conn"]:
if self.client_conn:
self.client_conn.load_state(state["client_conn"])
else:
self.client_conn = ClientConnect.from_state(state["client_conn"])
else:
self.client_conn = None
self.host = state["host"]
self.port = state["port"]
self.scheme = state["scheme"]
self.method = state["method"]
self.path = state["path"]
self.headers = utils.Headers.from_state(state["headers"])
self.content = base64.decodestring(state["content"])
self.timestamp = state["timestamp"]
def get_state(self):
return dict(
client_conn = self.client_conn.get_state() if self.client_conn else None,
host = self.host,
port = self.port,
scheme = self.scheme,
method = self.method,
path = self.path,
headers = self.headers.get_state(),
content = base64.encodestring(self.content),
timestamp = self.timestamp,
)
@classmethod
def from_state(klass, state):
return klass(
ClientConnect.from_state(state["client_conn"]),
state["host"],
state["port"],
state["scheme"],
state["method"],
state["path"],
utils.Headers.from_state(state["headers"]),
base64.decodestring(state["content"]),
state["timestamp"]
)
def __hash__(self):
return id(self)
def __eq__(self, other):
return self.get_state() == other.get_state()
def copy(self):
c = copy.copy(self)
c.headers = self.headers.copy()
return c
def hostport(self):
if (self.port, self.scheme) in [(80, "http"), (443, "https")]:
host = self.host
else:
host = "%s:%s"%(self.host, self.port)
return host
def url(self):
return "%s://%s%s"%(self.scheme, self.hostport(), self.path)
def set_url(self, url):
parts = parse_url(url)
if not parts:
return False
self.scheme, self.host, self.port, self.path = parts
return True
def is_response(self):
return False
def assemble(self, _proxy = False):
"""
Assembles the request for transmission to the server. We make some
modifications to make sure interception works properly.
"""
headers = self.headers.copy()
utils.try_del(headers, 'accept-encoding')
utils.try_del(headers, 'proxy-connection')
utils.try_del(headers, 'keep-alive')
utils.try_del(headers, 'connection')
utils.try_del(headers, 'content-length')
utils.try_del(headers, 'transfer-encoding')
if not headers.has_key('host'):
headers["host"] = [self.hostport()]
content = self.content
if content is not None:
headers["content-length"] = [str(len(content))]
else:
content = ""
if self.close:
headers["connection"] = ["close"]
if not _proxy:
return self.FMT % (self.method, self.path, str(headers), content)
else:
return self.FMT_PROXY % (self.method, self.scheme, self.host, self.port, self.path, str(headers), content)
class Response(controller.Msg):
FMT = '%s\r\n%s\r\n%s'
def __init__(self, request, code, msg, headers, content, timestamp=None):
self.request = request
self.code, self.msg = code, msg
self.headers, self.content = headers, content
self.timestamp = timestamp or utils.timestamp()
controller.Msg.__init__(self)
self.replay = False
def _refresh_cookie(self, c, delta):
"""
Takes a cookie string c and a time delta in seconds, and returns
a refreshed cookie string.
"""
c = Cookie.SimpleCookie(str(c))
for i in c.values():
if "expires" in i:
d = parsedate_tz(i["expires"])
if d:
d = mktime_tz(d) + delta
i["expires"] = formatdate(d)
else:
# This can happen when the expires tag is invalid.
# reddit.com sends an expires tag like this: "Thu, 31 Dec
# 2037 23:59:59 GMT", which is valid RFC 1123, but not
# strictly correct according to the cookie spec. Browsers
# appear to parse this tolerantly - maybe we should too.
# For now, we just ignore this.
del i["expires"]
return c.output(header="").strip()
def refresh(self, now=None):
"""
This fairly complex and heuristic function refreshes a server
response for replay.
- It adjusts date, expires and last-modified headers.
- It adjusts cookie expiration.
"""
if not now:
now = time.time()
delta = now - self.timestamp
refresh_headers = [
"date",
"expires",
"last-modified",
]
for i in refresh_headers:
if i in self.headers:
d = parsedate_tz(self.headers[i][0])
if d:
new = mktime_tz(d) + delta
self.headers[i] = [formatdate(new)]
c = []
for i in self.headers.get("set-cookie", []):
c.append(self._refresh_cookie(i, delta))
if c:
self.headers["set-cookie"] = c
def set_replay(self):
self.replay = True
def is_replay(self):
return self.replay
def load_state(self, state):
self.code = state["code"]
self.msg = state["msg"]
self.headers = utils.Headers.from_state(state["headers"])
self.content = base64.decodestring(state["content"])
self.timestamp = state["timestamp"]
def get_state(self):
return dict(
code = self.code,
msg = self.msg,
headers = self.headers.get_state(),
timestamp = self.timestamp,
content = base64.encodestring(self.content)
)
@classmethod
def from_state(klass, request, state):
return klass(
request,
state["code"],
state["msg"],
utils.Headers.from_state(state["headers"]),
base64.decodestring(state["content"]),
state["timestamp"],
)
def __eq__(self, other):
return self.get_state() == other.get_state()
def copy(self):
c = copy.copy(self)
c.headers = self.headers.copy()
return c
def is_response(self):
return True
def assemble(self):
"""
Assembles the response for transmission to the client. We make some
modifications to make sure interception works properly.
"""
headers = self.headers.copy()
utils.try_del(headers, 'accept-encoding')
utils.try_del(headers, 'proxy-connection')
utils.try_del(headers, 'connection')
utils.try_del(headers, 'keep-alive')
utils.try_del(headers, 'transfer-encoding')
content = self.content
if content is not None:
headers["content-length"] = [str(len(content))]
else:
content = ""
if self.request.client_conn.close:
headers["connection"] = ["close"]
proto = "HTTP/1.1 %s %s"%(self.code, str(self.msg))
data = (proto, str(headers), content)
return self.FMT%data
class ClientDisconnect(controller.Msg):
def __init__(self, client_conn):
controller.Msg.__init__(self)
self.client_conn = client_conn
class ClientConnect(controller.Msg):
def __init__(self, address):
"""
address is an (address, port) tuple, or None if this connection has
been replayed from within mitmproxy.
"""
self.address = address
self.close = False
controller.Msg.__init__(self)
def __eq__(self, other):
return self.get_state() == other.get_state()
def load_state(self, state):
self.address = state
def get_state(self):
return list(self.address) if self.address else None
@classmethod
def from_state(klass, state):
if state:
return klass(state)
else:
return None
def copy(self):
return copy.copy(self)
class Error(controller.Msg):
def __init__(self, request, msg, timestamp=None):
self.request, self.msg = request, msg
self.timestamp = timestamp or utils.timestamp()
controller.Msg.__init__(self)
def load_state(self, state):
self.msg = state["msg"]
self.timestamp = state["timestamp"]
def copy(self):
return copy.copy(self)
def get_state(self):
return dict(
msg = self.msg,
timestamp = self.timestamp,
)
@classmethod
def from_state(klass, state):
return klass(
None,
state["msg"],
state["timestamp"],
)
def __eq__(self, other):
return self.get_state() == other.get_state()
class FileLike:
@@ -496,6 +141,21 @@ class FileLike:
#begin nocover
class RequestReplayThread(threading.Thread):
def __init__(self, flow, masterq):
self.flow, self.masterq = flow, masterq
threading.Thread.__init__(self)
def run(self):
try:
server = ServerConnection(self.flow.request)
server.send_request(self.flow.request)
response = server.read_response()
response._send(self.masterq)
except ProxyError, v:
err = flow.Error(self.flow.request, v.msg)
err._send(self.masterq)
class ServerConnection:
def __init__(self, request):
@@ -522,7 +182,7 @@ class ServerConnection:
self.request = request
request.close = self.close
try:
self.wfile.write(request.assemble())
self.wfile.write(request._assemble())
self.wfile.flush()
except socket.error, err:
raise ProxyError(504, 'Error sending data to "%s": %s' % (request.host, err))
@@ -538,7 +198,7 @@ class ServerConnection:
raise ProxyError(502, "Invalid server response: %s."%line)
proto, code, msg = parts
code = int(code)
headers = utils.Headers()
headers = flow.Headers()
headers.read(self.rfile)
if code >= 100 and code <= 199:
return self.read_response()
@@ -546,7 +206,7 @@ class ServerConnection:
content = ""
else:
content = read_http_body(self.rfile, self, headers, True)
return Response(self.request, code, msg, headers, content)
return flow.Response(self.request, code, msg, headers, content)
def terminate(self):
try:
@@ -564,48 +224,58 @@ class ProxyHandler(SocketServer.StreamRequestHandler):
SocketServer.StreamRequestHandler.__init__(self, request, client_address, server)
def handle(self):
cc = ClientConnect(self.client_address)
cc.send(self.mqueue)
cc = flow.ClientConnect(self.client_address)
cc._send(self.mqueue)
while not cc.close:
self.handle_request(cc)
cd = ClientDisconnect(cc)
cd.send(self.mqueue)
cd = flow.ClientDisconnect(cc)
cd._send(self.mqueue)
self.finish()
def handle_request(self, cc):
server, request, err = None, None, None
try:
request = self.read_request(cc)
try:
request = self.read_request(cc)
except IOError, v:
raise IOError, "Reading request: %s"%v
if request is None:
cc.close = True
return
request = request.send(self.mqueue)
cc.requestcount += 1
request = request._send(self.mqueue)
if request is None:
cc.close = True
return
if request.is_response():
if request._is_response():
response = request
request = False
response = response.send(self.mqueue)
response = response._send(self.mqueue)
else:
server = ServerConnection(request)
server.send_request(request)
response = server.read_response()
response = response.send(self.mqueue)
try:
response = server.read_response()
except IOError, v:
raise IOError, "Reading response: %s"%v
response = response._send(self.mqueue)
if response is None:
server.terminate()
if response is None:
cc.close = True
return
self.send_response(response)
except IOError:
except IOError, v:
cc.connection_error = v
cc.close = True
except ProxyError, e:
err = Error(request, e.msg)
err.send(self.mqueue)
cc.close = True
self.send_error(e.code, e.msg)
cc.connection_error = "%s: %s"%(e.code, e.msg)
if request:
err = flow.Error(request, e.msg)
err._send(self.mqueue)
self.send_error(e.code, e.msg)
if server:
server.terminate()
@@ -644,7 +314,7 @@ class ProxyHandler(SocketServer.StreamRequestHandler):
keyfile = self.config.certfile or self.config.cacert,
server_side = True,
ssl_version = ssl.PROTOCOL_SSLv23,
do_handshake_on_connect = False
do_handshake_on_connect = True,
)
if sys.version_info[1] > 6:
kwargs["ciphers"] = self.config.ciphers
@@ -654,9 +324,9 @@ class ProxyHandler(SocketServer.StreamRequestHandler):
method, scheme, host, port, path, httpminor = parse_request_line(self.rfile.readline())
if scheme is None:
scheme = "https"
headers = utils.Headers()
headers = flow.Headers()
headers.read(self.rfile)
if host is None and headers.has_key("host"):
if host is None and "host" in headers:
netloc = headers["host"][0]
if ':' in netloc:
host, port = string.split(netloc, ':')
@@ -669,8 +339,10 @@ class ProxyHandler(SocketServer.StreamRequestHandler):
port = 80
port = int(port)
if host is None:
raise ProxyError(400, 'Invalid request: %s'%request)
if headers.has_key('expect'):
# FIXME: We only specify the first part of the invalid request in this error.
# We should gather up everything read from the socket, and specify it all.
raise ProxyError(400, 'Invalid request: %s'%line)
if "expect" in headers:
expect = ",".join(headers['expect'])
if expect == "100-continue" and httpminor >= 1:
self.wfile.write('HTTP/1.1 100 Continue\r\n')
@@ -681,7 +353,7 @@ class ProxyHandler(SocketServer.StreamRequestHandler):
raise ProxyError(417, 'Unmet expect: %s'%expect)
if httpminor == 0:
client_conn.close = True
if headers.has_key('connection'):
if "connection" in headers:
for value in ",".join(headers['connection']).split(","):
value = value.strip()
if value == "close":
@@ -689,10 +361,10 @@ class ProxyHandler(SocketServer.StreamRequestHandler):
if value == "keep-alive":
client_conn.close = False
content = read_http_body(self.rfile, client_conn, headers, False)
return Request(client_conn, host, port, scheme, method, path, headers, content)
return flow.Request(client_conn, host, port, scheme, method, path, headers, content)
def send_response(self, response):
self.wfile.write(response.assemble())
self.wfile.write(response._assemble())
self.wfile.flush()
def terminate(self, connection, wfile, rfile):
@@ -771,7 +443,6 @@ def certificate_option_group(parser):
def process_certificate_option_group(parser, options):
conf = {}
if options.cert:
options.cert = os.path.expanduser(options.cert)
if not os.path.exists(options.cert):


@@ -1,27 +1,55 @@
"""
The mitmproxy scripting interface is simple - a serialized representation
of a flow is passed to the script on stdin, and a possibly modified flow is
then read by mitmproxy from the script's stdout. This module provides two
convenience functions to make loading and returning data from scripts
simple.
"""
import sys, base64
import flow
import os, traceback
class ScriptError(Exception):
pass
#begin nocover
def load_flow():
class Script:
"""
Load a flow from stdin. Returns a Flow object.
"""
data = sys.stdin.read()
return flow.Flow.script_deserialize(data)
The instantiator should do something along these lines:
def return_flow(f):
s = Script(path, master)
s.load()
s.run("start")
"""
Print a flow to stdout.
"""
print >> sys.stdout, f.script_serialize()
def __init__(self, path, ctx):
self.path, self.ctx = path, ctx
self.ns = None
def load(self):
"""
Loads a module.
Raises ScriptError on failure, with argument equal to an error
message that may be a formatted traceback.
"""
path = os.path.expanduser(self.path)
if not os.path.exists(path):
raise ScriptError("No such file: %s"%self.path)
if not os.path.isfile(path):
raise ScriptError("Not a file: %s"%self.path)
ns = {}
try:
execfile(path, ns, ns)
except Exception, v:
raise ScriptError(traceback.format_exc(v))
self.ns = ns
def run(self, name, *args, **kwargs):
"""
Runs a plugin method.
Returns:
(True, retval) on success.
(False, None) on nonexistent method.
(False, (exc, traceback string)) if there was an exception.
"""
f = self.ns.get(name)
if f:
try:
return (True, f(self.ctx, *args, **kwargs))
except Exception, v:
return (False, (v, traceback.format_exc(v)))
else:
return (False, None)
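The load/run contract of the new Script class can be illustrated with a small stand-in (MiniScript is a hypothetical name; it uses exec on a string where the real class uses execfile on a path, but mirrors the same (ok, retval) return convention):

```python
import traceback

class MiniScript:
    # Simplified stand-in for the Script class above: load code into a
    # namespace, then dispatch named hook functions by name.
    def __init__(self, code, ctx):
        self.code, self.ctx = code, ctx
        self.ns = {}

    def load(self):
        exec(self.code, self.ns, self.ns)

    def run(self, name, *args, **kwargs):
        f = self.ns.get(name)
        if not f:
            return (False, None)
        try:
            return (True, f(self.ctx, *args, **kwargs))
        except Exception as e:
            return (False, (e, traceback.format_exc()))

s = MiniScript("def start(ctx):\n    return ctx + 1", ctx=41)
s.load()
print(s.run("start"))    # -> (True, 42)
print(s.run("missing"))  # -> (False, None)
```

The three-way return value lets the caller distinguish a successful hook, a hook that simply isn't defined, and a hook that raised.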


@@ -1,20 +1,23 @@
# Copyright (C) 2010 Aldo Cortesi
#
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import re, os, subprocess, datetime, textwrap, errno, sys, time, functools
import re, os, subprocess, datetime, urlparse, string
import time, functools, cgi, textwrap
import json
CERT_SLEEP_TIME = 1
CERT_EXPIRY = str(365 * 3)
def timestamp():
"""
@@ -64,7 +67,7 @@ def cleanBin(s):
if i not in "\n\r\t":
parts.append(".")
return "".join(parts)
TAG = r"""
<\s*
@@ -111,6 +114,18 @@ def pretty_xmlish(s):
return data
def pretty_json(s):
try:
p = json.loads(s)
except ValueError:
return None
return json.dumps(p, sort_keys=True, indent=4).split("\n")
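The pretty_json helper added above is self-contained enough to sketch in isolation: it returns the re-indented, key-sorted lines on valid JSON, or None when the body doesn't parse.

```python
import json

def pretty_json(s):
    # Returns the pretty-printed lines, or None if s is not valid JSON.
    try:
        p = json.loads(s)
    except ValueError:
        return None
    return json.dumps(p, sort_keys=True, indent=4).split("\n")

print(pretty_json('{"b": 1, "a": 2}'))
print(pretty_json("not json"))  # -> None
```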
def urldecode(s):
return cgi.parse_qsl(s)
def hexdump(s):
"""
Returns a list of tuples:
@@ -130,182 +145,12 @@ def hexdump(s):
return parts
def isStringLike(anobj):
try:
# Avoid succeeding expensively if anobj is large.
anobj[:0]+''
except:
return 0
else:
return 1
def del_all(dict, keys):
for key in keys:
if key in dict:
del dict[key]
def isSequenceLike(anobj):
"""
Is anobj a non-string sequence type (list, tuple, iterator, or
similar)? Crude, but mostly effective.
"""
if not hasattr(anobj, "next"):
if isStringLike(anobj):
return 0
try:
anobj[:0]
except:
return 0
return 1
def _caseless(s):
return s.lower()
def try_del(dict, key):
try:
del dict[key]
except KeyError:
pass
class MultiDict:
"""
Simple wrapper around a dictionary to make holding multiple objects per
key easier.
Note that this class assumes that keys are strings.
Keys have no order, but the order in which values are added to a key is
preserved.
"""
# This ridiculous bit of subterfuge is needed to prevent the class from
# treating this as a bound method.
_helper = (str,)
def __init__(self):
self._d = dict()
def copy(self):
m = self.__class__()
m._d = self._d.copy()
return m
def clear(self):
return self._d.clear()
def get(self, key, d=None):
key = self._helper[0](key)
return self._d.get(key, d)
def __contains__(self, key):
key = self._helper[0](key)
return self._d.__contains__(key)
def __eq__(self, other):
return dict(self) == dict(other)
def __delitem__(self, key):
self._d.__delitem__(key)
def __getitem__(self, key):
key = self._helper[0](key)
return self._d.__getitem__(key)
def __setitem__(self, key, value):
if not isSequenceLike(value):
raise ValueError, "Cannot insert non-sequence."
key = self._helper[0](key)
return self._d.__setitem__(key, value)
def has_key(self, key):
key = self._helper[0](key)
return self._d.has_key(key)
def setdefault(self, key, default=None):
key = self._helper[0](key)
return self._d.setdefault(key, default)
def keys(self):
return self._d.keys()
def extend(self, key, value):
if not self.has_key(key):
self[key] = []
self[key].extend(value)
def append(self, key, value):
self.extend(key, [value])
def itemPairs(self):
"""
Yield all possible pairs of items.
"""
for i in self.keys():
for j in self[i]:
yield (i, j)
def get_state(self):
return list(self.itemPairs())
@classmethod
def from_state(klass, state):
md = klass()
for i in state:
md.append(*i)
return md
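The get_state/from_state pair above serializes a MultiDict as a flat list of (key, value) pairs. A toy stand-in (MiniMultiDict is a hypothetical name, supporting only the methods needed for the round-trip) shows the contract:

```python
class MiniMultiDict:
    # One key maps to an ordered list of values; state is the flat
    # list of (key, value) pairs, preserving per-key value order.
    def __init__(self):
        self._d = {}

    def append(self, key, value):
        self._d.setdefault(key, []).append(value)

    def get_state(self):
        return [(k, v) for k in self._d for v in self._d[k]]

    @classmethod
    def from_state(cls, state):
        md = cls()
        for k, v in state:
            md.append(k, v)
        return md

md = MiniMultiDict()
md.append("h", "one")
md.append("h", "two")
state = md.get_state()
print(state)  # -> [('h', 'one'), ('h', 'two')]
print(MiniMultiDict.from_state(state).get_state() == state)  # -> True
```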
class Headers(MultiDict):
"""
A dictionary-like class for keeping track of HTTP headers.
It is case-insensitive, and __repr__ formats the headers correctly for
output to the server.
"""
_helper = (_caseless,)
def __repr__(self):
"""
Returns a string containing a formatted header string.
"""
headerElements = []
for key in sorted(self.keys()):
for val in self[key]:
headerElements.append(key + ": " + val)
headerElements.append("")
return "\r\n".join(headerElements)
def match_re(self, expr):
"""
Match the regular expression against each header (key, value) pair.
"""
for k, v in self.itemPairs():
s = "%s: %s"%(k, v)
if re.search(expr, s):
return True
return False
def read(self, fp):
"""
Read a set of headers from a file pointer. Stop once a blank line
is reached.
"""
name = ''
while 1:
line = fp.readline()
if not line or line == '\r\n' or line == '\n':
break
if line[0] in ' \t':
# continued header
self[name][-1] = self[name][-1] + '\r\n ' + line.strip()
else:
i = line.find(':')
# We're being liberal in what we accept, here.
if i > 0:
name = line[:i]
value = line[i+1:].strip()
if self.has_key(name):
# merge value
self.append(name, value)
else:
self[name] = [value]
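The reading loop above handles three cases: a normal "Name: value" line, a repeated header name (values accumulate), and a continuation line starting with whitespace that extends the previous value. A minimal sketch of that logic, using a plain dict of lists in place of the Headers class:

```python
import io

def read_headers(fp):
    # Stop at EOF or a blank line; lowercase names; merge repeats;
    # fold leading-whitespace lines into the previous value.
    headers, name = {}, ''
    while True:
        line = fp.readline()
        if not line or line in ('\r\n', '\n'):
            break
        if line[0] in ' \t':
            headers[name][-1] += '\r\n ' + line.strip()
        else:
            i = line.find(':')
            if i > 0:
                name = line[:i].lower()
                headers.setdefault(name, []).append(line[i + 1:].strip())
    return headers

h = read_headers(io.StringIO("Header: one\nHeader: two\n\n"))
print(h)  # -> {'header': ['one', 'two']}
```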
def pretty_size(size):
suffixes = [
@@ -340,7 +185,7 @@ class Data:
if not os.path.exists(fullpath):
raise ValueError, "dataPath: %s does not exist."%fullpath
return fullpath
data = Data(__name__)
pkg_data = Data(__name__)
def dummy_ca(path):
@@ -365,9 +210,9 @@ def dummy_ca(path):
"req",
"-new",
"-x509",
"-config", data.path("resources/ca.cnf"),
"-config", pkg_data.path("resources/ca.cnf"),
"-nodes",
"-days", "9999",
"-days", CERT_EXPIRY,
"-out", path,
"-newkey", "rsa:1024",
"-keyout", path,
@@ -437,8 +282,10 @@ def dummy_cert(certdir, ca, commonname):
confpath = os.path.join(certdir, commonname + ".cnf")
reqpath = os.path.join(certdir, commonname + ".req")
template = open(data.path("resources/cert.cnf")).read()
f = open(confpath, "w").write(template%(dict(commonname=commonname)))
template = open(pkg_data.path("resources/cert.cnf")).read()
f = open(confpath, "w")
f.write(template%(dict(commonname=commonname)))
f.close()
if ca:
# Create a dummy signed certificate. Uses same key as the signing CA
@@ -462,7 +309,7 @@ def dummy_cert(certdir, ca, commonname):
"x509",
"-req",
"-in", reqpath,
"-days", "9999",
"-days", CERT_EXPIRY,
"-out", certpath,
"-CA", ca,
"-CAcreateserial",
@@ -485,7 +332,7 @@ def dummy_cert(certdir, ca, commonname):
"-x509",
"-config", confpath,
"-nodes",
"-days", "9999",
"-days", CERT_EXPIRY,
"-out", certpath,
"-newkey", "rsa:1024",
"-keyout", certpath,
@@ -536,3 +383,27 @@ class LRUCache:
cache.pop(d)
return ret
return wrap
def parse_url(url):
"""
Returns a (scheme, host, port, path) tuple, or None on error.
"""
scheme, netloc, path, params, query, fragment = urlparse.urlparse(url)
if not scheme:
return None
if ':' in netloc:
host, port = string.split(netloc, ':')
port = int(port)
else:
host = netloc
if scheme == "https":
port = 443
else:
port = 80
path = urlparse.urlunparse(('', '', path, params, query, fragment))
if not path.startswith("/"):
path = "/" + path
return scheme, host, port, path
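The new parse_url helper targets Python 2's urlparse module; the same logic can be sketched against Python 3's urllib.parse (the function name is kept for illustration only):

```python
from urllib.parse import urlparse, urlunparse

def parse_url(url):
    # Returns a (scheme, host, port, path) tuple, or None on error,
    # defaulting the port from the scheme when none is given.
    scheme, netloc, path, params, query, fragment = urlparse(url)
    if not scheme:
        return None
    if ':' in netloc:
        host, port = netloc.split(':')
        port = int(port)
    else:
        host = netloc
        port = 443 if scheme == "https" else 80
    path = urlunparse(('', '', path, params, query, fragment))
    if not path.startswith("/"):
        path = "/" + path
    return scheme, host, port, path

print(parse_url("https://example.com/foo?x=1"))
# -> ('https', 'example.com', 443, '/foo?x=1')
```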


@@ -1,2 +1,2 @@
IVERSION = (0, 5)
IVERSION = (0, 6)
VERSION = ".".join([str(i) for i in IVERSION])


@@ -72,7 +72,7 @@ packages, package_data = findPackages("libmproxy")
setup(
name = "mitmproxy",
version = version.VERSION,
description = "An interactive SSL-capable intercepting HTTP proxy for penetration testers and software developers.",
description = "An interactive, SSL-capable man-in-the-middle HTTP proxy for penetration testers and software developers.",
long_description = long_description,
author = "Aldo Cortesi",
author_email = "aldo@corte.si",
@@ -81,10 +81,14 @@ setup(
package_data = package_data,
scripts = ["mitmproxy", "mitmdump"],
classifiers = [
"Development Status :: 4 - Beta",
"License :: OSI Approved :: GNU General Public License (GPL)",
"Development Status :: 5 - Production/Stable",
"Environment :: Console",
"Environment :: Console :: Curses",
"Programming Language :: Python",
"Topic :: Security",
"Topic :: Internet :: WWW/HTTP",
"Topic :: Internet :: Proxy Servers",
"Topic :: Software Development :: Testing"
]
)


@@ -1,9 +0,0 @@
#!/usr/bin/env python
import sys
sys.path.insert(0, "..")
from libmproxy import script
f = script.load_flow()
f.request.host = "TESTOK"
print >> sys.stderr, "DEBUG"
script.return_flow(f)

test/scripts/a.py Normal file

@@ -0,0 +1,9 @@
var = 0
def here(ctx):
global var
var += 1
return var
def errargs():
pass

test/scripts/all.py Normal file

@@ -0,0 +1,20 @@
log = []
def clientconnect(ctx, cc):
ctx.log("XCLIENTCONNECT")
log.append("clientconnect")
def request(ctx, r):
ctx.log("XREQUEST")
log.append("request")
def response(ctx, r):
ctx.log("XRESPONSE")
log.append("response")
def clientdisconnect(ctx, cc):
ctx.log("XCLIENTDISCONNECT")
log.append("clientdisconnect")
def error(ctx, cc):
ctx.log("XERROR")
log.append("error")


@@ -1,2 +0,0 @@
#!/usr/bin/env python
print "NONSENSE"


@@ -1,6 +0,0 @@
#!/usr/bin/env python
import sys
sys.path.insert(0, "..")
print >> sys.stderr, "output"
sys.exit(1)

test/scripts/loaderr.py Normal file

@@ -0,0 +1,3 @@
a = x

test/scripts/starterr.py Normal file

@@ -0,0 +1,3 @@
def start(ctx):
raise ValueError


@@ -0,0 +1,3 @@
a +


@@ -1,4 +1,4 @@
import socket, os, cStringIO, tempfile
import socket
from SocketServer import BaseServer
from BaseHTTPServer import HTTPServer
import handler


@@ -1,4 +1,4 @@
import socket, os, cStringIO, tempfile
import socket
from SocketServer import BaseServer
from BaseHTTPServer import HTTPServer
import ssl


@@ -1,4 +1,4 @@
from libmproxy import console, proxy, filt, flow
from libmproxy import console, filt, flow
import tutils
import libpry
@@ -12,7 +12,7 @@ class uState(libpry.AutoTree):
"""
c = console.ConsoleState()
f = self._add_request(c)
assert f.request in c.flow_map
assert f.request in c._flow_map
assert c.get_focus() == (f, 0)
def test_focus(self):
@@ -103,8 +103,8 @@ class uformat_flow(libpry.AutoTree):
f.request = f.response.request
f.backup()
f.request.set_replay()
f.response.set_replay()
f.request._set_replay()
f.response._set_replay()
assert ('method', '[replay]') in console.format_flow(f, True)
assert ('method', '[replay]') in console.format_flow(f, True, True)
@@ -114,7 +114,7 @@ class uformat_flow(libpry.AutoTree):
assert ('text', ' text/html') in console.format_flow(f, True, True)
f.response = None
f.error = proxy.Error(f.request, "error")
f.error = flow.Error(f.request, "error")
assert ('error', 'error') in console.format_flow(f, True, True)
@@ -126,9 +126,10 @@ class uPathCompleter(libpry.AutoTree):
c.reset()
assert c.complete("./completion/a") == "./completion/aaa"
assert c.complete("./completion/a") == "./completion/aab"
c.reset()
assert c.complete("./completion/aaa") == "./completion/aaa"
assert c.complete("./completion/aaa") == "./completion/aab"
assert c.complete("./completion/aaa") == "./completion/aaa"
def test_completion(self):


@@ -1,13 +1,13 @@
import os
from cStringIO import StringIO
import libpry
from libmproxy import dump, flow
from libmproxy import dump, flow, proxy
import tutils
class uStrFuncs(libpry.AutoTree):
def test_all(self):
t = tutils.tresp()
t.set_replay()
t._set_replay()
dump.str_response(t)
t = tutils.treq()
@@ -22,19 +22,30 @@ class uDumpMaster(libpry.AutoTree):
req = tutils.treq()
req.content = content
cc = req.client_conn
cc.connection_error = "error"
resp = tutils.tresp(req)
resp.content = content
m.handle_clientconnect(cc)
m.handle_request(req)
m.handle_response(resp)
m.handle_clientdisconnect(flow.ClientDisconnect(cc))
def _dummy_cycle(self, filt, content, **options):
def _dummy_cycle(self, n, filt, content, **options):
cs = StringIO()
o = dump.Options(**options)
m = dump.DumpMaster(None, o, filt, outfile=cs)
self._cycle(m, content)
for i in range(n):
self._cycle(m, content)
return cs.getvalue()
def _flowfile(self, path):
f = open(path, "w")
fw = flow.FlowWriter(f)
t = tutils.tflow_full()
t.response = tutils.tresp(t.request)
fw.add(t)
f.close()
def test_replay(self):
cs = StringIO()
@@ -43,12 +54,7 @@ class uDumpMaster(libpry.AutoTree):
t = self.tmpdir()
p = os.path.join(t, "rep")
f = open(p, "w")
fw = flow.FlowWriter(f)
t = tutils.tflow_full()
t.response = tutils.tresp(t.request)
fw.add(t)
f.close()
self._flowfile(p)
o = dump.Options(server_replay=p, kill=True)
m = dump.DumpMaster(None, o, None, outfile=cs)
@@ -63,66 +69,70 @@ class uDumpMaster(libpry.AutoTree):
o = dump.Options(client_replay=p, kill=False)
m = dump.DumpMaster(None, o, None, outfile=cs)
def test_read(self):
t = self.tmpdir()
p = os.path.join(t, "read")
self._flowfile(p)
assert "GET" in self._dummy_cycle(0, None, "", verbosity=1, rfile=p)
libpry.raises(
dump.DumpError, self._dummy_cycle,
0, None, "", verbosity=1, rfile="/nonexistent"
)
def test_options(self):
o = dump.Options(verbosity = 2)
assert o.verbosity == 2
libpry.raises(AttributeError, dump.Options, nonexistent = 2)
def test_filter(self):
assert not "GET" in self._dummy_cycle("~u foo", "", verbosity=1)
assert not "GET" in self._dummy_cycle(1, "~u foo", "", verbosity=1)
def test_basic(self):
for i in (1, 2, 3):
assert "GET" in self._dummy_cycle("~s", "", verbosity=i)
assert "GET" in self._dummy_cycle("~s", "\x00\x00\x00", verbosity=i)
assert "GET" in self._dummy_cycle("~s", "ascii", verbosity=i)
assert "GET" in self._dummy_cycle(1, "~s", "", verbosity=i, eventlog=True)
assert "GET" in self._dummy_cycle(1, "~s", "\x00\x00\x00", verbosity=i)
assert "GET" in self._dummy_cycle(1, "~s", "ascii", verbosity=i)
def test_write(self):
d = self.tmpdir()
p = os.path.join(d, "a")
self._dummy_cycle(None, "", wfile=p, verbosity=0)
self._dummy_cycle(1, None, "", wfile=p, verbosity=0)
assert len(list(flow.FlowReader(open(p)).stream())) == 1
def test_write_err(self):
libpry.raises(
dump.DumpError,
self._dummy_cycle,
1,
None,
"",
wfile = "nonexistentdir/foo"
)
def test_request_script(self):
ret = self._dummy_cycle(None, "", request_script="scripts/a", verbosity=1)
assert "TESTOK" in ret
assert "DEBUG" in ret
def test_script(self):
ret = self._dummy_cycle(
1, None, "",
script="scripts/all.py", verbosity=0, eventlog=True
)
assert "XCLIENTCONNECT" in ret
assert "XREQUEST" in ret
assert "XRESPONSE" in ret
assert "XCLIENTDISCONNECT" in ret
libpry.raises(
dump.DumpError,
self._dummy_cycle, None, "", request_script="nonexistent"
self._dummy_cycle, 1, None, "", script="nonexistent"
)
libpry.raises(
dump.DumpError,
self._dummy_cycle, None, "", request_script="scripts/err_return"
)
def test_response_script(self):
ret = self._dummy_cycle(None, "", response_script="scripts/a", verbosity=1)
assert "TESTOK" in ret
assert "DEBUG" in ret
libpry.raises(
dump.DumpError,
self._dummy_cycle, None, "", response_script="nonexistent"
)
libpry.raises(
dump.DumpError,
self._dummy_cycle, None, "", response_script="scripts/err_return"
self._dummy_cycle, 1, None, "", script="starterr.py"
)
def test_stickycookie(self):
ret = self._dummy_cycle(None, "", stickycookie = ".*")
self._dummy_cycle(1, None, "", stickycookie = ".*")
def test_stickyauth(self):
ret = self._dummy_cycle(None, "", stickyauth = ".*")
self._dummy_cycle(1, None, "", stickyauth = ".*")

test/test_encoding.py Normal file

@@ -0,0 +1,29 @@
from libmproxy import encoding
import libpry
import cStringIO
class uidentity(libpry.AutoTree):
def test_simple(self):
assert "string" == encoding.decode("identity", "string")
assert "string" == encoding.encode("identity", "string")
def test_fallthrough(self):
assert None == encoding.decode("nonexistent encoding", "string")
class ugzip(libpry.AutoTree):
def test_simple(self):
assert "string" == encoding.decode("gzip", encoding.encode("gzip", "string"))
assert None == encoding.decode("gzip", "bogus")
class udeflate(libpry.AutoTree):
def test_simple(self):
assert "string" == encoding.decode("deflate", encoding.encode("deflate", "string"))
assert "string" == encoding.decode("deflate", encoding.encode("deflate", "string")[2:-4])
assert None == encoding.decode("deflate", "bogus")
tests = [
uidentity(),
ugzip(),
udeflate()
]
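The deflate test above checks that decoding works both with and without the two-byte zlib header and four-byte checksum, since some servers send "raw" deflate bodies. A sketch of that behavior with zlib (encode_deflate/decode_deflate are illustrative names, not mitmproxy's encoding API, which takes the encoding name as a string):

```python
import zlib

def encode_deflate(s):
    return zlib.compress(s)

def decode_deflate(s):
    # Try a standard zlib-wrapped stream first; fall back to raw
    # deflate (negative wbits) for bodies with the wrapper stripped.
    try:
        return zlib.decompress(s)
    except zlib.error:
        return zlib.decompress(s, -15)

data = encode_deflate(b"string")
print(decode_deflate(data))        # -> b'string'
print(decode_deflate(data[2:-4]))  # -> b'string'
```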


@@ -1,5 +1,5 @@
import cStringIO
from libmproxy import filt, proxy, utils
from libmproxy import filt, flow
import libpry
@@ -72,10 +72,10 @@ class uParsing(libpry.AutoTree):
class uMatching(libpry.AutoTree):
def req(self):
conn = proxy.ClientConnect(("one", 2222))
headers = utils.Headers()
conn = flow.ClientConnect(("one", 2222))
headers = flow.Headers()
headers["header"] = ["qvalue"]
return proxy.Request(
return flow.Request(
conn,
"host",
80,
@@ -88,9 +88,9 @@ class uMatching(libpry.AutoTree):
def resp(self):
q = self.req()
headers = utils.Headers()
headers = flow.Headers()
headers["header_response"] = ["svalue"]
return proxy.Response(
return flow.Response(
q,
200,
"message",


@@ -1,6 +1,7 @@
import Queue
import Queue, time, textwrap
from cStringIO import StringIO
from libmproxy import console, proxy, filt, flow, controller
import email.utils
from libmproxy import console, proxy, filt, flow, controller, utils
import tutils
import libpry
@@ -130,29 +131,12 @@ class uServerPlaybackState(libpry.AutoTree):
class uFlow(libpry.AutoTree):
def test_run_script(self):
f = tutils.tflow()
f.response = tutils.tresp()
f.request = f.response.request
se = f.run_script("scripts/a")
assert "DEBUG" == se.strip()
assert f.request.host == "TESTOK"
def test_run_script_err(self):
f = tutils.tflow()
f.response = tutils.tresp()
f.request = f.response.request
libpry.raises("returned error", f.run_script,"scripts/err_return")
libpry.raises("invalid response", f.run_script,"scripts/err_data")
libpry.raises("no such file", f.run_script,"nonexistent")
libpry.raises("permission denied", f.run_script,"scripts/nonexecutable")
def test_match(self):
f = tutils.tflow()
f.response = tutils.tresp()
f.request = f.response.request
assert not f.match(filt.parse("~b test"))
assert not f.match(None)
assert f.match(None)
def test_backup(self):
f = tutils.tflow()
@@ -169,19 +153,19 @@ class uFlow(libpry.AutoTree):
def test_getset_state(self):
f = tutils.tflow()
f.response = tutils.tresp(f.request)
state = f.get_state()
assert f == flow.Flow.from_state(state)
state = f._get_state()
assert f._get_state() == flow.Flow._from_state(state)._get_state()
f.response = None
f.error = proxy.Error(f.request, "error")
state = f.get_state()
assert f == flow.Flow.from_state(state)
f.error = flow.Error(f.request, "error")
state = f._get_state()
assert f._get_state() == flow.Flow._from_state(state)._get_state()
f2 = tutils.tflow()
f2.error = proxy.Error(f.request, "e2")
f2.error = flow.Error(f.request, "e2")
assert not f == f2
f.load_state(f2.get_state())
assert f == f2
f._load_state(f2._get_state())
assert f._get_state() == f2._get_state()
def test_kill(self):
s = flow.State()
@@ -195,11 +179,27 @@ class uFlow(libpry.AutoTree):
f.intercept()
f.response = tutils.tresp()
f.request = f.response.request
f.request.ack()
f.request._ack()
assert not f.response.acked
f.kill(fm)
assert f.response.acked
def test_killall(self):
s = flow.State()
fm = flow.FlowMaster(None, s)
r = tutils.treq()
fm.handle_request(r)
r = tutils.treq()
fm.handle_request(r)
for i in s.view:
assert not i.request.acked
s.killall(fm)
for i in s.view:
assert i.request.acked
def test_accept_intercept(self):
f = tutils.tflow()
f.request = tutils.treq()
@@ -210,7 +210,7 @@ class uFlow(libpry.AutoTree):
f.response = tutils.tresp()
f.request = f.response.request
f.intercept()
f.request.ack()
f.request._ack()
assert not f.response.acked
f.accept_intercept()
assert f.response.acked
@@ -219,10 +219,28 @@ class uFlow(libpry.AutoTree):
f = flow.Flow(None)
f.request = tutils.treq()
def test_replace(self):
f = tutils.tflow_full()
f.request.headers["foo"] = ["foo"]
f.request.content = "afoob"
f.response.headers["foo"] = ["foo"]
f.response.content = "afoob"
assert f.replace("foo", "bar") == 6
assert f.request.headers["bar"] == ["bar"]
assert f.request.content == "abarb"
assert f.response.headers["bar"] == ["bar"]
assert f.response.content == "abarb"
f = tutils.tflow_err()
f.replace("error", "bar")
assert f.error.msg == "bar"
class uState(libpry.AutoTree):
def test_backup(self):
bc = proxy.ClientConnect(("address", 22))
c = flow.State()
req = tutils.treq()
f = c.add_request(req)
@@ -236,64 +254,59 @@ class uState(libpry.AutoTree):
connect -> request -> response
"""
bc = proxy.ClientConnect(("address", 22))
bc = flow.ClientConnect(("address", 22))
c = flow.State()
c.clientconnect(bc)
assert len(c.client_connections) == 1
req = tutils.treq(bc)
f = c.add_request(req)
assert f
assert c.flow_count() == 1
assert c.flow_map.get(req)
assert c._flow_map.get(req)
assert c.active_flow_count() == 1
newreq = tutils.treq()
assert c.add_request(newreq)
assert c.flow_map.get(newreq)
assert c._flow_map.get(newreq)
assert c.active_flow_count() == 2
resp = tutils.tresp(req)
assert c.add_response(resp)
assert c.flow_count() == 2
assert c.flow_map.get(resp.request)
assert c._flow_map.get(resp.request)
assert c.active_flow_count() == 1
unseen_resp = tutils.tresp()
assert not c.add_response(unseen_resp)
assert not c.flow_map.get(unseen_resp.request)
assert not c._flow_map.get(unseen_resp.request)
assert c.active_flow_count() == 1
resp = tutils.tresp(newreq)
assert c.add_response(resp)
assert c.active_flow_count() == 0
dc = proxy.ClientDisconnect(bc)
c.clientdisconnect(dc)
assert not c.client_connections
def test_err(self):
bc = proxy.ClientConnect(("address", 22))
c = flow.State()
req = tutils.treq()
f = c.add_request(req)
e = proxy.Error(f.request, "message")
e = flow.Error(f.request, "message")
assert c.add_error(e)
e = proxy.Error(tutils.tflow().request, "message")
e = flow.Error(tutils.tflow().request, "message")
assert not c.add_error(e)
def test_set_limit(self):
c = flow.State()
req = tutils.treq()
c.clientconnect(req.client_conn)
assert len(c.view) == 0
f = c.add_request(req)
c.add_request(req)
assert len(c.view) == 1
c.set_limit("~s")
assert c.limit_txt == "~s"
assert len(c.view) == 0
resp = tutils.tresp(req)
c.add_response(resp)
@@ -302,7 +315,6 @@ class uState(libpry.AutoTree):
assert len(c.view) == 1
req = tutils.treq()
c.clientconnect(req.client_conn)
c.add_request(req)
assert len(c.view) == 2
c.set_limit("~q")
@@ -327,14 +339,14 @@ class uState(libpry.AutoTree):
def _add_response(self, state):
req = tutils.treq()
f = state.add_request(req)
state.add_request(req)
resp = tutils.tresp(req)
state.add_response(resp)
def _add_error(self, state):
req = tutils.treq()
f = state.add_request(req)
f.error = proxy.Error(f.request, "msg")
f.error = flow.Error(f.request, "msg")
def test_clear(self):
c = flow.State()
@@ -358,7 +370,7 @@ class uState(libpry.AutoTree):
c.clear()
c.load_flows(flows)
assert isinstance(c.flow_list[0], flow.Flow)
assert isinstance(c._flow_list[0], flow.Flow)
def test_accept_all(self):
c = flow.State()
@@ -385,6 +397,7 @@ class uSerialize(libpry.AutoTree):
def test_roundtrip(self):
sio = StringIO()
f = tutils.tflow()
f.request.content = "".join(chr(i) for i in range(255))
w = flow.FlowWriter(sio)
w.add(f)
@@ -392,14 +405,17 @@ class uSerialize(libpry.AutoTree):
r = flow.FlowReader(sio)
l = list(r.stream())
assert len(l) == 1
assert l[0] == f
f2 = l[0]
assert f2._get_state() == f._get_state()
assert f2.request._assemble() == f.request._assemble()
def test_load_flows(self):
r = self._treader()
s = flow.State()
fm = flow.FlowMaster(None, s)
fm.load_flows(r)
assert len(s.flow_list) == 6
assert len(s._flow_list) == 6
def test_error(self):
@@ -414,12 +430,38 @@ class uSerialize(libpry.AutoTree):
class uFlowMaster(libpry.AutoTree):
def test_load_script(self):
s = flow.State()
fm = flow.FlowMaster(None, s)
assert not fm.load_script("scripts/a.py")
assert fm.load_script("nonexistent")
assert "ValueError" in fm.load_script("scripts/starterr.py")
def test_script(self):
s = flow.State()
fm = flow.FlowMaster(None, s)
assert not fm.load_script("scripts/all.py")
req = tutils.treq()
fm.handle_clientconnect(req.client_conn)
assert fm.script.ns["log"][-1] == "clientconnect"
f = fm.handle_request(req)
assert fm.script.ns["log"][-1] == "request"
resp = tutils.tresp(req)
fm.handle_response(resp)
assert fm.script.ns["log"][-1] == "response"
dc = flow.ClientDisconnect(req.client_conn)
fm.handle_clientdisconnect(dc)
assert fm.script.ns["log"][-1] == "clientdisconnect"
err = flow.Error(f.request, "msg")
fm.handle_error(err)
assert fm.script.ns["log"][-1] == "error"
def test_all(self):
s = flow.State()
fm = flow.FlowMaster(None, s)
fm.anticache = True
fm.anticomp = True
req = tutils.treq()
fm.handle_clientconnect(req.client_conn)
f = fm.handle_request(req)
@@ -432,10 +474,10 @@ class uFlowMaster(libpry.AutoTree):
rx = tutils.tresp()
assert not fm.handle_response(rx)
dc = proxy.ClientDisconnect(req.client_conn)
dc = flow.ClientDisconnect(req.client_conn)
fm.handle_clientdisconnect(dc)
err = proxy.Error(f.request, "msg")
err = flow.Error(f.request, "msg")
fm.handle_error(err)
def test_client_playback(self):
@@ -452,7 +494,7 @@ class uFlowMaster(libpry.AutoTree):
fm.tick(q)
assert fm.state.flow_count()
fm.handle_error(proxy.Error(f.request, "error"))
fm.handle_error(flow.Error(f.request, "error"))
def test_server_playback(self):
s = flow.State()
@@ -476,7 +518,7 @@ class uFlowMaster(libpry.AutoTree):
assert fm.do_server_playback(tutils.tflow())
q = Queue.Queue()
fm.tick(q)
assert controller.exit
assert controller.should_exit
fm.stop_server_playback()
assert not fm.server_playback
@@ -494,7 +536,7 @@ class uFlowMaster(libpry.AutoTree):
tf = tutils.tflow_full()
tf.response.headers["set-cookie"] = ["foo=bar"]
fm.handle_request(tf.request)
f = fm.handle_response(tf.response)
fm.handle_response(tf.response)
assert fm.stickycookie_state.jar
assert not "cookie" in tf.request.headers
fm.handle_request(tf.request)
@@ -520,6 +562,317 @@ class uFlowMaster(libpry.AutoTree):
fm.handle_request(f.request)
assert f.request.headers["authorization"] == ["foo"]
class uRequest(libpry.AutoTree):
def test_simple(self):
h = flow.Headers()
h["test"] = ["test"]
c = flow.ClientConnect(("addr", 2222))
r = flow.Request(c, "host", 22, "https", "GET", "/", h, "content")
u = r.get_url()
assert r.set_url(u)
assert not r.set_url("")
assert r.get_url() == u
assert r._assemble()
r2 = r.copy()
assert r == r2
def test_anticache(self):
h = flow.Headers()
r = flow.Request(None, "host", 22, "https", "GET", "/", h, "content")
h["if-modified-since"] = ["test"]
h["if-none-match"] = ["test"]
r.anticache()
assert not "if-modified-since" in r.headers
assert not "if-none-match" in r.headers
def test_getset_state(self):
h = flow.Headers()
h["test"] = ["test"]
c = flow.ClientConnect(("addr", 2222))
r = flow.Request(c, "host", 22, "https", "GET", "/", h, "content")
state = r._get_state()
assert flow.Request._from_state(state) == r
r.client_conn = None
state = r._get_state()
assert flow.Request._from_state(state) == r
r2 = flow.Request(c, "testing", 20, "http", "PUT", "/foo", h, "test")
assert not r == r2
r._load_state(r2._get_state())
assert r == r2
r2.client_conn = None
r._load_state(r2._get_state())
assert not r.client_conn
def test_replace(self):
r = tutils.treq()
r.path = "path/foo"
r.headers["Foo"] = ["fOo"]
r.content = "afoob"
assert r.replace("foo(?i)", "boo") == 4
assert r.path == "path/boo"
assert not "foo" in r.content
assert r.headers["boo"] == ["boo"]
def test_decodeencode(self):
r = tutils.treq()
r.headers["content-encoding"] = ["identity"]
r.content = "falafel"
r.decode()
assert not r.headers["content-encoding"]
assert r.content == "falafel"
r = tutils.treq()
r.headers["content-encoding"] = ["identity"]
r.content = "falafel"
r.encode("identity")
assert r.headers["content-encoding"] == ["identity"]
assert r.content == "falafel"
r = tutils.treq()
r.headers["content-encoding"] = ["identity"]
r.content = "falafel"
r.encode("gzip")
assert r.headers["content-encoding"] == ["gzip"]
assert r.content != "falafel"
r.decode()
assert not r.headers["content-encoding"]
assert r.content == "falafel"
class uResponse(libpry.AutoTree):
def test_simple(self):
h = flow.Headers()
h["test"] = ["test"]
c = flow.ClientConnect(("addr", 2222))
req = flow.Request(c, "host", 22, "https", "GET", "/", h, "content")
resp = flow.Response(req, 200, "msg", h.copy(), "content")
assert resp._assemble()
resp2 = resp.copy()
assert resp2 == resp
def test_refresh(self):
r = tutils.tresp()
n = time.time()
r.headers["date"] = [email.utils.formatdate(n)]
pre = r.headers["date"]
r.refresh(n)
assert pre == r.headers["date"]
r.refresh(n+60)
d = email.utils.parsedate_tz(r.headers["date"][0])
d = email.utils.mktime_tz(d)
# Weird that this is not exact...
assert abs(60-(d-n)) <= 1
r.headers["set-cookie"] = ["MOO=BAR; Expires=Tue, 08-Mar-2011 00:20:38 GMT; Path=foo.com; Secure"]
r.refresh()
def test_refresh_cookie(self):
r = tutils.tresp()
# Invalid expires format, sent to us by Reddit.
c = "rfoo=bar; Domain=reddit.com; expires=Thu, 31 Dec 2037 23:59:59 GMT; Path=/"
assert r._refresh_cookie(c, 60)
c = "MOO=BAR; Expires=Tue, 08-Mar-2011 00:20:38 GMT; Path=foo.com; Secure"
assert "00:21:38" in r._refresh_cookie(c, 60)
def test_getset_state(self):
h = flow.Headers()
h["test"] = ["test"]
c = flow.ClientConnect(("addr", 2222))
req = flow.Request(c, "host", 22, "https", "GET", "/", h, "content")
resp = flow.Response(req, 200, "msg", h.copy(), "content")
state = resp._get_state()
assert flow.Response._from_state(req, state) == resp
resp2 = flow.Response(req, 220, "foo", h.copy(), "test")
assert not resp == resp2
resp._load_state(resp2._get_state())
assert resp == resp2
def test_replace(self):
r = tutils.tresp()
r.headers["Foo"] = ["fOo"]
r.content = "afoob"
assert r.replace("foo(?i)", "boo") == 3
assert not "foo" in r.content
assert r.headers["boo"] == ["boo"]
def test_decodeencode(self):
r = tutils.tresp()
r.headers["content-encoding"] = ["identity"]
r.content = "falafel"
r.decode()
assert not r.headers["content-encoding"]
assert r.content == "falafel"
r = tutils.tresp()
r.headers["content-encoding"] = ["identity"]
r.content = "falafel"
r.encode("identity")
assert r.headers["content-encoding"] == ["identity"]
assert r.content == "falafel"
r = tutils.tresp()
r.headers["content-encoding"] = ["identity"]
r.content = "falafel"
r.encode("gzip")
assert r.headers["content-encoding"] == ["gzip"]
assert r.content != "falafel"
r.decode()
assert not r.headers["content-encoding"]
assert r.content == "falafel"
class uError(libpry.AutoTree):
def test_getset_state(self):
e = flow.Error(None, "Error")
state = e._get_state()
assert flow.Error._from_state(state) == e
assert e.copy()
e2 = flow.Error(None, "bar")
assert not e == e2
e._load_state(e2._get_state())
assert e == e2
e3 = e.copy()
assert e3 == e
def test_replace(self):
e = flow.Error(None, "amoop")
e.replace("moo", "bar")
assert e.msg == "abarp"
class uClientConnect(libpry.AutoTree):
def test_state(self):
c = flow.ClientConnect(("a", 22))
assert flow.ClientConnect._from_state(c._get_state()) == c
c2 = flow.ClientConnect(("a", 25))
assert not c == c2
c._load_state(c2._get_state())
assert c == c2
c3 = c.copy()
assert c3 == c
class uHeaders(libpry.AutoTree):
def setUp(self):
self.hd = flow.Headers()
def test_read_simple(self):
data = """
Header: one
Header2: two
\r\n
"""
data = textwrap.dedent(data)
data = data.strip()
s = StringIO(data)
self.hd.read(s)
assert self.hd["header"] == ["one"]
assert self.hd["header2"] == ["two"]
def test_read_multi(self):
data = """
Header: one
Header: two
\r\n
"""
data = textwrap.dedent(data)
data = data.strip()
s = StringIO(data)
self.hd.read(s)
assert self.hd["header"] == ["one", "two"]
def test_read_continued(self):
data = """
Header: one
\ttwo
Header2: three
\r\n
"""
data = textwrap.dedent(data)
data = data.strip()
s = StringIO(data)
self.hd.read(s)
assert self.hd["header"] == ['one\r\n two']
def test_dictToHeader1(self):
self.hd.add("one", "uno")
self.hd.add("two", "due")
self.hd.add("two", "tre")
expected = [
"one: uno\r\n",
"two: due\r\n",
"two: tre\r\n",
"\r\n"
]
out = repr(self.hd)
for i in expected:
assert out.find(i) >= 0
def test_dictToHeader2(self):
self.hd["one"] = ["uno"]
expected1 = "one: uno\r\n"
expected2 = "\r\n"
out = repr(self.hd)
assert out.find(expected1) >= 0
assert out.find(expected2) >= 0
def test_match_re(self):
h = flow.Headers()
h.add("one", "uno")
h.add("two", "due")
h.add("two", "tre")
assert h.match_re("uno")
assert h.match_re("two: due")
assert not h.match_re("nonono")
def test_getset_state(self):
self.hd.add("foo", 1)
self.hd.add("foo", 2)
self.hd.add("bar", 3)
state = self.hd._get_state()
nd = flow.Headers._from_state(state)
assert nd == self.hd
def test_copy(self):
self.hd.add("foo", 1)
self.hd.add("foo", 2)
self.hd.add("bar", 3)
assert self.hd == self.hd.copy()
def test_del(self):
self.hd.add("foo", 1)
self.hd.add("Foo", 2)
self.hd.add("bar", 3)
del self.hd["foo"]
assert len(self.hd.lst) == 1
def test_replace(self):
self.hd.add("one", "two")
self.hd.add("two", "one")
assert self.hd.replace("one", "vun") == 2
assert self.hd.lst == [
["vun", "two"],
["two", "vun"],
]
tests = [
uStickyCookieState(),
@@ -530,4 +883,9 @@ tests = [
uState(),
uSerialize(),
uFlowMaster(),
uRequest(),
uResponse(),
uError(),
uClientConnect(),
uHeaders(),
]
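The `Expires`-shifting behaviour that `test_refresh` and `test_refresh_cookie` above pin down can be sketched with nothing but the stdlib. This is a hypothetical stand-in for `Response._refresh_cookie`, not libmproxy's actual code; it shifts a parseable `Expires=` attribute by `delta` seconds and leaves unparseable dates untouched:

```python
import email.utils
import re

def refresh_cookie(c, delta):
    # Shift an Expires= attribute in a Set-Cookie string forward by
    # `delta` seconds. If the date won't parse, keep it unchanged.
    # (Hypothetical sketch, not Response._refresh_cookie itself.)
    def shift(match):
        d = email.utils.parsedate_tz(match.group(1))
        if d is None:
            return match.group(0)  # unparseable date: leave as-is
        new = email.utils.mktime_tz(d) + delta
        return "expires=" + email.utils.formatdate(new, usegmt=True)
    return re.sub(r"(?i)expires=([^;]+)", shift, c)
```

`email.utils.parsedate_tz` accepts both the RFC 2822 form (`31 Dec 2037`) and the older dashed form (`08-Mar-2011`), which is why the tests can feed it cookies from the wild.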


@@ -9,10 +9,10 @@ class uNetstring(libpry.AutoTree):
self.encoded_data = "9:Netstring,6:module,2:by,4:Will,7:McGugan,"
def test_header(self):
-tests = [ ("netstring", "9:"),
+t = [ ("netstring", "9:"),
("Will McGugan", "12:"),
("", "0:") ]
-for test, result in tests:
+for test, result in t:
assert netstring.header(test) == result
def test_file_encoder(self):
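For reference, the netstring framing these tests exercise is tiny enough to sketch inline. This is a simplified re-implementation for illustration, not the netstring module's own code:

```python
def header(s):
    # A netstring header is the payload length in decimal, then a colon:
    # "netstring" -> "9:".
    return "%d:" % len(s)

def encode(s):
    # A full netstring is header + payload + trailing comma,
    # e.g. "Netstring" -> "9:Netstring,".
    return header(s) + s + ","
```

The length prefix is what lets a reader consume the payload without scanning for a delimiter; the trailing comma is just a sanity check.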


@@ -1,8 +1,6 @@
import cStringIO, time
import libpry
-from libmproxy import proxy, controller, utils, dump, script
-import email.utils
-import tutils
+from libmproxy import proxy, controller, utils, dump
class u_read_chunked(libpry.AutoTree):
@@ -48,30 +46,6 @@ class u_parse_request_line(libpry.AutoTree):
assert proxy.parse_request_line(u) == ('GET', None, None, None, '/', 1)
class u_parse_url(libpry.AutoTree):
def test_simple(self):
assert not proxy.parse_url("")
u = "http://foo.com:8888/test"
s, h, po, pa = proxy.parse_url(u)
assert s == "http"
assert h == "foo.com"
assert po == 8888
assert pa == "/test"
s, h, po, pa = proxy.parse_url("http://foo/bar")
assert s == "http"
assert h == "foo"
assert po == 80
assert pa == "/bar"
s, h, po, pa = proxy.parse_url("http://foo")
assert pa == "/"
s, h, po, pa = proxy.parse_url("https://foo")
assert po == 443
class uFileLike(libpry.AutoTree):
def test_wrap(self):
s = cStringIO.StringIO("foobar\nfoobar")
@@ -83,126 +57,6 @@ class uFileLike(libpry.AutoTree):
assert s.isatty
class uRequest(libpry.AutoTree):
def test_simple(self):
h = utils.Headers()
h["test"] = ["test"]
c = proxy.ClientConnect(("addr", 2222))
r = proxy.Request(c, "host", 22, "https", "GET", "/", h, "content")
u = r.url()
assert r.set_url(u)
assert not r.set_url("")
assert r.url() == u
assert r.assemble()
r2 = r.copy()
assert r == r2
def test_anticache(self):
h = utils.Headers()
r = proxy.Request(None, "host", 22, "https", "GET", "/", h, "content")
h["if-modified-since"] = ["test"]
h["if-none-match"] = ["test"]
r.anticache()
assert not "if-modified-since" in r.headers
assert not "if-none-match" in r.headers
def test_getset_state(self):
h = utils.Headers()
h["test"] = ["test"]
c = proxy.ClientConnect(("addr", 2222))
r = proxy.Request(c, "host", 22, "https", "GET", "/", h, "content")
state = r.get_state()
assert proxy.Request.from_state(state) == r
r.client_conn = None
state = r.get_state()
assert proxy.Request.from_state(state) == r
r2 = proxy.Request(c, "testing", 20, "http", "PUT", "/foo", h, "test")
assert not r == r2
r.load_state(r2.get_state())
assert r == r2
r2.client_conn = None
r.load_state(r2.get_state())
assert not r.client_conn
class uResponse(libpry.AutoTree):
def test_simple(self):
h = utils.Headers()
h["test"] = ["test"]
c = proxy.ClientConnect(("addr", 2222))
req = proxy.Request(c, "host", 22, "https", "GET", "/", h, "content")
resp = proxy.Response(req, 200, "msg", h.copy(), "content")
assert resp.assemble()
resp2 = resp.copy()
assert resp2 == resp
def test_refresh(self):
r = tutils.tresp()
n = time.time()
r.headers["date"] = [email.utils.formatdate(n)]
pre = r.headers["date"]
r.refresh(n)
assert pre == r.headers["date"]
r.refresh(n+60)
d = email.utils.parsedate_tz(r.headers["date"][0])
d = email.utils.mktime_tz(d)
# Weird that this is not exact...
assert abs(60-(d-n)) <= 1
r.headers["set-cookie"] = ["MOO=BAR; Expires=Tue, 08-Mar-2011 00:20:38 GMT; Path=foo.com; Secure"]
r.refresh()
def test_refresh_cookie(self):
r = tutils.tresp()
# Invalid expires format, sent to us by Reddit.
c = "rfoo=bar; Domain=reddit.com; expires=Thu, 31 Dec 2037 23:59:59 GMT; Path=/"
assert r._refresh_cookie(c, 60)
c = "MOO=BAR; Expires=Tue, 08-Mar-2011 00:20:38 GMT; Path=foo.com; Secure"
assert "00:21:38" in r._refresh_cookie(c, 60)
def test_getset_state(self):
h = utils.Headers()
h["test"] = ["test"]
c = proxy.ClientConnect(("addr", 2222))
r = proxy.Request(c, "host", 22, "https", "GET", "/", h, "content")
req = proxy.Request(c, "host", 22, "https", "GET", "/", h, "content")
resp = proxy.Response(req, 200, "msg", h.copy(), "content")
state = resp.get_state()
assert proxy.Response.from_state(req, state) == resp
resp2 = proxy.Response(req, 220, "foo", h.copy(), "test")
assert not resp == resp2
resp.load_state(resp2.get_state())
assert resp == resp2
class uError(libpry.AutoTree):
def test_getset_state(self):
e = proxy.Error(None, "Error")
state = e.get_state()
assert proxy.Error.from_state(state) == e
assert e.copy()
e2 = proxy.Error(None, "bar")
assert not e == e2
e.load_state(e2.get_state())
assert e == e2
e3 = e.copy()
assert e3 == e
class uProxyError(libpry.AutoTree):
def test_simple(self):
@@ -210,30 +64,9 @@ class uProxyError(libpry.AutoTree):
assert repr(p)
class uClientConnect(libpry.AutoTree):
def test_state(self):
c = proxy.ClientConnect(("a", 22))
assert proxy.ClientConnect.from_state(c.get_state()) == c
c2 = proxy.ClientConnect(("a", 25))
assert not c == c2
c.load_state(c2.get_state())
assert c == c2
c3 = c.copy()
assert c3 == c
tests = [
uProxyError(),
uRequest(),
uResponse(),
uFileLike(),
u_parse_request_line(),
u_parse_url(),
uError(),
uClientConnect(),
u_read_chunked(),
]

test/test_script.py (new file, 60 lines)

@@ -0,0 +1,60 @@
import os
from libmproxy import script, flow
import libpry
class uScript(libpry.AutoTree):
def test_simple(self):
s = flow.State()
fm = flow.FlowMaster(None, s)
ctx = flow.ScriptContext(fm)
p = script.Script(os.path.join("scripts", "a.py"), ctx)
p.load()
assert "here" in p.ns
assert p.run("here") == (True, 1)
assert p.run("here") == (True, 2)
ret = p.run("errargs")
assert not ret[0]
assert len(ret[1]) == 2
# Check reload
p.load()
assert p.run("here") == (True, 1)
def test_err(self):
s = flow.State()
fm = flow.FlowMaster(None, s)
ctx = flow.ScriptContext(fm)
s = script.Script("nonexistent", ctx)
libpry.raises(
"no such file",
s.load
)
s = script.Script("scripts", ctx)
libpry.raises(
"not a file",
s.load
)
s = script.Script("scripts/syntaxerr.py", ctx)
libpry.raises(
script.ScriptError,
s.load
)
s = script.Script("scripts/loaderr.py", ctx)
libpry.raises(
script.ScriptError,
s.load
)
tests = [
uScript(),
]
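The new script tests above rely on a `(ok, result)` convention: `Script.run` returns `(True, return_value)` on success and `(False, (exception, traceback))` on failure. A minimal sketch of that dispatch, assuming a plain dict namespace (`run_hook` is a hypothetical helper, not mitmproxy's `Script.run`):

```python
import traceback

def run_hook(ns, name, *args):
    # Look the hook up in the script's namespace, call it, and report
    # success or failure as a 2-tuple, mirroring the convention the
    # tests above assert on. (Hypothetical sketch.)
    f = ns.get(name)
    if not callable(f):
        return False, (KeyError(name), None)
    try:
        return True, f(*args)
    except Exception as e:
        return False, (e, traceback.format_exc())
```

A hook that keeps state between calls, like the `here` counter in `scripts/a.py`, then returns `(True, 1)`, `(True, 2)`, and so on, until a reload resets the namespace.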


@@ -44,7 +44,7 @@ class uProxy(tutils.ProxTest):
l = self.log()
assert l[0].address
-assert l[1].headers.has_key("host")
+assert "host" in l[1].headers
assert l[2].code == 200
def test_https(self):
@@ -55,7 +55,7 @@ class uProxy(tutils.ProxTest):
l = self.log()
assert l[0].address
-assert l[1].headers.has_key("host")
+assert "host" in l[1].headers
assert l[2].code == 200
# Disable these two for now: they take a long time.


@@ -1,4 +1,4 @@
-import textwrap, cStringIO, os, time, re
+import textwrap, cStringIO, os, time, re, json
import libpry
from libmproxy import utils
@@ -29,201 +29,24 @@ class uhexdump(libpry.AutoTree):
def test_simple(self):
assert utils.hexdump("one\0"*10)
class udel_all(libpry.AutoTree):
def test_simple(self):
d = dict(a=1, b=2, c=3)
utils.del_all(d, ["a", "x", "b"])
assert d.keys() == ["c"]
class upretty_size(libpry.AutoTree):
def test_simple(self):
assert utils.pretty_size(100) == "100B"
assert utils.pretty_size(1024) == "1kB"
assert utils.pretty_size(1024 + (1024/2)) == "1.5kB"
assert utils.pretty_size(1024 + (1024/2.0)) == "1.5kB"
assert utils.pretty_size(1024*1024) == "1M"
class uData(libpry.AutoTree):
def test_nonexistent(self):
libpry.raises("does not exist", utils.data.path, "nonexistent")
class uMultiDict(libpry.AutoTree):
def setUp(self):
self.md = utils.MultiDict()
def test_setget(self):
assert not self.md.has_key("foo")
self.md.append("foo", 1)
assert self.md["foo"] == [1]
assert self.md.has_key("foo")
def test_del(self):
self.md.append("foo", 1)
del self.md["foo"]
assert not self.md.has_key("foo")
def test_extend(self):
self.md.append("foo", 1)
self.md.extend("foo", [2, 3])
assert self.md["foo"] == [1, 2, 3]
def test_extend_err(self):
self.md.append("foo", 1)
libpry.raises("not iterable", self.md.extend, "foo", 2)
def test_get(self):
self.md.append("foo", 1)
self.md.append("foo", 2)
assert self.md.get("foo") == [1, 2]
assert self.md.get("bar") == None
def test_caseSensitivity(self):
self.md._helper = (utils._caseless,)
self.md["foo"] = [1]
self.md.append("FOO", 2)
assert self.md["foo"] == [1, 2]
assert self.md["FOO"] == [1, 2]
assert self.md.has_key("FoO")
def test_dict(self):
self.md.append("foo", 1)
self.md.append("foo", 2)
self.md["bar"] = [3]
assert self.md == self.md
assert dict(self.md) == self.md
def test_copy(self):
self.md["foo"] = [1, 2]
self.md["bar"] = [3, 4]
md2 = self.md.copy()
assert md2 == self.md
assert id(md2) != id(self.md)
def test_clear(self):
self.md["foo"] = [1, 2]
self.md["bar"] = [3, 4]
self.md.clear()
assert not self.md.keys()
def test_setitem(self):
libpry.raises(ValueError, self.md.__setitem__, "foo", "bar")
self.md["foo"] = ["bar"]
assert self.md["foo"] == ["bar"]
def test_itemPairs(self):
self.md.append("foo", 1)
self.md.append("foo", 2)
self.md.append("bar", 3)
l = list(self.md.itemPairs())
assert len(l) == 3
assert ("foo", 1) in l
assert ("foo", 2) in l
assert ("bar", 3) in l
def test_getset_state(self):
self.md.append("foo", 1)
self.md.append("foo", 2)
self.md.append("bar", 3)
state = self.md.get_state()
nd = utils.MultiDict.from_state(state)
assert nd == self.md
class uHeaders(libpry.AutoTree):
def setUp(self):
self.hd = utils.Headers()
def test_read_simple(self):
data = """
Header: one
Header2: two
\r\n
"""
data = textwrap.dedent(data)
data = data.strip()
s = cStringIO.StringIO(data)
self.hd.read(s)
assert self.hd["header"] == ["one"]
assert self.hd["header2"] == ["two"]
def test_read_multi(self):
data = """
Header: one
Header: two
\r\n
"""
data = textwrap.dedent(data)
data = data.strip()
s = cStringIO.StringIO(data)
self.hd.read(s)
assert self.hd["header"] == ["one", "two"]
def test_read_continued(self):
data = """
Header: one
\ttwo
Header2: three
\r\n
"""
data = textwrap.dedent(data)
data = data.strip()
s = cStringIO.StringIO(data)
self.hd.read(s)
assert self.hd["header"] == ['one\r\n two']
def test_dictToHeader1(self):
self.hd.append("one", "uno")
self.hd.append("two", "due")
self.hd.append("two", "tre")
expected = [
"one: uno\r\n",
"two: due\r\n",
"two: tre\r\n",
"\r\n"
]
out = repr(self.hd)
for i in expected:
assert out.find(i) >= 0
def test_dictToHeader2(self):
self.hd["one"] = ["uno"]
expected1 = "one: uno\r\n"
expected2 = "\r\n"
out = repr(self.hd)
assert out.find(expected1) >= 0
assert out.find(expected2) >= 0
def test_match_re(self):
h = utils.Headers()
h.append("one", "uno")
h.append("two", "due")
h.append("two", "tre")
assert h.match_re("uno")
assert h.match_re("two: due")
assert not h.match_re("nonono")
def test_getset_state(self):
self.hd.append("foo", 1)
self.hd.append("foo", 2)
self.hd.append("bar", 3)
state = self.hd.get_state()
nd = utils.Headers.from_state(state)
assert nd == self.hd
class uisStringLike(libpry.AutoTree):
def test_all(self):
assert utils.isStringLike("foo")
assert not utils.isStringLike([1, 2, 3])
assert not utils.isStringLike((1, 2, 3))
assert not utils.isStringLike(["1", "2", "3"])
class uisSequenceLike(libpry.AutoTree):
def test_all(self):
assert utils.isSequenceLike([1, 2, 3])
assert utils.isSequenceLike((1, 2, 3))
assert not utils.isSequenceLike("foobar")
assert utils.isSequenceLike(["foobar", "foo"])
x = iter([1, 2, 3])
assert utils.isSequenceLike(x)
assert not utils.isSequenceLike(1)
libpry.raises("does not exist", utils.pkg_data.path, "nonexistent")
@@ -283,6 +106,19 @@ class upretty_xmlish(libpry.AutoTree):
assert utils.pretty_xmlish(s) == ["gobbledygook"]
class upretty_json(libpry.AutoTree):
def test_one(self):
s = json.dumps({"foo": 1})
assert utils.pretty_json(s)
assert not utils.pretty_json("moo")
class u_urldecode(libpry.AutoTree):
def test_one(self):
s = "one=two&three=four"
assert len(utils.urldecode(s)) == 2
class udummy_ca(libpry.AutoTree):
def test_all(self):
d = self.tmpdir()
@@ -356,19 +192,43 @@ class uLRUCache(libpry.AutoTree):
assert len(f._cachelist_one) == 2
class u_parse_url(libpry.AutoTree):
def test_simple(self):
assert not utils.parse_url("")
u = "http://foo.com:8888/test"
s, h, po, pa = utils.parse_url(u)
assert s == "http"
assert h == "foo.com"
assert po == 8888
assert pa == "/test"
s, h, po, pa = utils.parse_url("http://foo/bar")
assert s == "http"
assert h == "foo"
assert po == 80
assert pa == "/bar"
s, h, po, pa = utils.parse_url("http://foo")
assert pa == "/"
s, h, po, pa = utils.parse_url("https://foo")
assert po == 443
tests = [
uformat_timestamp(),
uisBin(),
uisXML(),
uhexdump(),
upretty_size(),
uisStringLike(),
uisSequenceLike(),
uMultiDict(),
uHeaders(),
uData(),
upretty_xmlish(),
upretty_json(),
u_urldecode(),
udel_all(),
udummy_ca(),
udummy_cert(),
uLRUCache(),
u_parse_url()
]
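The `u_parse_url` tests above fix the contract: scheme/host/port/path tuples, default ports of 80 and 443, a default path of `/`, and a falsy result for garbage. In modern Python that behaviour can be sketched on top of the stdlib parser (a sketch of the tested behaviour, not libmproxy's implementation):

```python
from urllib.parse import urlparse

def parse_url(url):
    # Split a URL into (scheme, host, port, path), defaulting the port
    # from the scheme and the path to "/". Returns None for input that
    # lacks a scheme or host. (Sketch, not libmproxy's parse_url.)
    p = urlparse(url)
    if not p.scheme or not p.hostname:
        return None
    port = p.port or {"http": 80, "https": 443}.get(p.scheme)
    return p.scheme, p.hostname, port, p.path or "/"
```

Delegating to `urlparse` also gets host lowercasing and `user:pass@host` handling for free, which a hand-rolled splitter tends to miss.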


@@ -1,23 +1,23 @@
import os.path, threading, Queue
import libpry
-from libmproxy import proxy, utils, filt, flow, controller
+from libmproxy import proxy, filt, flow, controller
import serv, sslserv
import random
def treq(conn=None):
if not conn:
-conn = proxy.ClientConnect(("address", 22))
-headers = utils.Headers()
+conn = flow.ClientConnect(("address", 22))
+headers = flow.Headers()
headers["header"] = ["qvalue"]
-return proxy.Request(conn, "host", 80, "http", "GET", "/path", headers, "content")
+return flow.Request(conn, "host", 80, "http", "GET", "/path", headers, "content")
def tresp(req=None):
if not req:
req = treq()
-headers = utils.Headers()
+headers = flow.Headers()
headers["header_response"] = ["svalue"]
-return proxy.Response(req, 200, "message", headers, "content_response")
+return flow.Response(req, 200, "message", headers, "content_response")
def tflow():
@@ -35,7 +35,7 @@ def tflow_full():
def tflow_err():
r = treq()
f = flow.Flow(r)
-f.error = proxy.Error(r, "error")
+f.error = flow.Error(r, "error")
return f
@@ -60,13 +60,13 @@ class TestMaster(controller.Master):
def handle(self, m):
self.log.append(m)
-m.ack()
+m._ack()
class ProxyThread(threading.Thread):
def __init__(self, port, testq):
self.tmaster = TestMaster(port, testq)
-controller.exit = False
+controller.should_exit = False
threading.Thread.__init__(self)
def run(self):

todo (17 lines changed)

@@ -1,15 +1,9 @@
This is a loose collection of todo items, in case someone else wants to start
hacking on mitmproxy. Drop me a line (aldo@corte.si) if you want to tackle any
of these and need some pointers.
Features:
- Support for content encodings. At the moment, we sidestep this by
stripping headers that indicate accepted encodings. In theory, this should
force servers to serve us content uncompressed, but that's not always the
case. Besides, we should try to modify requests as little as possible, so
direct support for deflate and gzip is the way to go.
- Add some "workspace" features to mitmproxy:
- Flow comments
- Copying/duplicating flows
@@ -17,17 +11,8 @@ Features:
- Post and URL field parsing and editing.
- We need a built-in graphical editor for these, that knows how to
break fields up and present them for modification individually.
- No-proxy mode, for when we want to just replay without serving as a proxy ourselves.
- Upstream proxy support.
- Strings view for binary responses.
- Support HTTP Digest authentication through the stickyauth option. We'll
have to save the server nonce, and recalculate the hashes for each request.
- Chunked encoding support for requests (we already support it for responses).
- Better manipulation of flows in mitmproxy: copying flows, adding
comments.
Bugs:
- In some circumstances, long URLs in list view are line-broken oddly.
- When a bug in mitmproxy causes a stack trace, we hang on exit.
- A progress indicator for large files
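The content-encodings item crossed off at the top of this todo diff corresponds to the `encode`/`decode` behaviour exercised by `test_decodeencode` above: `identity` is a no-op, `gzip` round-trips the body. A sketch of that pair using only `zlib` (illustrative, not libmproxy's encoding module):

```python
import zlib

def encode(content, e):
    # Apply a content-encoding. "identity" leaves the body alone; "gzip"
    # wraps it in a gzip container (wbits = 16 + MAX_WBITS selects gzip
    # framing rather than a raw zlib stream).
    if e == "identity":
        return content
    if e == "gzip":
        c = zlib.compressobj(9, zlib.DEFLATED, 16 + zlib.MAX_WBITS)
        return c.compress(content) + c.flush()
    raise ValueError("unknown encoding: %r" % e)

def decode(content, e):
    # Reverse of encode() for the same two encodings.
    if e == "identity":
        return content
    if e == "gzip":
        return zlib.decompress(content, 16 + zlib.MAX_WBITS)
    raise ValueError("unknown encoding: %r" % e)
```

As the todo notes, decoding in the proxy beats stripping `Accept-Encoding` headers: the request reaches the server unmodified, and the proxy still gets plaintext to inspect.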