Compare commits
100 Commits
| SHA1 |
|---|
| b8cb769693 |
| 5e096c8ec9 |
| c298fbfadc |
| 24a8dc408c |
| d7748cea4f |
| 8fa96d1f3f |
| 5553eb6371 |
| 4e2d19714c |
| 618a9c0e2b |
| c1788c37a1 |
| 7d90eb65ed |
| a4f7728fad |
| f1dc3f2ab2 |
| 549512e93e |
| 8385e586c0 |
| 747e1f0992 |
| 5fdf710c81 |
| 4c6a445361 |
| 4e1bbc1156 |
| 786e304bb9 |
| 4da8054e21 |
| 99ac7b8401 |
| 8110a9a3eb |
| d8cadd2ff3 |
| 79a0334a02 |
| ab0e10e60f |
| b9737ed89e |
| c6896d7392 |
| 61fab03b24 |
| f526e5fa12 |
| 4979a22d3e |
| ab1d8fa350 |
| bb03255da0 |
| c02fdb2463 |
| f13e2213ea |
| 42d06a05c0 |
| 7ed1c1c231 |
| c6ee813479 |
| 2df2fe0e4c |
| 15cc09f1b8 |
| 7fef0ecdf5 |
| 35f4a1c424 |
| 585bf9423f |
| 6dc945571d |
| a6df72cfc8 |
| e9ac4bef20 |
| 690f797da2 |
| a2fddb4404 |
| d187965233 |
| a050eeef05 |
| b6725ee802 |
| a979e1ad50 |
| 2240d2a6a5 |
| 74c51df580 |
| 62e51018d0 |
| 0d05068f91 |
| ed74ed24a0 |
| 45ab22f0d9 |
| 1441fade90 |
| 2153835545 |
| 2739cb4861 |
| bc3ba4c993 |
| 50630d62fd |
| 0de97ad9e0 |
| 65e88f49d4 |
| 5690e7c399 |
| 18d0e840b5 |
| 552146d015 |
| ac6987c54a |
| 76175672ad |
| c8ae1e85b3 |
| 08f410cacc |
| d138af7217 |
| d51b8cab0c |
| 8d662e6636 |
| fa6305ee98 |
| fdffb23989 |
| 22d4559a7a |
| 4e13ab1d05 |
| d57a1d6035 |
| 3f2d1381d0 |
| d3aad7a185 |
| 23f7214fc3 |
| e67dbf6123 |
| 041eafba73 |
| 5b5b79f5c4 |
| 93565392cd |
| ed56d67cea |
| e1356dd2b6 |
| 1790246fed |
| 15ad7704d2 |
| 533f61f67a |
| 8b841bc9e3 |
| 0bed5fae27 |
| a03e1af7e7 |
| 688faa9baa |
| 764724748b |
| 2c73e8f816 |
| 2ba8296843 |
| 00942c1431 |
.gitignore (vendored, 2 changed lines)

```diff
@@ -3,7 +3,7 @@ MANIFEST
 /dist
 /tmp
 /doc
-*.py[cd]
+*.py[cdo]
 *.swp
 *.swo
 mitmproxyc
```
CHANGELOG (27 changed lines)

```diff
@@ -1,4 +1,31 @@
+
+5 April 2012: mitmproxy 0.8:
+
+    * Detailed tutorial for Android interception. Some features that land in
+      this release have finally made reliable Android interception possible.
+
+    * Upstream-cert mode, which uses information from the upstream server to
+      generate interception certificates.
+
+    * Replacement patterns that let you easily do global replacements in flows
+      matching filter patterns. Can be specified on the command-line, or edited
+      interactively.
+
+    * Much more sophisticated and usable pretty printing of request bodies.
+      Support for auto-indentation of Javascript, inspection of image EXIF
+      data, and more.
+
+    * Details view for flows, showing connection and SSL cert information (X
+      keyboard shortcut).
+
+    * Server certificates are now stored and serialized in saved traffic for
+      later analysis. This means that the 0.8 serialization format is NOT
+      compatible with 0.7.
+
+    * Many other improvements, including bugfixes, an expanded scripting API,
+      and more sophisticated certificate handling.
+
 20 February 2012: mitmproxy 0.7:

     * New built-in key/value editor. This lets you interactively edit URL query
```
```diff
@@ -1,14 +1,17 @@
-395 Aldo Cortesi
+485 Aldo Cortesi
 18 Henrik Nordstrom
 13 Thomas Roth
 11 Stephen Altamirano
-5 András Veres-Szentkirályi
+6 András Veres-Szentkirályi
 4 Valtteri Virtanen
 2 alts
 2 Michael Frister
 2 Mark E. Haase
 2 Heikki Hannikainen
 1 meeee
 1 capt8bit
 1 Yuangxuan Wang
 1 Ulrich Petri
 1 Rune Halvorsen
 1 Rory McCann
 1 Henrik Nordström
```
README.mkd (12 changed lines)

```diff
@@ -49,13 +49,21 @@ Requirements
 ------------

 * [Python](http://www.python.org) 2.6.x or 2.7.x.
 * [openssl](http://www.openssl.org/) - installed by default on most systems.
 * [PyOpenSSL](http://pypi.python.org/pypi/pyOpenSSL) 0.12 or newer.
+* [pyasn1](http://pypi.python.org/pypi/pyasn1) 0.1.2 or newer.
 * [urwid](http://excess.org/urwid/) version 0.9.8 or newer.
 * [PIL](http://www.pythonware.com/products/pil/) version 1.1 or newer.
 * [lxml](http://lxml.de/) version 2.3 or newer.

+The following auxiliary components may be needed if you plan to hack on
+mitmproxy:
+
+* The test suite uses the [pry](http://github.com/cortesi/pry) unit testing
+  library.
+* Rendering the documentation requires [countershape](http://github.com/cortesi/countershape).
+
-__mitmproxy__ is tested and developed on OSX, Linux and OpenBSD.
+__mitmproxy__ is tested and developed on OSX, Linux and OpenBSD. Windows is not
+supported at the moment.

 You should also make sure that your console environment is set up with the
 following:
```
README.txt (20 changed lines)

```diff
@@ -37,7 +37,7 @@ Community
 ---------

 Come join us in the #mitmproxy channel on the OFTC IRC network
-(irc://irc.oftc.net:6667).
+(irc.oftc.net, port 6667).

 We also have a mailing list, hosted here:

@@ -48,13 +48,21 @@ Requirements
 ------------

 * Python_ 2.6.x or 2.7.x.
 * openssl_ - installed by default on most systems.
 * PyOpenSSL_ 0.12 or newer.
+* pyasn1_ 0.1.2 or newer.
 * urwid_ version 0.9.8 or newer.
 * PIL_ version 1.1 or newer.
 * lxml_ version 2.3 or newer.

+The following auxiliary components may be needed if you plan to hack on
+mitmproxy:
+
+* The test suite uses the pry_ unit testing
+  library.
+* Rendering the documentation requires countershape_.
+
-**mitmproxy** is tested and developed on OSX, Linux and OpenBSD.
+**mitmproxy** is tested and developed on OSX, Linux and OpenBSD. Windows is not
+supported at the moment.

 You should also make sure that your console environment is set up with the
 following:

@@ -67,8 +75,10 @@ following:
 .. _mitmproxy.org: http://mitmproxy.org
 .. _github.com/cortesi/mitmproxy: http://github.com/cortesi/mitmproxy
 .. _python: http://www.python.org
 .. _openssl: http://www.openssl.org/
 .. _PyOpenSSL: http://pypi.python.org/pypi/pyOpenSSL
+.. _pyasn1: http://pypi.python.org/pypi/pyasn1
 .. _PIL: http://www.pythonware.com/products/pil/
 .. _lxml: http://lxml.de/
 .. _urwid: http://excess.org/urwid/
+.. _pry: http://github.com/cortesi/pry
 .. _countershape: http://github.com/cortesi/countershape
```
```diff
@@ -14,7 +14,7 @@
 </div>
 <!--(end)-->
 $!nav if this.title!="docs" else ""!$
-<h1><a href="@!urlTo("/index.html")!@">mitmproxy 0.7 docs</a></h1>
+<h1><a href="@!urlTo("/index.html")!@">mitmproxy 0.8 docs</a></h1>
 </div>
 <div id="bd">
 <div id="yui-main">
@@ -29,7 +29,7 @@
 </div>
 <!--(end)-->
 $!nav if this.title!="docs" else ""!$
-$!title if this.title!="docs" else "<h1>mitmproxy 0.7 docs</h1>"!$
+$!title if this.title!="docs" else "<h1>mitmproxy 0.8 docs</h1>"!$
 $!body!$
 </div>
 </div>
```
BIN doc-src/certinstall/android-proxydroidinstall.png (new file; after: 122 KiB)
BIN doc-src/certinstall/android-proxydroidsettings.png (new file; after: 53 KiB)
BIN doc-src/certinstall/android-settingssecurityinstallca.png (new file; after: 56 KiB)
BIN doc-src/certinstall/android-settingssecuritymenu.png (new file; after: 74 KiB)
BIN doc-src/certinstall/android-settingssecurityuserinstalledca.png (new file; after: 46 KiB)
BIN doc-src/certinstall/android-shellwgetmitmproxyca.png (new file; after: 22 KiB)
doc-src/certinstall/android.html (new file, 103 lines)

The proxy situation on Android is [an
embarrassment](http://code.google.com/p/android/issues/detail?id=1273). It's
scarcely credible, but Android didn't have a global proxy setting at all until
quite recently, and it's still not supported on many common Android versions.
In the meantime the app ecosystem has grown used to life without this basic
necessity, and many apps merrily ignore it even if it's there. The upshot is
that in many cases the only way to make interception work on Android is to do
it without relying on the proxy settings.

We used an Asus Transformer Prime TF201 with Android 4.0.3 in the examples
below - your device may differ, but the broad process should be similar.


Installing the mitmproxy certificate
====================================

The first step is to install mitmproxy's interception certificate on the
Android device. In your ~/.mitmproxy directory, there should be a file called
__mitmproxy-ca-cert.cer__ - we need to transfer this file to
__/sdcard/Downloads__ on the Android device. If this file doesn't exist for
you, your certs were generated with an older version of mitmproxy - just copy
the __mitmproxy-ca-cert.pem__ file to __mitmproxy-ca-cert.cer__ and proceed
from there.
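The pem-to-cer fallback described above is a one-line copy; a small helper along these lines does it programmatically (a hypothetical sketch, not part of mitmproxy; the filenames mirror the ones named in the text):

```python
import os
import shutil

def ensure_cer(confdir):
    """Return the path to mitmproxy-ca-cert.cer inside confdir, creating it
    from mitmproxy-ca-cert.pem if only the older .pem file is present.
    The .cer file has identical PEM contents; only the extension differs,
    which is what Android's certificate installer looks for."""
    cer = os.path.join(confdir, "mitmproxy-ca-cert.cer")
    pem = os.path.join(confdir, "mitmproxy-ca-cert.pem")
    if not os.path.exists(cer) and os.path.exists(pem):
        shutil.copyfile(pem, cer)
    return cer
```

Running `ensure_cer(os.path.expanduser("~/.mitmproxy"))` before transferring the file saves the manual copy step.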
In this case, we're using wget from Better Terminal Emulator Pro to transfer
the certificate from a local HTTP server. Other terminal applications may work,
and you might also transfer the file via external media like an SD card:

<img src="android-shellwgetmitmproxyca.png"/>

Once we have the certificate on the local disk, we need to import it into the
list of trusted CAs. Go to Settings -> Security -> Credential Storage,
and select "Install from storage":

<img src="android-settingssecuritymenu.png"/>

The certificate in /sdcard/Downloads is automatically located and offered for
installation. Installing the cert will delete the download file from the local
disk:

<img src="android-settingssecurityinstallca.png"/>

Afterwards, you should see the certificate listed in the Trusted Credentials
store:

<img src="android-settingssecurityuserinstalledca.png"/>

If you're lucky enough to be working with an app that obeys the wireless proxy
settings, you're just about done - simply configure the settings to point at
mitmproxy. If not, proceed to the next step...


Working around Android's proxy shortcomings
===========================================

In response to Android's proxy situation, a number of apps have been created to
duct-tape proxy support onto the OS. These tools work by running a rudimentary
local proxy on the device, and forwarding all traffic destined for HTTP/S ports
to it using iptables. Since the proxy is running locally, it can detect what
the final IP address of the redirected traffic would have been. The local proxy
then connects to a user-configured upstream proxy, and initiates a proxy
CONNECT request to the destination IP.
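The iptables redirection these apps perform can be sketched roughly as follows. This is an illustrative config fragment, not ProxyDroid's actual rules, and the local proxy port (8123) is an assumption:

```shell
# Redirect all outbound HTTP/S traffic to a local transparent proxy
# listening on port 8123 (illustrative; the apps manage their own chains).
iptables -t nat -A OUTPUT -p tcp --dport 80 -j REDIRECT --to-ports 8123
iptables -t nat -A OUTPUT -p tcp --dport 443 -j REDIRECT --to-ports 8123
```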
Now, if the configured upstream proxy is mitmproxy, we have a slight problem.
Proxy requests from the Android device in this scheme will specify only the
destination IP address, __not__ the destination domain. Mitmproxy needs the
target domain to generate a valid interception certificate. The solution is
mitmproxy's [upstream certificate](@!urlTo("upstreamcerts.html")!@) option.
When this is active, mitmproxy makes a connection to the upstream server to
obtain the certificate Common Name and Subject Alternative Names.

Adding all this together, we can achieve reliable Android interception with
only a few more minutes of setup. The instructions below show how to set up an
Android device with
[ProxyDroid](https://play.google.com/store/apps/details?id=org.proxydroid) (the
local "duct-tape" proxy implementation) to achieve interception.

Install ProxyDroid
------------------

First, root your device - this is required to install ProxyDroid. Then install
ProxyDroid from the Google Play store:

<img src="android-proxydroidinstall.png"/>

You will be prompted for super-user access, which you must allow. Next, enter
the ProxyDroid settings, and change the proxy settings to point to your
mitmproxy instance. When you're done, it should look something like this:

<img src="android-proxydroidsettings.png"/>

In this case, our mitmproxy instance is at the host __maru.otago.ac.nz__,
running on port __8080__.

When you start mitmproxy, make sure that the upstream certificate option is set
(use the _--upstream-cert_ command-line option, or enable it interactively
using the _o_ shortcut):

<pre class="terminal">
mitmproxy --upstream-cert
</pre>
```diff
@@ -5,4 +5,5 @@ pages = [
     Page("osx.html", "OSX"),
     Page("windows7.html", "Windows 7"),
     Page("ios.html", "IOS"),
+    Page("android.html", "Android"),
 ]
```
```diff
@@ -1,6 +1,7 @@

 <ul class="docindex">
     <li><a href="@!urlTo("intro.html")!@">Introduction</a></li>
+    <li><a href="@!urlTo("install.html")!@">Installation</a></li>
     <li><a href="@!urlTo("mitmproxy.html")!@">mitmproxy</a></li>
     <li><a href="@!urlTo("mitmdump.html")!@">mitmdump</a></li>
     <li>Features</li>
@@ -9,16 +10,19 @@
         <li><a href="@!urlTo("serverreplay.html")!@">Server-side replay</a></li>
         <li><a href="@!urlTo("sticky.html")!@">Sticky cookies and auth</a></li>
         <li><a href="@!urlTo("reverseproxy.html")!@">Reverse proxy mode</a></li>
+        <li><a href="@!urlTo("upstreamcerts.html")!@">Upstream Certs</a></li>
+        <li><a href="@!urlTo("replacements.html")!@">Replacements</a></li>
         <li><a href="@!urlTo("anticache.html")!@">Anticache</a></li>
         <li><a href="@!urlTo("filters.html")!@">Filter expressions</a></li>
     </ul>
     <li><a href="@!urlTo("scripts.html")!@">Scripts</a></li>
-    <li><a href="@!urlTo("ssl.html")!@">SSL interception</a></li>
+    <li><a href="@!urlTo("ssl.html")!@">Setting up SSL interception</a></li>
     <ul>
         <li><a href="@!urlTo("certinstall/firefox.html")!@">Firefox</a></li>
         <li><a href="@!urlTo("certinstall/osx.html")!@">OSX</a></li>
         <li><a href="@!urlTo("certinstall/windows7.html")!@">Windows 7</a></li>
         <li><a href="@!urlTo("certinstall/ios.html")!@">iPhone/iPad</a></li>
+        <li><a href="@!urlTo("certinstall/android.html")!@">Android</a></li>
     </ul>
     <li><a href="@!urlTo("library.html")!@">libmproxy</a></li>
     <li>Tutorials</li>
```

```diff
@@ -68,16 +68,19 @@ ns.filt_help = filt_help
 pages = [
     Page("index.html", "docs"),
     Page("intro.html", "Introduction"),
+    Page("install.html", "Installation"),
     Page("mitmproxy.html", "mitmproxy"),
     Page("mitmdump.html", "mitmdump"),
     Page("clientreplay.html", "Client-side replay"),
     Page("serverreplay.html", "Server-side replay"),
     Page("sticky.html", "Sticky cookies and auth"),
+    Page("upstreamcerts.html", "Upstream Certs"),
+    Page("replacements.html", "Replacements"),
     Page("reverseproxy.html", "Reverse proxy mode"),
     Page("anticache.html", "Anticache"),
     Page("filters.html", "Filter expressions"),
     Page("scripts.html", "Scripts"),
-    Page("ssl.html", "SSL interception"),
+    Page("ssl.html", "Setting up SSL interception"),
     Directory("certinstall"),
     Page("library.html", "libmproxy: mitmproxy as a library"),
     Directory("tutorials"),
```
doc-src/install.html (new file, 57 lines)

## Using Pip

The preferred way to install mitmproxy is to use
[pip](http://pypi.python.org/pypi/pip). A single command will install the
latest release of mitmproxy, along with all its dependencies:

<pre>
sudo pip install mitmproxy
</pre>


## From Source

- When installing from source, you will need to install the
  [dependencies](@!urlTo("intro.html")!@) by hand.
- Then run the following command from the base of the source distribution:

<pre>
sudo python setup.py install
</pre>


## OSX

- Make sure that XCode is installed from the App Store, and that the
  command-line tools have been downloaded (XCode/Preferences/Downloads).
- Install __pip__ using the following command:

<pre>
sudo easy_install pip
</pre>

- Now use __pip__ to set up the dependencies and do the install:

<pre>
sudo pip install mitmproxy
</pre>

This procedure may vary if, for instance, you've installed Python from an
external source like [homebrew](http://mxcl.github.com/homebrew/). In that
case, the easiest way to proceed is to first install __easy_install__, then
continue as above.

There are a few bits of customization you might want to do to make mitmproxy
comfortable to use. At the moment, mitmproxy's color scheme is optimized for a
dark background terminal, so you probably want to change the default. You can
use the OSX <b>open</b> program to create a simple and effective
<b>~/.mailcap</b> file to view HTTP bodies:

<pre>
application/*; /usr/bin/open -Wn %s
audio/*; /usr/bin/open -Wn %s
image/*; /usr/bin/open -Wn %s
video/*; /usr/bin/open -Wn %s
</pre>
```diff
@@ -1,7 +1,7 @@

 __mitmdump__ is the command-line companion to mitmproxy. It provides
 tcpdump-like functionality to let you view, record, and programmatically
-transform HTTP traffic. See the _--help_ flag ouptut for complete
+transform HTTP traffic. See the _--help_ flag output for complete
 documentation.
```
```diff
@@ -16,7 +16,7 @@ The flow list shows an index of captured flows in chronological order.
   these flows, and then accept them (using the _a_ key) to continue. In this
   case, the request has been intercepted on the way to the server.
 - __5__: A response intercepted from the server on the way to the client.
-- __6__: The event log can be toggled on and off using the _e_ shorcut key. This
+- __6__: The event log can be toggled on and off using the _e_ shortcut key. This
   pane shows events and errors that may not result in a flow that shows up in the
   flow pane.
 - __7__: Flow count.
@@ -45,19 +45,22 @@ which shows you exactly what's there without any changes. You can change modes
 using the _m_ key.


-## Key/Value Editor
+## Grid Editor

-It turns out that ordered key/value data is pervasive in HTTP communications,
-so mitmproxy has a built-in editor to help edit and create this kind of data.
-There are three ways to reach the __K/V Editor__ from the __Flow View__ screen:
+Much of the data that we'd like to interact with in mitmproxy is structured.
+For instance, headers, queries and form data can all be thought of as a list of
+key/value pairs. Mitmproxy has a built-in editor that lays this type of data
+out in a grid for easy manipulation.

-- Editing request or response headers (_e_ for edit, then _h_ for headers)
-- Editing a query string (_e_ for edit, then _q_ for query)
-- Editing a URL-encoded form (_e_ for edit, then _f_ for form)
+At the moment, the Grid Editor is used in four parts of mitmproxy:

-If there is no form or query string, an empty __K/V Editor__ will be started
-to let you add one. Here is the __K/V Editor__ showing the headers from a
-request:
+- Editing request or response headers (_e_ for edit, then _h_ for headers in flow view)
+- Editing a query string (_e_ for edit, then _q_ for query in flow view)
+- Editing a URL-encoded form (_e_ for edit, then _f_ for form in flow view)
+- Editing replacement patterns (_R_ globally)
+
+If there is no data, an empty editor will be started to let you add some.
+Here is the editor showing the headers from a request:

 <img src="@!urlTo("screenshots/mitmproxy-kveditor.png")!@"/>

@@ -67,10 +70,10 @@ you are in edit mode for the specified field:

 <img src="@!urlTo("screenshots/mitmproxy-kveditor-editmode.png")!@"/>

-Modify the field as desired, and press escape or enter to exit edit mode when
-you're done. You can also add a key/value pair (_a_ key), delete a pair (_d_
-key), spawn an external editor on a field (_e_ key). Be sure to consult the
-context-sensitive help (_?_ key) for more.
+Modify the field as desired, then press escape to exit edit mode when you're
+done. You can also add a row (_a_ key), delete a row (_d_ key), spawn an
+external editor on a field (_e_ key). Be sure to consult the context-sensitive
+help (_?_ key) for more.


 # Example: Interception
```
doc-src/replacements.html (new file, 61 lines)

- command-line: _--replace_, _--replace-from-file_
- mitmproxy shortcut: _R_

Mitmproxy lets you specify an arbitrary number of patterns that define text
replacements within flows. Each pattern has 3 components: a filter that defines
which flows a replacement applies to, a regular expression that defines what
gets replaced, and a target value that defines what is substituted in.

Replace hooks fire when either a client request or a server response is
received. Only the matching flow component is affected: so, for example, if a
replace hook is triggered on server response, the replacement is only run on
the Response object, leaving the Request intact. You control whether the hook
triggers on the request, response or both using the filter pattern. If you need
finer-grained control than this, it's simple to create a script using the
replacement API on Flow components.

Replacement hooks are extremely handy in interactive testing of applications.
For instance, you can use a replace hook to replace the text "XSS" with a
complicated XSS exploit, and then "inject" the exploit simply by interacting
with the application through the browser. When used with tools like Firebug and
mitmproxy's own interception abilities, replacement hooks can be an amazingly
flexible and powerful feature.


## On the command-line

The replacement hook command-line options use a compact syntax to make it easy
to specify all three components at once. The general form is as follows:

    /patt/regex/replacement

Here, __patt__ is a mitmproxy filter expression, __regex__ is a valid Python
regular expression, and __replacement__ is a string literal. The first
character in the expression (__/__ in this case) defines what the separation
character is. Here's an example of a valid expression that replaces "foo" with
"bar" in all requests:

    :~q:foo:bar
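To make the separator convention concrete, here is a simplified parser for these specifiers (a sketch based on the description above; mitmproxy's actual parser also handles escaped separator characters, which this ignores):

```python
import re

def parse_replacement(spec):
    """Split a replacement specifier into (filter pattern, regex, replacement).
    The first character of the specifier defines the separator."""
    if len(spec) < 2:
        raise ValueError("empty replacement specifier")
    sep = spec[0]
    parts = spec[1:].split(sep)
    if len(parts) != 3:
        raise ValueError("specifier needs exactly three components")
    patt, regex, replacement = parts
    re.compile(regex)  # fail early on an invalid regular expression
    return patt, regex, replacement
```

For the example above, `parse_replacement(":~q:foo:bar")` yields the filter `~q`, the regex `foo`, and the replacement `bar`.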
In practice, it's pretty common for the replacement literal to be long and
complex. For instance, it might be an XSS exploit that weighs in at hundreds or
thousands of characters. To cope with this, there's a variation of the
replacement hook specifier that lets you load the replacement text from a file.
So, you might start __mitmdump__ as follows:

<pre class="terminal">
mitmdump --replace-from-file :~q:foo:~/xss-exploit
</pre>

This will load the replacement text from the file __~/xss-exploit__.

Both the _--replace_ and _--replace-from-file_ flags can be passed multiple
times.


## Interactively

The _R_ shortcut key in mitmproxy lets you add and edit replacement hooks using
a built-in editor. The context-sensitive help (_h_) has complete usage
information.
```diff
@@ -1,6 +1,6 @@

-- command-line: _-R_ http[s]://hostname[:port]
-- mitmproxy shortcut: _R_
+- command-line: _-P_ http[s]://hostname[:port]
+- mitmproxy shortcut: _P_

 In reverse proxy mode, mitmproxy acts as a standard HTTP server and forwards
 all requests to the specified upstream server. Note that the displayed URL for
```
```diff
@@ -42,7 +42,7 @@ guaranteed to have non-None __request__ and __response__ attributes.

 ### error(ScriptContext, Flow)

-Called when a flow error has occured, e.g. invalid server responses, or
+Called when a flow error has occurred, e.g. invalid server responses, or
 interrupted connections. This is distinct from a valid server HTTP error
 response, which is simply a response with an HTTP error code. The __Flow__
 object is guaranteed to have non-None __request__ and __error__ attributes.
@@ -87,7 +87,7 @@ The main classes you will deal with in writing mitmproxy scripts are:

         <td>A dictionary-like object for managing sets of key/value data. There
         is also a variant called CaselessODict that ignores key case for some
-        calls.</td>
+        calls (used mainly for headers).</td>
     </tr>
     <tr>
         <th>libmproxy.flow.Response</th>
@@ -99,10 +99,11 @@ The main classes you will deal with in writing mitmproxy scripts are:
     </tr>
     <tr>
         <th>libmproxy.flow.ScriptContext</th>
         <td>A handle for interacting with mitmproxy from within scripts.</td>
     </tr>
     <tr>
         <th>libmproxy.certutils.SSLCert</th>
         <td>Exposes information about SSL certificates.</td>
     </tr>
 </table>
@@ -126,5 +127,3 @@ In this case, there are no client connections, and the events are run in the
 following order: __start__, __request__, __response__, __error__, __done__. If
 the flow doesn't have a __response__ or __error__ associated with it, the
 matching event will be skipped.
```
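Given the hook signatures documented above, a minimal script observing this event order might look like the following sketch. The request/response/error signatures follow the documentation; the start and done signatures, and the counting logic, are illustrative assumptions rather than confirmed API:

```python
# Sketch of a libmproxy script: request/response/error each receive
# (ScriptContext, Flow), per the docs above; start/done are assumed
# here to receive only the context.

stats = {"requests": 0, "responses": 0, "errors": 0}

def start(ctx):
    pass

def request(ctx, flow):
    stats["requests"] += 1

def response(ctx, flow):
    stats["responses"] += 1

def error(ctx, flow):
    stats["errors"] += 1

def done(ctx):
    pass
```

Such a module would be run with something like `mitmdump -s count.py` (hypothetical filename).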
doc-src/upstreamcerts.html (new file, 15 lines)

- command-line: _--upstream-cert_
- mitmproxy shortcut: _o_, then _u_

Normally, mitmproxy uses the target domain specified in a client's proxy
request to generate an interception certificate. When __upstream-cert__ mode is
activated a different procedure is followed: a connection is made to the
specified remote server to retrieve its __Common Name__ and __Subject
Alternative Names__. This feature is especially useful when the client
specifies an IP address rather than a host name in the proxy request. If this
is the case, we can only generate a certificate if we can establish the __CN__
and __SANs__ from the upstream server.

Note that __upstream-cert__ mode does not work when the remote server relies on
[Server Name Indication](http://en.wikipedia.org/wiki/Server_Name_Indication).
Luckily, SNI is still not very widely used.
221
libmproxy/certutils.py
Normal file
@@ -0,0 +1,221 @@
|
||||
import os, ssl, hashlib, socket, time, datetime
|
||||
from pyasn1.type import univ, constraint, char, namedtype, tag
|
||||
from pyasn1.codec.der.decoder import decode
|
||||
import OpenSSL
|
||||
|
||||
CERT_SLEEP_TIME = 1
|
||||
CERT_EXPIRY = str(365 * 3)
|
||||
|
||||
|
||||
def create_ca():
|
||||
key = OpenSSL.crypto.PKey()
|
||||
key.generate_key(OpenSSL.crypto.TYPE_RSA, 1024)
|
||||
ca = OpenSSL.crypto.X509()
|
||||
ca.set_serial_number(int(time.time()*10000))
|
||||
ca.set_version(2)
|
||||
ca.get_subject().CN = "mitmproxy"
|
||||
ca.get_subject().O = "mitmproxy"
|
||||
ca.gmtime_adj_notBefore(0)
|
||||
ca.gmtime_adj_notAfter(24 * 60 * 60 * 720)
|
||||
ca.set_issuer(ca.get_subject())
|
||||
ca.set_pubkey(key)
|
||||
ca.add_extensions([
|
||||
OpenSSL.crypto.X509Extension("basicConstraints", True,
|
||||
"CA:TRUE"),
|
||||
OpenSSL.crypto.X509Extension("nsCertType", True,
|
||||
"sslCA"),
|
||||
OpenSSL.crypto.X509Extension("extendedKeyUsage", True,
|
||||
"serverAuth,clientAuth,emailProtection,timeStamping,msCodeInd,msCodeCom,msCTLSign,msSGC,msEFS,nsSGC"
|
||||
),
|
||||
OpenSSL.crypto.X509Extension("keyUsage", False,
|
||||
"keyCertSign, cRLSign"),
|
||||
OpenSSL.crypto.X509Extension("subjectKeyIdentifier", False, "hash",
|
||||
subject=ca),
|
||||
])
|
||||
ca.sign(key, "sha1")
|
||||
return key, ca
|
||||
|
||||
|
||||
def dummy_ca(path):
|
||||
dirname = os.path.dirname(path)
|
||||
if not os.path.exists(dirname):
|
||||
os.makedirs(dirname)
|
||||
if path.endswith(".pem"):
|
||||
basename, _ = os.path.splitext(path)
|
||||
else:
|
||||
basename = path
|
||||
|
||||
key, ca = create_ca()
|
||||
|
||||
# Dump the CA plus private key
|
||||
f = open(path, "w")
|
||||
    f.write(OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, key))
    f.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, ca))
    f.close()

    # Dump the certificate in PEM format
    f = open(os.path.join(dirname, basename + "-cert.pem"), "w")
    f.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, ca))
    f.close()

    # Create a .cer file with the same contents for Android
    f = open(os.path.join(dirname, basename + "-cert.cer"), "w")
    f.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, ca))
    f.close()

    # Dump the certificate in PKCS12 format for Windows devices
    f = open(os.path.join(dirname, basename + "-cert.p12"), "w")
    p12 = OpenSSL.crypto.PKCS12()
    p12.set_certificate(ca)
    f.write(p12.export())
    f.close()
    return True


def dummy_cert(certdir, ca, commonname, sans):
    """
        certdir: Certificate directory.
        ca: Path to the certificate authority file, or None.
        commonname: Common name for the generated certificate.
        sans: List of subject alternative names for the generated certificate.

        Returns cert path if operation succeeded, None if not.
    """
    namehash = hashlib.sha256(commonname).hexdigest()
    certpath = os.path.join(certdir, namehash + ".pem")
    if os.path.exists(certpath):
        return certpath

    ss = []
    for i in sans:
        ss.append("DNS: %s"%i)
    ss = ", ".join(ss)

    if ca:
        raw = file(ca, "r").read()
        ca = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, raw)
        key = OpenSSL.crypto.load_privatekey(OpenSSL.crypto.FILETYPE_PEM, raw)
    else:
        key, ca = create_ca()

    req = OpenSSL.crypto.X509Req()
    subj = req.get_subject()
    subj.CN = commonname
    req.set_pubkey(ca.get_pubkey())
    req.sign(key, "sha1")
    if ss:
        req.add_extensions([OpenSSL.crypto.X509Extension("subjectAltName", True, ss)])

    cert = OpenSSL.crypto.X509()
    cert.gmtime_adj_notBefore(0)
    cert.gmtime_adj_notAfter(60 * 60 * 24 * 30)
    cert.set_issuer(ca.get_subject())
    cert.set_subject(req.get_subject())
    cert.set_serial_number(int(time.time()*10000))
    if ss:
        cert.add_extensions([OpenSSL.crypto.X509Extension("subjectAltName", True, ss)])
    cert.set_pubkey(req.get_pubkey())
    cert.sign(key, "sha1")

    f = open(certpath, "w")
    f.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, cert))
    f.close()

    return certpath
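The on-disk cache scheme used by dummy_cert above can be sketched in isolation: one PEM file per SHA256 of the common name, so repeated requests for the same host reuse the cached certificate. This is a standalone sketch in Python 3 spelling; `cert_cache_path` is a hypothetical name, not part of the module.

```python
import hashlib
import os.path

def cert_cache_path(certdir, commonname):
    # Mirror of dummy_cert's cache key: SHA256(common name) + ".pem"
    # under the certificate directory.
    namehash = hashlib.sha256(commonname).hexdigest()
    return os.path.join(certdir, namehash + ".pem")

p = cert_cache_path("/tmp/certs", b"example.com")
```

Because the filename depends only on the common name, a simple `os.path.exists` check is enough to decide whether generation can be skipped.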
class _GeneralName(univ.Choice):
    # We are only interested in dNSNames. We use a default handler to ignore
    # other types.
    componentType = namedtype.NamedTypes(
        namedtype.NamedType('dNSName', char.IA5String().subtype(
                implicitTag=tag.Tag(tag.tagClassContext, tag.tagFormatSimple, 2)
            )
        ),
    )


class _GeneralNames(univ.SequenceOf):
    componentType = _GeneralName()
    sizeSpec = univ.SequenceOf.sizeSpec + constraint.ValueSizeConstraint(1, 1024)


class SSLCert:
    def __init__(self, pemtxt):
        """
            pemtxt: PEM-formatted certificate text.
        """
        self.cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, pemtxt)

    @classmethod
    def from_der(klass, der):
        pem = ssl.DER_cert_to_PEM_cert(der)
        return klass(pem)

    def digest(self, name):
        return self.cert.digest(name)

    @property
    def issuer(self):
        return self.cert.get_issuer().get_components()

    @property
    def notbefore(self):
        t = self.cert.get_notBefore()
        return datetime.datetime.strptime(t, "%Y%m%d%H%M%SZ")

    @property
    def notafter(self):
        t = self.cert.get_notAfter()
        return datetime.datetime.strptime(t, "%Y%m%d%H%M%SZ")

    @property
    def has_expired(self):
        return self.cert.has_expired()

    @property
    def subject(self):
        return self.cert.get_subject().get_components()

    @property
    def serial(self):
        return self.cert.get_serial_number()

    @property
    def keyinfo(self):
        pk = self.cert.get_pubkey()
        types = {
            OpenSSL.crypto.TYPE_RSA: "RSA",
            OpenSSL.crypto.TYPE_DSA: "DSA",
        }
        return (
            types.get(pk.type(), "UNKNOWN"),
            pk.bits()
        )

    @property
    def cn(self):
        cn = None
        for i in self.subject:
            if i[0] == "CN":
                cn = i[1]
        return cn

    @property
    def altnames(self):
        altnames = []
        for i in range(self.cert.get_extension_count()):
            ext = self.cert.get_extension(i)
            if ext.get_short_name() == "subjectAltName":
                dec = decode(ext.get_data(), asn1Spec=_GeneralNames())
                for i in dec[0]:
                    altnames.append(i[0].asOctets())
        return altnames


# begin nocover
def get_remote_cert(host, port):
    addr = socket.gethostbyname(host)
    s = ssl.get_server_certificate((addr, port))
    return SSLCert(s)
# end nocover
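The `cn` property above scans the subject's (key, value) pairs and keeps the last CN it sees. That lookup can be sketched without pyOpenSSL; `get_cn` is a hypothetical standalone helper operating on the list shape returned by `X509Name.get_components()`.

```python
def get_cn(components):
    # Keep the last CN value among the (key, value) pairs, as
    # SSLCert.cn does over self.subject.
    cn = None
    for k, v in components:
        if k == "CN":
            cn = v
    return cn

get_cn([("O", "Acme"), ("CN", "example.com")])   # "example.com"
```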
@@ -14,7 +14,61 @@
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

import proxy
import optparse
import optparse, re, filt


class ParseReplaceException(Exception): pass
class OptionException(Exception): pass


def parse_replace_hook(s):
    """
        Returns a (pattern, regex, replacement) tuple.

        The general form for a replacement hook is as follows:

            /patt/regex/replacement

        The first character specifies the separator. Example:

            :~q:foo:bar

        If only two clauses are specified, the pattern is set to match
        universally (i.e. ".*"). Example:

            /foo/bar/

        Clauses are parsed from left to right. Extra separators are taken to be
        part of the final clause. For instance, the replacement clause below is
        "foo/bar/":

            /one/two/foo/bar/

        Checks that pattern and regex are both well-formed. Raises
        ParseReplaceException on error.
    """
    sep, rem = s[0], s[1:]
    parts = rem.split(sep, 2)
    if len(parts) == 2:
        patt = ".*"
        regex, replacement = parts
    elif len(parts) == 3:
        patt, regex, replacement = parts
    else:
        raise ParseReplaceException("Malformed replacement specifier - too few clauses: %s"%s)

    if not regex:
        raise ParseReplaceException("Empty replacement regex: %s"%str(patt))

    try:
        re.compile(regex)
    except re.error, e:
        raise ParseReplaceException("Malformed replacement regex: %s"%str(e.message))

    if not filt.parse(patt):
        raise ParseReplaceException("Malformed replacement filter pattern: %s"%patt)

    return patt, regex, replacement
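The clause-splitting rule documented in parse_replace_hook's docstring can be sketched on its own: the first character is the separator, and at most two splits are made, so any extra separators stay attached to the final clause. `split_replace_spec` is a hypothetical standalone name; validation against `re` and `filt` is omitted here.

```python
def split_replace_spec(s):
    # First character selects the separator; split at most twice so
    # extra separators remain part of the replacement clause.
    sep, rem = s[0], s[1:]
    parts = rem.split(sep, 2)
    if len(parts) == 2:
        # Two clauses: the filter pattern defaults to match-all.
        return ".*", parts[0], parts[1]
    return tuple(parts)

split_replace_spec("/one/two/foo/bar/")   # ("one", "two", "foo/bar/")
split_replace_spec(":foo:bar")            # (".*", "foo", "bar")
```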
def get_common_options(options):
@@ -29,6 +83,24 @@ def get_common_options(options):
    elif options.stickyauth_filt:
        stickyauth = options.stickyauth_filt

    reps = []
    for i in options.replace:
        try:
            p = parse_replace_hook(i)
        except ParseReplaceException, e:
            raise OptionException(e.message)
        reps.append(p)
    for i in options.replace_file:
        try:
            patt, rex, path = parse_replace_hook(i)
        except ParseReplaceException, e:
            raise OptionException(e.message)
        try:
            v = open(path, "r").read()
        except IOError, e:
            raise OptionException("Could not read replace file: %s"%path)
        reps.append((patt, rex, v))

    return dict(
        anticache = options.anticache,
        anticomp = options.anticomp,
@@ -39,12 +111,14 @@ def get_common_options(options):
        refresh_server_playback = not options.norefresh,
        rheaders = options.rheaders,
        rfile = options.rfile,
        replacements = reps,
        server_replay = options.server_replay,
        script = options.script,
        stickycookie = stickycookie,
        stickyauth = stickyauth,
        wfile = options.wfile,
        verbosity = options.verbose,
        nopop = options.nopop,
    )


@@ -80,7 +154,7 @@ def common_options(parser):
        help = "Proxy service port."
    )
    parser.add_option(
        "-R",
        "-P",
        action="store", dest="reverse_proxy", default=None,
        help="Reverse proxy to upstream server: http[s]://host[:port]"
    )
@@ -141,6 +215,17 @@ def common_options(parser):
        help="Byte size limit of HTTP request and response bodies."\
             " Understands k/m/g suffixes, i.e. 3m for 3 megabytes."
    )
    parser.add_option(
        "--cert-wait-time", type="float",
        action="store", dest="cert_wait_time", default=0,
        help="Wait for specified number of seconds after a new cert is generated. This can smooth over small discrepancies between the client and server times."
    )
    parser.add_option(
        "--upstream-cert", default=False,
        action="store_true", dest="upstream_cert",
        help="Connect to upstream server to look up certificate details."
    )

    group = optparse.OptionGroup(parser, "Client Replay")
    group.add_option(
        "-c",
@@ -149,12 +234,6 @@ def common_options(parser):
    )
    parser.add_option_group(group)

    parser.add_option(
        "--cert-wait-time", type="float",
        action="store", dest="cert_wait_time", default=0,
        help="Wait for specified number of seconds after a new cert is generated. This can smooth over small discrepancies between the client and server times."
    )

    group = optparse.OptionGroup(parser, "Server Replay")
    group.add_option(
        "-S",
@@ -178,6 +257,34 @@ def common_options(parser):
        help= "Disable response refresh, "
        "which updates times in cookies and headers for replayed responses."
    )
    group.add_option(
        "--no-pop",
        action="store_true", dest="nopop", default=False,
        help="Disable response pop from response flow. "
        "This makes it possible to replay same response multiple times."
    )

    group = optparse.OptionGroup(
        parser,
        "Replacements",
        """
            Replacements are of the form "/pattern/regex/replacement", where
            the separator can be any character. Please see the documentation
            for more information.
        """.strip()
    )
    group.add_option(
        "--replace",
        action="append", type="str", dest="replace", default=[],
        metavar="PATTERN",
        help="Replacement pattern."
    )
    group.add_option(
        "--replace-from-file",
        action="append", type="str", dest="replace_file", default=[],
        metavar="PATTERN",
        help="Replacement pattern, where the replacement clause is a path to a file."
    )
    parser.add_option_group(group)

    proxy.certificate_option_group(parser)
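The `--replace` option above uses optparse's `action="append"`, which is what lets the flag be given multiple times, each occurrence accumulating into one list that get_common_options later parses. A minimal sketch of that behaviour (Python 3 spelling, `optparse` from the standard library):

```python
import optparse

parser = optparse.OptionParser()
# Each occurrence of --replace appends its argument to opts.replace.
parser.add_option("--replace", action="append", dest="replace", default=[])
opts, args = parser.parse_args(["--replace", ":~q:foo:bar", "--replace", "/a/b/"])
```

After parsing, `opts.replace` holds both specifiers in command-line order.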
@@ -14,10 +14,10 @@
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

import mailcap, mimetypes, tempfile, os, subprocess, glob, time, shlex
import os.path, sys
import os.path, sys, weakref
import urwid
from .. import controller, utils, flow, version
import flowlist, flowview, help, common, kveditor, palettes
from .. import controller, utils, flow
import flowlist, flowview, help, common, grideditor, palettes, contentview, flowdetailview

EVENTLOG_SIZE = 500

@@ -25,7 +25,6 @@ EVENTLOG_SIZE = 500
class Stop(Exception): pass


#begin nocover


class _PathCompleter:
@@ -75,6 +74,7 @@ class _PathCompleter:
        self.final = ret[1]
        return ret[0]

#begin nocover

class PathEdit(urwid.Edit, _PathCompleter):
    def __init__(self, *args, **kwargs):
@@ -102,6 +102,11 @@ class ActionBar(common.WWrap):
        self.w = PathEdit(prompt, text)

    def prompt(self, prompt, text = ""):
        # A (partial) workaround for this Urwid issue:
        # https://github.com/Nic0/tyrs/issues/115
        # We can remove it once everyone is beyond 1.0.1
        if isinstance(prompt, basestring):
            prompt = unicode(prompt)
        self.w = urwid.Edit(prompt, text or "")

    def message(self, message):
@@ -119,6 +124,10 @@ class StatusBar(common.WWrap):
    def get_status(self):
        r = []

        if self.master.replacehooks.count():
            r.append("[")
            r.append(("heading_key", "R"))
            r.append("eplacing]")
        if self.master.client_playback:
            r.append("[")
            r.append(("heading_key", "cplayback"))
@@ -126,7 +135,10 @@ class StatusBar(common.WWrap):
        if self.master.server_playback:
            r.append("[")
            r.append(("heading_key", "splayback"))
            r.append(":%s to go]"%self.master.server_playback.count())
            if self.master.nopop:
                r.append(":%s in file]"%self.master.server_playback.count())
            else:
                r.append(":%s to go]"%self.master.server_playback.count())
        if self.master.state.intercept_txt:
            r.append("[")
            r.append(("heading_key", "i"))
@@ -143,10 +155,14 @@ class StatusBar(common.WWrap):
            r.append("[")
            r.append(("heading_key", "u"))
            r.append(":%s]"%self.master.stickyauth_txt)
        if self.master.server and self.master.server.config.reverse_proxy:
        if self.master.server.config.reverse_proxy:
            r.append("[")
            r.append(("heading_key", "R"))
            r.append(("heading_key", "P"))
            r.append(":%s]"%utils.unparse_url(*self.master.server.config.reverse_proxy))
        if self.master.state.default_body_view != contentview.VIEW_AUTO:
            r.append("[")
            r.append(("heading_key", "M"))
            r.append(":%s]"%contentview.VIEW_NAMES[self.master.state.default_body_view])

        opts = []
        if self.master.anticache:
@@ -157,6 +173,8 @@ class StatusBar(common.WWrap):
            opts.append("norefresh")
        if self.master.killextra:
            opts.append("killextra")
        if self.master.server.config.upstream_cert:
            opts.append("upstream-cert")

        if opts:
            r.append("[%s]"%(":".join(opts)))
@@ -173,10 +191,18 @@ class StatusBar(common.WWrap):
        if self.expire and time.time() > self.expire:
            self.message("")

        t = [
            ('heading', ("[%s]"%self.master.state.flow_count()).ljust(7)),
        fc = self.master.state.flow_count()
        if self.master.currentflow:
            idx = self.master.state.view.index(self.master.currentflow) + 1
            t = [
                ('heading', ("[%s/%s]"%(idx, fc)).ljust(9))
            ]
        if self.master.server:
        else:
            t = [
                ('heading', ("[%s]"%fc).ljust(9))
            ]

        if self.master.server.bound:
            boundaddr = "[%s:%s]"%(self.master.server.address or "*", self.master.server.port)
        else:
            boundaddr = ""
@@ -216,6 +242,7 @@ class StatusBar(common.WWrap):
        else:
            self.expire = None
        self.ab.message(msg)
        self.master.drawscreen()


#end nocover
@@ -224,10 +251,19 @@ class ConsoleState(flow.State):
    def __init__(self):
        flow.State.__init__(self)
        self.focus = None
        self.view_body_mode = common.VIEW_BODY_PRETTY
        self.default_body_view = contentview.VIEW_AUTO
        self.view_flow_mode = common.VIEW_FLOW_REQUEST
        self.last_script = ""
        self.last_saveload = ""
        self.flowsettings = weakref.WeakKeyDictionary()

    def add_flow_setting(self, flow, key, value):
        d = self.flowsettings.setdefault(flow, {})
        d[key] = value

    def get_flow_setting(self, flow, key, default=None):
        d = self.flowsettings.get(flow, {})
        return d.get(key, default)

    def add_request(self, req):
        f = flow.State.add_request(self, req)
@@ -291,12 +327,14 @@ class Options(object):
        "refresh_server_playback",
        "rfile",
        "script",
        "replacements",
        "rheaders",
        "server_replay",
        "stickycookie",
        "stickyauth",
        "verbosity",
        "wfile",
        "nopop",
    ]
    def __init__(self, **kwargs):
        for k, v in kwargs.items():
@@ -311,23 +349,15 @@ class Options(object):

class ConsoleMaster(flow.FlowMaster):
    palette = []
    footer_text_default = [
        ('heading_key', "?"), ":help ",
    ]
    footer_text_help = [
        ("heading", 'mitmproxy v%s '%version.VERSION),
        ('heading_key', "q"), ":back ",
    ]
    footer_text_flowview = [
        ('heading_key', "?"), ":help ",
        ('heading_key', "q"), ":back ",
    ]
    def __init__(self, server, options):
        flow.FlowMaster.__init__(self, server, ConsoleState())
        self.looptime = 0
        self.options = options

        self.flow_list_view = None
        for i in options.replacements:
            self.replacehooks.add(*i)

        self.flow_list_walker = None
        self.set_palette()

        r = self.set_intercept(options.intercept)
@@ -350,6 +380,7 @@ class ConsoleMaster(flow.FlowMaster):
        self.anticomp = options.anticomp
        self.killextra = options.kill
        self.rheaders = options.rheaders
        self.nopop = options.nopop

        self.eventlog = options.eventlog
        self.eventlist = urwid.SimpleListWalker([])
@@ -422,7 +453,7 @@ class ConsoleMaster(flow.FlowMaster):
            self.start_server_playback(
                ret,
                self.killextra, self.rheaders,
                False
                False, self.nopop
            )

    def spawn_editor(self, data):
@@ -482,7 +513,7 @@ class ConsoleMaster(flow.FlowMaster):
        self.ui = urwid.raw_display.Screen()
        self.ui.set_terminal_properties(256)
        self.ui.register_palette(self.palette)
        self.flow_list_view = flowlist.ConnectionListView(self, self.state)
        self.flow_list_walker = flowlist.FlowListWalker(self, self.state)

        self.view = None
        self.statusbar = None
@@ -495,9 +526,7 @@ class ConsoleMaster(flow.FlowMaster):

        self.view_flowlist()

        if self.server:
            slave = controller.Slave(self.masterq, self.server)
            slave.start()
            self.server.start_slave(controller.Slave, self.masterq)

        if self.options.rfile:
            ret = self.load_flows(self.options.rfile)
@@ -515,7 +544,7 @@ class ConsoleMaster(flow.FlowMaster):
    def focus_current(self):
        if self.currentflow:
            try:
                self.flow_list_view.set_focus(self.state.index(self.currentflow))
                self.flow_list_walker.set_focus(self.state.index(self.currentflow))
            except (IndexError, ValueError):
                pass

@@ -529,16 +558,23 @@ class ConsoleMaster(flow.FlowMaster):

    def view_help(self):
        h = help.HelpView(self, self.help_context, (self.statusbar, self.body, self.header))
        self.statusbar = StatusBar(self, self.footer_text_help)
        self.statusbar = StatusBar(self, help.footer)
        self.body = h
        self.header = None
        self.make_view()

    def view_kveditor(self, title, value, callback, *args, **kwargs):
        self.body = kveditor.KVEditor(self, title, value, callback, *args, **kwargs)
    def view_flowdetails(self, flow):
        h = flowdetailview.FlowDetailsView(self, flow, (self.statusbar, self.body, self.header))
        self.statusbar = StatusBar(self, flowdetailview.footer)
        self.body = h
        self.header = None
        self.help_context = kveditor.help_context
        self.statusbar = StatusBar(self, self.footer_text_help)
        self.make_view()

    def view_grideditor(self, ge):
        self.body = ge
        self.header = None
        self.help_context = grideditor.help_context
        self.statusbar = StatusBar(self, grideditor.footer)
        self.make_view()

    def view_flowlist(self):
@@ -548,8 +584,8 @@ class ConsoleMaster(flow.FlowMaster):
        if self.eventlog:
            self.body = flowlist.BodyPile(self)
        else:
            self.body = flowlist.ConnectionListBox(self)
        self.statusbar = StatusBar(self, self.footer_text_default)
            self.body = flowlist.FlowListBox(self)
        self.statusbar = StatusBar(self, flowlist.footer)
        self.header = None
        self.currentflow = None

@@ -557,9 +593,9 @@ class ConsoleMaster(flow.FlowMaster):
        self.help_context = flowlist.help_context

    def view_flow(self, flow):
        self.body = flowview.ConnectionView(self, self.state, flow)
        self.header = flowview.ConnectionViewHeader(self, flow)
        self.statusbar = StatusBar(self, self.footer_text_flowview)
        self.body = flowview.FlowView(self, self.state, flow)
        self.header = flowview.FlowViewHeader(self, flow)
        self.statusbar = StatusBar(self, flowview.footer)
        self.currentflow = flow

        self.make_view()
@@ -604,7 +640,7 @@ class ConsoleMaster(flow.FlowMaster):
        except flow.FlowReadError, v:
            return v.strerror
        f.close()
        if self.flow_list_view:
        if self.flow_list_walker:
            self.sync_list_view()
        self.focus_current()

@@ -661,11 +697,19 @@ class ConsoleMaster(flow.FlowMaster):
        self.state.accept_all()

    def set_limit(self, txt):
        return self.state.set_limit(txt)
        v = self.state.set_limit(txt)
        self.sync_list_view()
        return v

    def set_intercept(self, txt):
        return self.state.set_intercept(txt)

    def change_default_display_mode(self, t):
        v = contentview.VIEW_SHORTCUTS.get(t)
        self.state.default_body_view = v
        if self.currentflow:
            self.refresh_flow(self.currentflow)

    def set_reverse_proxy(self, txt):
        if not txt:
            self.server.config.reverse_proxy = None
@@ -675,15 +719,6 @@ class ConsoleMaster(flow.FlowMaster):
            return "Invalid reverse proxy specification"
        self.server.config.reverse_proxy = s

    def changeview(self, v):
        if v == "r":
            self.state.view_body_mode = common.VIEW_BODY_RAW
        elif v == "h":
            self.state.view_body_mode = common.VIEW_BODY_HEX
        elif v == "p":
            self.state.view_body_mode = common.VIEW_BODY_PRETTY
        self.refresh_flow(self.currentflow)

    def drawscreen(self):
        size = self.ui.get_cols_rows()
        canvas = self.view.render(size, focus=1)
@@ -696,6 +731,11 @@ class ConsoleMaster(flow.FlowMaster):
        else:
            self.view_flowlist()

    def set_replace(self, r):
        self.replacehooks.clear()
        for i in r:
            self.replacehooks.add(*i)

    def loop(self):
        changed = True
        try:
@@ -750,7 +790,6 @@ class ConsoleMaster(flow.FlowMaster):
                    self.state.intercept_txt,
                    self.set_intercept
                )
                self.sync_list_view()
            elif k == "Q":
                raise Stop
            elif k == "q":
@@ -762,7 +801,13 @@ class ConsoleMaster(flow.FlowMaster):
                    ),
                    self.quit,
                )
            elif k == "R":
            elif k == "M":
                self.prompt_onekey(
                    "Global default display mode",
                    contentview.VIEW_PROMPT,
                    self.change_default_display_mode
                )
            elif k == "P":
                if self.server.config.reverse_proxy:
                    p = utils.unparse_url(*self.server.config.reverse_proxy)
                else:
@@ -772,7 +817,14 @@ class ConsoleMaster(flow.FlowMaster):
                    p,
                    self.set_reverse_proxy
                )
                self.sync_list_view()
            elif k == "R":
                self.view_grideditor(
                    grideditor.ReplaceEditor(
                        self,
                        self.replacehooks.get_specs(),
                        self.set_replace
                    )
                )
            elif k == "s":
                if self.script:
                    self.load_script(None)
@@ -806,6 +858,7 @@ class ConsoleMaster(flow.FlowMaster):
                        ("anticomp", "c"),
                        ("killextra", "k"),
                        ("norefresh", "n"),
                        ("upstream-certs", "u"),
                    ),
                    self._change_options
                )
@@ -846,13 +899,15 @@ class ConsoleMaster(flow.FlowMaster):
            self.killextra = not self.killextra
        elif a == "n":
            self.refresh_server_playback = not self.refresh_server_playback
        elif a == "u":
            self.server.config.upstream_cert = not self.server.config.upstream_cert

    def shutdown(self):
        self.state.killall(self)
        controller.Master.shutdown(self)

    def sync_list_view(self):
        self.flow_list_view._modified()
        self.flow_list_walker._modified()

    def clear_flows(self):
        self.state.clear()
@@ -910,4 +965,3 @@ class ConsoleMaster(flow.FlowMaster):
        if f:
            self.process_flow(f, r)
        return f


@@ -18,15 +18,6 @@ import urwid.util
from .. import utils


VIEW_BODY_RAW = 0
VIEW_BODY_HEX = 1
VIEW_BODY_PRETTY = 2

BODY_VIEWS = {
    VIEW_BODY_RAW: "raw",
    VIEW_BODY_HEX: "hex",
    VIEW_BODY_PRETTY: "pretty"
}

VIEW_FLOW_REQUEST = 0
VIEW_FLOW_RESPONSE = 1
@@ -70,8 +61,8 @@ def format_keyvals(lst, key="key", val="text", indent=0):
                maxk,
                urwid.Text([(key, kv[0] or "")])
            ),
            urwid.Text([(val, kv[1])])
        ])
            kv[1] if isinstance(kv[1], urwid.Widget) else urwid.Text([(val, kv[1])])
        ])
        ret.append(urwid.Columns(cols, dividechars = 2))
    return ret

@@ -146,17 +137,17 @@ def raw_format_flow(f, focus, extended, padding):
    )

    if f["resp_code"]:
        if f["resp_code"] in [200, 304]:
            resp.append(fcol(SYMBOL_RETURN, "goodcode"))
        else:
            resp.append(fcol(SYMBOL_RETURN, "error"))
        codes = {
            2: "code_200",
            3: "code_300",
            4: "code_400",
            5: "code_500",
        }
        ccol = codes.get(f["resp_code"]/100, "code_other")
        resp.append(fcol(SYMBOL_RETURN, ccol))
        if f["resp_is_replay"]:
            resp.append(fcol(SYMBOL_REPLAY, "replay"))
        if f["resp_code"] in [200, 304]:
            resp.append(fcol(f["resp_code"], "goodcode"))
        else:
            resp.append(fcol(f["resp_code"], "error"))

        resp.append(fcol(f["resp_code"], ccol))
        if f["intercepting"] and f["resp_code"] and not f["resp_acked"]:
            rc = "intercept"
        else:
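The raw_format_flow change above replaces the old good/bad status colouring with a per-class palette: the status code's hundreds digit selects an entry, defaulting to "code_other". A standalone sketch of that mapping (`code_palette` is a hypothetical name; the real code looks up mitmproxy palette entries):

```python
def code_palette(resp_code):
    # Map the status class (2xx..5xx) to a palette name, as the new
    # codes.get(resp_code / 100, "code_other") lookup does.
    codes = {2: "code_200", 3: "code_300", 4: "code_400", 5: "code_500"}
    return codes.get(resp_code // 100, "code_other")

code_palette(404)   # "code_400"
code_palette(101)   # "code_other"
```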
||||
libmproxy/console/contentview.py (new file, 331 lines)
@@ -0,0 +1,331 @@
import re, cStringIO
import urwid
from PIL import Image
from PIL.ExifTags import TAGS
import lxml.html, lxml.etree
import common
from .. import utils, encoding, flow
from ..contrib import jsbeautifier

VIEW_CUTOFF = 1024*50

VIEW_AUTO = 0
VIEW_JSON = 1
VIEW_XML = 2
VIEW_URLENCODED = 3
VIEW_MULTIPART = 4
VIEW_JAVASCRIPT = 5
VIEW_IMAGE = 6
VIEW_RAW = 7
VIEW_HEX = 8
VIEW_HTML = 9

VIEW_NAMES = {
    VIEW_AUTO: "Auto",
    VIEW_JSON: "JSON",
    VIEW_XML: "XML",
    VIEW_URLENCODED: "URL-encoded",
    VIEW_MULTIPART: "Multipart Form",
    VIEW_JAVASCRIPT: "JavaScript",
    VIEW_IMAGE: "Image",
    VIEW_RAW: "Raw",
    VIEW_HEX: "Hex",
    VIEW_HTML: "HTML",
}


VIEW_PROMPT = (
    ("auto detect", "a"),
    ("hex", "e"),
    ("html", "h"),
    ("image", "i"),
    ("javascript", "j"),
    ("json", "s"),
    ("raw", "r"),
    ("multipart", "m"),
    ("urlencoded", "u"),
    ("xml", "x"),
)

VIEW_SHORTCUTS = {
    "a": VIEW_AUTO,
    "x": VIEW_XML,
    "h": VIEW_HTML,
    "i": VIEW_IMAGE,
    "j": VIEW_JAVASCRIPT,
    "s": VIEW_JSON,
    "u": VIEW_URLENCODED,
    "m": VIEW_MULTIPART,
    "r": VIEW_RAW,
    "e": VIEW_HEX,
}

CONTENT_TYPES_MAP = {
    "text/html": VIEW_HTML,
    "application/json": VIEW_JSON,
    "text/xml": VIEW_XML,
    "multipart/form-data": VIEW_MULTIPART,
    "application/x-www-form-urlencoded": VIEW_URLENCODED,
    "application/x-javascript": VIEW_JAVASCRIPT,
    "application/javascript": VIEW_JAVASCRIPT,
    "text/javascript": VIEW_JAVASCRIPT,
    "image/png": VIEW_IMAGE,
    "image/jpeg": VIEW_IMAGE,
    "image/gif": VIEW_IMAGE,
    "image/vnd.microsoft.icon": VIEW_IMAGE,
    "image/x-icon": VIEW_IMAGE,
}
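In auto-detect mode, CONTENT_TYPES_MAP is consulted after the Content-Type header's parameters are stripped off. A simplified standalone sketch of that lookup (`guess_view` is a hypothetical name, and the map values here are plain strings rather than the module's VIEW_* constants):

```python
TYPE_MAP = {
    "text/html": "HTML",
    "application/json": "JSON",
}

def guess_view(content_type_header):
    # Strip parameters such as "; charset=utf-8", then look up the
    # bare type/subtype, falling back to the raw view.
    main = content_type_header.split(";")[0].strip().lower()
    return TYPE_MAP.get(main, "Raw")

guess_view("application/json; charset=utf-8")   # "JSON"
```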

def trailer(clen, txt, limit):
    rem = clen - limit
    if rem > 0:
        txt.append(urwid.Text(""))
        txt.append(
            urwid.Text(
                [
                    ("highlight", "... %s of data not shown. Press "%utils.pretty_size(rem)),
                    ("key", "f"),
                    ("highlight", " to load all data.")
                ]
            )
        )


def _view_text(content, total, limit):
    """
        Generates a body for a chunk of text.
    """
    txt = []
    for i in utils.cleanBin(content).splitlines():
        txt.append(
            urwid.Text(("text", i), wrap="any")
        )
    trailer(total, txt, limit)
    return txt


def view_raw(hdrs, content, limit):
    txt = _view_text(content[:limit], len(content), limit)
    return "Raw", txt


def view_hex(hdrs, content, limit):
    txt = []
    for offset, hexa, s in utils.hexdump(content[:limit]):
        txt.append(urwid.Text([
            ("offset", offset),
            " ",
            ("text", hexa),
            " ",
            ("text", s),
        ]))
    trailer(len(content), txt, limit)
    return "Hex", txt

def view_xml(hdrs, content, limit):
    parser = lxml.etree.XMLParser(remove_blank_text=True, resolve_entities=False, strip_cdata=False, recover=False)
    try:
        document = lxml.etree.fromstring(content, parser)
    except lxml.etree.XMLSyntaxError:
        return None
    docinfo = document.getroottree().docinfo

    prev = []
    p = document.getroottree().getroot().getprevious()
    while p is not None:
        prev.insert(
            0,
            lxml.etree.tostring(p)
        )
        p = p.getprevious()
    doctype=docinfo.doctype
    if prev:
        doctype += "\n".join(prev).strip()
    doctype = doctype.strip()

    s = lxml.etree.tostring(
        document,
        pretty_print=True,
        xml_declaration=True,
        doctype=doctype or None,
        encoding = docinfo.encoding
    )

    txt = []
    for i in s[:limit].strip().split("\n"):
        txt.append(
            urwid.Text(("text", i)),
        )
    trailer(len(content), txt, limit)
    return "XML-like data", txt


def view_html(hdrs, content, limit):
    if utils.isXML(content):
        parser = lxml.etree.HTMLParser(strip_cdata=True, remove_blank_text=True)
        d = lxml.html.fromstring(content, parser=parser)
        docinfo = d.getroottree().docinfo
        s = lxml.etree.tostring(d, pretty_print=True, doctype=docinfo.doctype)
        return "HTML", _view_text(s[:limit], len(s), limit)


def view_json(hdrs, content, limit):
    lines = utils.pretty_json(content)
    if lines:
        txt = []
        sofar = 0
        for i in lines:
            sofar += len(i)
            txt.append(
                urwid.Text(("text", i)),
            )
            if sofar > limit:
                break
        trailer(sum(len(i) for i in lines), txt, limit)
        return "JSON", txt


def view_multipart(hdrs, content, limit):
    v = hdrs.get("content-type")
    if v:
        v = utils.parse_content_type(v[0])
        if not v:
            return
        boundary = v[2].get("boundary")
        if not boundary:
            return

        rx = re.compile(r'\bname="([^"]+)"')
        keys = []
        vals = []

        for i in content.split("--" + boundary):
            parts = i.splitlines()
            if len(parts) > 1 and parts[0][0:2] != "--":
                match = rx.search(parts[1])
                if match:
                    keys.append(match.group(1) + ":")
                    vals.append(utils.cleanBin(
                        "\n".join(parts[3+parts[2:].index(""):])
                    ))
        r = [
            urwid.Text(("highlight", "Form data:\n")),
        ]
        r.extend(common.format_keyvals(
            zip(keys, vals),
            key = "header",
            val = "text"
        ))
        return "Multipart form", r
|
||||
|
||||
|
||||
def view_urlencoded(hdrs, content, limit):
|
||||
lines = utils.urldecode(content)
|
||||
if lines:
|
||||
body = common.format_keyvals(
|
||||
[(k+":", v) for (k, v) in lines],
|
||||
key = "header",
|
||||
val = "text"
|
||||
)
|
||||
return "URLEncoded form", body
|
||||
|
||||
|
||||
def view_javascript(hdrs, content, limit):
|
||||
opts = jsbeautifier.default_options()
|
||||
opts.indent_size = 2
|
||||
try:
|
||||
res = jsbeautifier.beautify(content[:limit], opts)
|
||||
# begin nocover
|
||||
except:
|
||||
# Bugs in jsbeautifier mean that it can trhow arbitrary errors.
|
||||
return None
|
||||
# end nocover
|
||||
return "JavaScript", _view_text(res, len(content), limit)
|
||||
|
||||
|
||||
def view_image(hdrs, content, limit):
    try:
        img = Image.open(cStringIO.StringIO(content))
    except IOError:
        return None
    parts = [
        ("Format", str(img.format_description)),
        ("Size", "%s x %s px"%img.size),
        ("Mode", str(img.mode)),
    ]
    for i in sorted(img.info.keys()):
        if i != "exif":
            parts.append(
                (str(i), str(img.info[i]))
            )
    if hasattr(img, "_getexif"):
        ex = img._getexif()
        if ex:
            for i in sorted(ex.keys()):
                tag = TAGS.get(i, i)
                parts.append(
                    (str(tag), str(ex[i]))
                )
    clean = []
    for i in parts:
        clean.append([utils.cleanBin(i[0]), utils.cleanBin(i[1])])
    fmt = common.format_keyvals(
        clean,
        key = "header",
        val = "text"
    )
    return "%s image"%img.format, fmt

PRETTY_FUNCTION_MAP = {
    VIEW_XML: view_xml,
    VIEW_HTML: view_html,
    VIEW_JSON: view_json,
    VIEW_URLENCODED: view_urlencoded,
    VIEW_MULTIPART: view_multipart,
    VIEW_JAVASCRIPT: view_javascript,
    VIEW_IMAGE: view_image,
    VIEW_HEX: view_hex,
    VIEW_RAW: view_raw,
}

def get_view_func(viewmode, hdrs, content):
    """
        Returns a function object.
    """
    if viewmode == VIEW_AUTO:
        ctype = hdrs.get("content-type")
        if ctype:
            ctype = ctype[0]
        ct = utils.parse_content_type(ctype) if ctype else None
        if ct:
            viewmode = CONTENT_TYPES_MAP.get("%s/%s"%(ct[0], ct[1]))
        if not viewmode and utils.isXML(content):
            viewmode = VIEW_XML
    return PRETTY_FUNCTION_MAP.get(viewmode, view_raw)

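The dispatch pattern in get_view_func is a dict lookup with a raw-view fallback. A minimal sketch (names here are illustrative stand-ins, not the module's constants):

```python
# Sketch of dict-based dispatch with a fallback, as in
# PRETTY_FUNCTION_MAP.get(viewmode, view_raw). All names are illustrative.
VIEW_RAW, VIEW_JSON = 0, 1

def render_raw(data):
    return ("Raw", data)

def render_json(data):
    return ("JSON", data)

DISPATCH = {
    VIEW_RAW: render_raw,
    VIEW_JSON: render_json,
}

def get_renderer(mode):
    # Unknown modes (e.g. None from a failed content-type lookup) fall
    # back to the raw renderer.
    return DISPATCH.get(mode, render_raw)
```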
def get_content_view(viewmode, hdrItems, content, limit):
    """
        Returns a (msg, body) tuple.
    """
    msg = []

    hdrs = flow.ODictCaseless([list(i) for i in hdrItems])

    enc = hdrs.get("content-encoding")
    if enc and enc[0] != "identity":
        decoded = encoding.decode(enc[0], content)
        if decoded:
            content = decoded
            msg.append("[decoded %s]"%enc[0])
    func = get_view_func(viewmode, hdrs, content)
    ret = func(hdrs, content, limit)
    if not ret:
        viewmode = VIEW_RAW
        ret = view_raw(hdrs, content, limit)
        msg.append("Couldn't parse: falling back to Raw")
    else:
        msg.append(ret[0])
    return " ".join(msg), ret[1]
libmproxy/console/flowdetailview.py (new file, 102 lines)
@@ -0,0 +1,102 @@
# Copyright (C) 2012  Aldo Cortesi
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

import urwid
import common

footer = [
    ('heading_key', "q"), ":back ",
]

class FlowDetailsView(urwid.ListBox):
    def __init__(self, master, flow, state):
        self.master, self.flow, self.state = master, flow, state
        urwid.ListBox.__init__(
            self,
            self.flowtext()
        )

    def keypress(self, size, key):
        key = common.shortcuts(key)
        if key == "q":
            self.master.statusbar = self.state[0]
            self.master.body = self.state[1]
            self.master.header = self.state[2]
            self.master.make_view()
            return None
        elif key == "?":
            key = None
        return urwid.ListBox.keypress(self, size, key)

    def flowtext(self):
        text = []

        title = urwid.Text("Flow details")
        title = urwid.Padding(title, align="left", width=("relative", 100))
        title = urwid.AttrWrap(title, "heading")
        text.append(title)

        if self.flow.response:
            c = self.flow.response.get_cert()
            if c:
                text.append(urwid.Text([("head", "Server Certificate:")]))
                parts = [
                    ["Type", "%s, %s bits"%c.keyinfo],
                    ["SHA1 digest", c.digest("sha1")],
                    ["Valid to", str(c.notafter)],
                    ["Valid from", str(c.notbefore)],
                    ["Serial", str(c.serial)],
                ]

                parts.append(
                    [
                        "Subject",
                        urwid.BoxAdapter(
                            urwid.ListBox(common.format_keyvals(c.subject, key="highlight", val="text")),
                            len(c.subject)
                        )
                    ]
                )

                parts.append(
                    [
                        "Issuer",
                        urwid.BoxAdapter(
                            urwid.ListBox(common.format_keyvals(c.issuer, key="highlight", val="text")),
                            len(c.issuer)
                        )
                    ]
                )

                if c.altnames:
                    parts.append(
                        [
                            "Alt names",
                            ", ".join(c.altnames)
                        ]
                    )
                text.extend(common.format_keyvals(parts, key="key", val="text", indent=4))

        if self.flow.request.client_conn:
            text.append(urwid.Text([("head", "Client Connection:")]))
            cc = self.flow.request.client_conn
            parts = [
                ["Address", "%s:%s"%tuple(cc.address)],
                ["Requests", "%s"%cc.requestcount],
                ["Closed", "%s"%cc.close],
            ]
            text.extend(common.format_keyvals(parts, key="key", val="text", indent=4))

        return text
@@ -40,6 +40,9 @@ def _mkhelp():
    return text
help_context = _mkhelp()

footer = [
    ('heading_key', "?"), ":help ",
]

class EventListBox(urwid.ListBox):
    def __init__(self, master):
@@ -65,7 +68,7 @@ class BodyPile(urwid.Pile):
        urwid.Pile.__init__(
            self,
            [
                ConnectionListBox(master),
                FlowListBox(master),
                urwid.Frame(EventListBox(master), header = self.inactive_header)
            ]
        )
@@ -122,13 +125,18 @@ class ConnectionItem(common.WWrap):
            self.master.currentflow = f
            self.master.focus_current()
        elif key == "r":
            self.flow.backup()
            r = self.master.replay_request(self.flow)
            if r:
                self.master.statusbar.message(r)
            self.master.sync_list_view()
        elif key == "V":
            if not self.flow.modified():
                self.master.statusbar.message("Flow not modified.")
                return
            self.state.revert(self.flow)
            self.master.sync_list_view()
            self.master.statusbar.message("Reverted.")
        elif key == "w":
            self.master.path_prompt(
                "Save flows: ",
@@ -158,7 +166,7 @@ class ConnectionItem(common.WWrap):
        return key


class ConnectionListView(urwid.ListWalker):
class FlowListWalker(urwid.ListWalker):
    def __init__(self, master, state):
        self.master, self.state = master, state
        if self.state.flow_count():
@@ -184,10 +192,10 @@ class ConnectionListView(urwid.ListWalker):
        return f, i


class ConnectionListBox(urwid.ListBox):
class FlowListBox(urwid.ListBox):
    def __init__(self, master):
        self.master = master
        urwid.ListBox.__init__(self, master.flow_list_view)
        urwid.ListBox.__init__(self, master.flow_list_walker)

    def keypress(self, size, key):
        key = common.shortcuts(key)
@@ -200,7 +208,6 @@ class ConnectionListBox(urwid.ListBox):
            self.master.toggle_eventlog()
        elif key == "l":
            self.master.prompt("Limit: ", self.master.state.limit_txt, self.master.set_limit)
            self.master.sync_list_view()
        elif key == "L":
            self.master.path_prompt(
                "Load flows: ",
@@ -13,10 +13,10 @@
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

import os, re
import os, sys
import urwid
import common
from .. import utils, encoding, flow
import common, grideditor, contentview
from .. import utils, flow

def _mkhelp():
    text = []
@@ -27,25 +27,48 @@ def _mkhelp():
        ("d", "delete flow"),
        ("D", "duplicate flow"),
        ("e", "edit request/response"),
        ("m", "change body display mode"),
        ("f", "load full body data"),
        ("m", "change body display mode for this entity"),
        (None,
            common.highlight_key("automatic", "a") +
            [("text", ": automatic detection")]
        ),
        (None,
            common.highlight_key("hex", "h") +
            [("text", ": Hex")]
        ),
        (None,
            common.highlight_key("image", "i") +
            [("text", ": Image")]
        ),
        (None,
            common.highlight_key("javascript", "j") +
            [("text", ": JavaScript")]
        ),
        (None,
            common.highlight_key("json", "s") +
            [("text", ": JSON")]
        ),
        (None,
            common.highlight_key("urlencoded", "u") +
            [("text", ": URL-encoded data")]
        ),
        (None,
            common.highlight_key("raw", "r") +
            [("text", ": raw data")]
        ),
        (None,
            common.highlight_key("pretty", "p") +
            [("text", ": pretty-print XML, HTML and JSON")]
        ),
        (None,
            common.highlight_key("hex", "h") +
            [("text", ": hex dump")]
            common.highlight_key("xml", "x") +
            [("text", ": XML")]
        ),
        ("M", "change default body display mode"),
        ("p", "previous flow"),
        ("r", "replay request"),
        ("V", "revert changes to request"),
        ("v", "view body in external viewer"),
        ("w", "save all flows matching current limit"),
        ("W", "save this flow"),
        ("X", "view flow details"),
        ("z", "encode/decode a request/response"),
        ("tab", "toggle request/response view"),
        ("space", "next flow"),
@@ -55,10 +78,13 @@ def _mkhelp():
    return text
help_context = _mkhelp()

footer = [
    ('heading_key', "?"), ":help ",
    ('heading_key', "q"), ":back ",
]

VIEW_CUTOFF = 1024*100

class ConnectionViewHeader(common.WWrap):
class FlowViewHeader(common.WWrap):
    def __init__(self, master, f):
        self.master, self.flow = master, f
        self.w = common.format_flow(f, False, extended=True, padding=0)
@@ -69,13 +95,18 @@ class ConnectionViewHeader(common.WWrap):


class CallbackCache:
    @utils.LRUCache(20)
    @utils.LRUCache(200)
    def _callback(self, method, *args, **kwargs):
        return getattr(self.obj, method)(*args, **kwargs)

    def callback(self, obj, method, *args, **kwargs):
        return getattr(obj, method)(*args, **kwargs)
        # obj varies!
        self.obj = obj
        return self._callback(method, *args, **kwargs)
cache = CallbackCache()
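The CallbackCache above memoizes expensive view rendering keyed on hashable arguments, which is why callers convert header lists to nested tuples before calling it. utils.LRUCache is mitmproxy's own decorator; a hedged sketch of the same idea using the stdlib analogue:

```python
import functools

# Illustrative stand-in for the LRU-cached render: identical arguments hit
# the cache instead of re-rendering. The function body is a placeholder.
@functools.lru_cache(maxsize=200)
def cached_render(viewmode, hdr_items, content, limit):
    return (viewmode, len(content[:limit]))
```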


class ConnectionView(common.WWrap):
class FlowView(common.WWrap):
    REQ = 0
    RESP = 1
    method_options = [
@@ -90,161 +121,67 @@ class ConnectionView(common.WWrap):
    ]
    def __init__(self, master, state, flow):
        self.master, self.state, self.flow = master, state, flow
        if self.state.view_flow_mode == common.VIEW_FLOW_RESPONSE and flow.response:
        if self.state.view_flow_mode == common.VIEW_FLOW_RESPONSE:
            self.view_response()
        else:
            self.view_request()

    def _trailer(self, clen, txt):
        rem = clen - VIEW_CUTOFF
        if rem > 0:
            txt.append(urwid.Text(""))
            txt.append(
                urwid.Text(
                    [
                        ("highlight", "... %s of data not shown"%utils.pretty_size(rem))
                    ]
    def _cached_content_view(self, viewmode, hdrItems, content, limit):
        return contentview.get_content_view(viewmode, hdrItems, content, limit)

    def content_view(self, viewmode, conn):
        full = self.state.get_flow_setting(
            self.flow,
            (self.state.view_flow_mode, "fullcontents"),
            False
        )
        if full:
            limit = sys.maxint
        else:
            limit = contentview.VIEW_CUTOFF
        return cache.callback(
            self, "_cached_content_view",
            viewmode,
            tuple(tuple(i) for i in conn.headers.lst),
            conn.content,
            limit
        )
                )
            )
    def _view_flow_raw(self, content):
        txt = []
        for i in utils.cleanBin(content[:VIEW_CUTOFF]).splitlines():
            txt.append(
                urwid.Text(("text", i))
            )
        self._trailer(len(content), txt)
        return txt

    def _view_flow_binary(self, content):
        txt = []
        for offset, hexa, s in utils.hexdump(content[:VIEW_CUTOFF]):
            txt.append(urwid.Text([
                ("offset", offset),
                " ",
                ("text", hexa),
                " ",
                ("text", s),
            ]))
        self._trailer(len(content), txt)
        return txt

    def _view_flow_xmlish(self, content):
        txt = []
        for i in utils.pretty_xmlish(content[:VIEW_CUTOFF]):
            txt.append(
                urwid.Text(("text", i)),
            )
        self._trailer(len(content), txt)
        return txt

    def _view_flow_json(self, lines):
        txt = []
        sofar = 0
        for i in lines:
            sofar += len(i)
            txt.append(
                urwid.Text(("text", i)),
            )
            if sofar > VIEW_CUTOFF:
                break
        self._trailer(sum(len(i) for i in lines), txt)
        return txt

    def _view_flow_formdata(self, content, boundary):
        rx = re.compile(r'\bname="([^"]+)"')
        keys = []
        vals = []

        for i in content.split("--" + boundary):
            parts = i.splitlines()
            if len(parts) > 1 and parts[0][0:2] != "--":
                match = rx.search(parts[1])
                if match:
                    keys.append(match.group(1) + ":")
                    vals.append(utils.cleanBin(
                        "\n".join(parts[3+parts[2:].index(""):])
                    ))
        r = [
            urwid.Text(("highlight", "Form data:\n")),
        ]
        r.extend(common.format_keyvals(
            zip(keys, vals),
            key = "header",
            val = "text"
        ))
        return r

    def _view_flow_urlencoded(self, lines):
        return common.format_keyvals(
            [(k+":", v) for (k, v) in lines],
            key = "header",
            val = "text"
        )

    def _find_pretty_view(self, content, hdrItems):
        ctype = None
        for i in hdrItems:
            if i[0].lower() == "content-type":
                ctype = i[1]
                break
        if ctype and flow.HDR_FORM_URLENCODED in ctype:
            data = utils.urldecode(content)
            if data:
                return "URLEncoded form", self._view_flow_urlencoded(data)
        if utils.isXML(content):
            return "Indented XML-ish", self._view_flow_xmlish(content)
        elif ctype and "application/json" in ctype:
            lines = utils.pretty_json(content)
            if lines:
                return "JSON", self._view_flow_json(lines)
        elif ctype and "multipart/form-data" in ctype:
            boundary = ctype.split('boundary=')
            if len(boundary) > 1:
                return "Form data", self._view_flow_formdata(content, boundary[1].split(';')[0])
        return "", self._view_flow_raw(content)

    def _cached_conn_text(self, e, content, hdrItems, viewmode):
    def conn_text(self, conn):
        txt = common.format_keyvals(
            [(h+":", v) for (h, v) in hdrItems],
            [(h+":", v) for (h, v) in conn.headers.lst],
            key = "header",
            val = "text"
        )
        if content:
            msg = ""
            if viewmode == common.VIEW_BODY_HEX:
                body = self._view_flow_binary(content)
            elif viewmode == common.VIEW_BODY_PRETTY:
                emsg = ""
                if e:
                    decoded = encoding.decode(e, content)
                    if decoded:
                        content = decoded
                if e and e != "identity":
                    emsg = "[decoded %s]"%e
                msg, body = self._find_pretty_view(content, hdrItems)
                if emsg:
                    msg = emsg + " " + msg
            else:
                body = self._view_flow_raw(content)
        if conn.content:
            override = self.state.get_flow_setting(
                self.flow,
                (self.state.view_flow_mode, "prettyview"),
            )
            viewmode = self.state.default_body_view if override is None else override

            title = urwid.AttrWrap(urwid.Columns([
            msg, body = self.content_view(viewmode, conn)

            cols = [
                urwid.Text(
                    [
                        ("heading", msg),
                    ]
                ),
                urwid.Text(
                    [
                        " ",
                        ('heading', "["),
                        ('heading_key', "m"),
                        ('heading', (":%s]"%common.BODY_VIEWS[self.master.state.view_body_mode])),
                    ],
                    align="right"
                ),
            ]), "heading")
                )
            ]
            if override is not None:
                cols.append(
                    urwid.Text(
                        [
                            " ",
                            ('heading', "["),
                            ('heading_key', "m"),
                            ('heading', (":%s]"%contentview.VIEW_NAMES[viewmode])),
                        ],
                        align="right"
                    )
                )
            title = urwid.AttrWrap(urwid.Columns(cols), "heading")
            txt.append(title)
            txt.extend(body)
        return urwid.ListBox(txt)
@@ -283,33 +220,16 @@ class ConnectionView(common.WWrap):
            )
        return f

    def _conn_text(self, conn, viewmode):
        e = conn.headers["content-encoding"]
        e = e[0] if e else None
        return cache.callback(
            self, "_cached_conn_text",
            e,
            conn.content,
            tuple(tuple(i) for i in conn.headers.lst),
            viewmode
        )

    def view_request(self):
        self.state.view_flow_mode = common.VIEW_FLOW_REQUEST
        body = self._conn_text(
            self.flow.request,
            self.state.view_body_mode
        )
        body = self.conn_text(self.flow.request)
        self.w = self.wrap_body(common.VIEW_FLOW_REQUEST, body)
        self.master.statusbar.redraw()

    def view_response(self):
        self.state.view_flow_mode = common.VIEW_FLOW_RESPONSE
        if self.flow.response:
            body = self._conn_text(
                self.flow.response,
                self.state.view_body_mode
            )
            body = self.conn_text(self.flow.response)
        else:
            body = urwid.ListBox(
                [
@@ -386,7 +306,7 @@ class ConnectionView(common.WWrap):
        self.master.refresh_flow(self.flow)

    def set_headers(self, lst, conn):
        conn.headers = flow.ODict(lst)
        conn.headers = flow.ODictCaseless(lst)

    def set_query(self, lst, conn):
        conn.set_query(flow.ODict(lst))
@@ -395,7 +315,9 @@ class ConnectionView(common.WWrap):
        conn.set_form_urlencoded(flow.ODict(lst))

    def edit_form(self, conn):
        self.master.view_kveditor("Editing form", conn.get_form_urlencoded().lst, self.set_form, conn)
        self.master.view_grideditor(
            grideditor.URLEncodedFormEditor(self.master, conn.get_form_urlencoded().lst, self.set_form, conn)
        )

    def edit_form_confirm(self, key, conn):
        if key == "y":
@@ -406,7 +328,7 @@ class ConnectionView(common.WWrap):
            conn = self.flow.request
        else:
            if not self.flow.response:
                self.flow.response = flow.Response(self.flow.request, 200, "OK", flow.ODict(), "")
                self.flow.response = flow.Response(self.flow.request, 200, "OK", flow.ODictCaseless(), "")
            conn = self.flow.response

        self.flow.backup()
@@ -427,9 +349,9 @@ class ConnectionView(common.WWrap):
        else:
            self.edit_form(conn)
        elif part == "h":
            self.master.view_kveditor("Editing headers", conn.headers.lst, self.set_headers, conn)
            self.master.view_grideditor(grideditor.HeaderEditor(self.master, conn.headers.lst, self.set_headers, conn))
        elif part == "q":
            self.master.view_kveditor("Editing query", conn.get_query().lst, self.set_query, conn)
            self.master.view_grideditor(grideditor.QueryEditor(self.master, conn.get_query().lst, self.set_query, conn))
        elif part == "u" and self.state.view_flow_mode == common.VIEW_FLOW_REQUEST:
            self.master.prompt_edit("URL", conn.get_url(), self.set_url)
        elif part == "m" and self.state.view_flow_mode == common.VIEW_FLOW_REQUEST:
@@ -450,6 +372,7 @@ class ConnectionView(common.WWrap):
        else:
            new_flow, new_idx = self.state.get_prev(idx)
        if new_idx is None:
            self.master.statusbar.message("No more flows!")
            return
        self.master.view_flow(new_flow)
@@ -459,10 +382,18 @@ class ConnectionView(common.WWrap):
    def view_prev_flow(self, flow):
        return self._view_nextprev_flow("prev", flow)

    def change_this_display_mode(self, t):
        self.state.add_flow_setting(
            self.flow,
            (self.state.view_flow_mode, "prettyview"),
            contentview.VIEW_SHORTCUTS.get(t)
        )
        self.master.refresh_flow(self.flow)

    def keypress(self, size, key):
        if key == " ":
            self.view_next_flow(self.flow)
            return key
            return

        key = common.shortcuts(key)
        if self.state.view_flow_mode == common.VIEW_FLOW_REQUEST:
@@ -487,6 +418,20 @@ class ConnectionView(common.WWrap):
        elif key == "A":
            self.master.accept_all()
            self.master.view_flow(self.flow)
        elif key == "b":
            if conn:
                if self.state.view_flow_mode == common.VIEW_FLOW_REQUEST:
                    self.master.path_prompt(
                        "Save request body: ",
                        self.state.last_saveload,
                        self.save_body
                    )
                else:
                    self.master.path_prompt(
                        "Save response body: ",
                        self.state.last_saveload,
                        self.save_body
                    )
        elif key == "d":
            if self.state.flow_count() == 1:
                self.master.view_flowlist()
@@ -528,27 +473,39 @@ class ConnectionView(common.WWrap):
                self.edit
            )
            key = None
        elif key == "f":
            self.master.statusbar.message("Loading all body data...")
            self.state.add_flow_setting(
                self.flow,
                (self.state.view_flow_mode, "fullcontents"),
                True
            )
            self.master.refresh_flow(self.flow)
            self.master.statusbar.message("")
        elif key == "m":
            p = list(contentview.VIEW_PROMPT)
            p.insert(0, ("clear", "c"))
            self.master.prompt_onekey(
                "View",
                (
                    ("raw", "r"),
                    ("pretty", "p"),
                    ("hex", "h"),
                ),
                self.master.changeview
                "Display mode",
                p,
                self.change_this_display_mode
            )
            key = None
        elif key == "p":
            self.view_prev_flow(self.flow)
        elif key == "r":
            self.flow.backup()
            r = self.master.replay_request(self.flow)
            if r:
                self.master.statusbar.message(r)
            self.master.refresh_flow(self.flow)
        elif key == "V":
            if not self.flow.modified():
                self.master.statusbar.message("Flow not modified.")
                return
            self.state.revert(self.flow)
            self.master.refresh_flow(self.flow)
            self.master.statusbar.message("Reverted.")
        elif key == "W":
            self.master.path_prompt(
                "Save this flow: ",
@@ -561,27 +518,16 @@ class ConnectionView(common.WWrap):
            t = conn.headers["content-type"] or [None]
            t = t[0]
            self.master.spawn_external_viewer(conn.content, t)
        elif key == "b":
            if conn:
                if self.state.view_flow_mode == common.VIEW_FLOW_REQUEST:
                    self.master.path_prompt(
                        "Save request body: ",
                        self.state.last_saveload,
                        self.save_body
                    )
                else:
                    self.master.path_prompt(
                        "Save response body: ",
                        self.state.last_saveload,
                        self.save_body
                    )
        elif key == "|":
            self.master.path_prompt(
                "Send flow to script: ", self.state.last_script,
                self.master.run_script_once, self.flow
            )
        elif key == "X":
            self.master.view_flowdetails(self.flow)
        elif key == "z":
            if conn:
                self.flow.backup()
                e = conn.headers["content-encoding"] or ["identity"]
                if e[0] != "identity":
                    conn.decode()
@@ -13,10 +13,10 @@
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

import copy
import copy, re
import urwid
import common
from .. import utils
from .. import utils, filt


def _mkhelp():
@@ -35,12 +35,25 @@ def _mkhelp():
    return text
help_context = _mkhelp()

footer = [
    ('heading_key', "enter"), ":edit ",
    ('heading_key', "q"), ":back ",
]
footer_editing = [
    ('heading_key', "esc"), ":stop editing ",
]


class SText(common.WWrap):
    def __init__(self, txt, focused):
    def __init__(self, txt, focused, error):
        w = urwid.Text(txt, wrap="any")
        if focused:
            w = urwid.AttrWrap(w, "focusfield")
            if error:
                w = urwid.AttrWrap(w, "focusfield_error")
            else:
                w = urwid.AttrWrap(w, "focusfield")
        elif error:
            w = urwid.AttrWrap(w, "field_error")
        common.WWrap.__init__(self, w)

    def get_text(self):
@@ -66,49 +79,60 @@ class SEdit(common.WWrap):
        return True


class KVItem(common.WWrap):
    def __init__(self, focused, editing, maxk, k, v):
        self.focused, self.editing, self.maxk = focused, editing, maxk
        if focused == 0 and editing:
            self.editing = self.kf = SEdit(k)
        else:
            self.kf = SText(k, True if focused == 0 else False)
class GridRow(common.WWrap):
    def __init__(self, focused, editing, editor, values):
        self.focused, self.editing, self.editor = focused, editing, editor

        if focused == 1 and editing:
            self.editing = self.vf = SEdit(v)
        else:
            self.vf = SText(v, True if focused == 1 else False)
        errors = values[1]
        self.fields = []
        for i, v in enumerate(values[0]):
            if focused == i and editing:
                self.editing = SEdit(v)
                self.fields.append(self.editing)
            else:
                self.fields.append(
                    SText(v, True if focused == i else False, i in errors)
                )

        fspecs = self.fields[:]
        fspecs[0] = ("fixed", self.editor.first_width + 2, fspecs[0])
        w = urwid.Columns(
            [
                ("fixed", maxk + 2, self.kf),
                self.vf
            ],
            fspecs,
            dividechars = 2
        )
        if focused is not None:
            w.set_focus_column(focused)
        common.WWrap.__init__(self, w)

    def get_kv(self):
        return (self.kf.get_text(), self.vf.get_text())
    def get_value(self):
        vals = []
        errors = set([])
        for i, f in enumerate(self.fields):
            v = f.get_text()
            vals.append(v)
            if self.editor.is_error(i, v):
                errors.add(i)
        return [vals, errors]

    def keypress(self, s, k):
        if self.editing:
            k = self.editing.keypress((s[0]-self.maxk-4,), k)
            w = self.w.column_widths(s)[self.focused]
            k = self.editing.keypress((w,), k)
        return k

    def selectable(self):
        return True


KEY_MAX = 30
class KVWalker(urwid.ListWalker):
class GridWalker(urwid.ListWalker):
    """
        Stores rows as a list of (rows, errors) tuples, where rows is a list
        and errors is a set with an entry of each offset in rows that is an
        error.
    """
    def __init__(self, lst, editor):
        self.lst, self.editor = lst, editor
        self.maxk = min(max(len(v[0]) for v in lst), KEY_MAX) if lst else 20
        if self.maxk < 20:
            self.maxk = 20
        self.lst = [(i, set([])) for i in lst]
        self.editor = editor
        self.focus = 0
        self.focus_col = 0
        self.editing = False
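The row shape GridWalker keeps is a pair: a list of cell strings plus a set of column offsets flagged by the editor's is_error hook. A hedged sketch of that structure (validate_rows is an illustrative helper, not part of the editor's API):

```python
# Illustrative helper: build GridWalker-style [values, errors] rows, where
# errors is the set of column indices the is_error predicate flags.
def validate_rows(rows, is_error):
    out = []
    for values in rows:
        errors = set(i for i, v in enumerate(values) if is_error(i, v))
        out.append([list(values), errors])
    return out
```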
@@ -119,12 +143,12 @@ class KVWalker(urwid.ListWalker):

    def get_current_value(self):
        if self.lst:
            return self.lst[self.focus][self.focus_col]
            return self.lst[self.focus][0][self.focus_col]

    def set_current_value(self, val):
        row = list(self.lst[self.focus])
        row = list(self.lst[self.focus][0])
        row[self.focus_col] = val
        self.lst[self.focus] = tuple(row)
        self.lst[self.focus] = [tuple(row), set([])]

    def delete_focus(self):
        if self.lst:
@@ -134,7 +158,7 @@ class KVWalker(urwid.ListWalker):

    def _insert(self, pos):
        self.focus = pos
        self.lst.insert(self.focus, ("", ""))
        self.lst.insert(self.focus, [[""]*self.editor.columns, set([])])
        self.focus_col = 0
        self.start_edit()

@@ -146,27 +170,29 @@ class KVWalker(urwid.ListWalker):

    def start_edit(self):
        if self.lst:
            self.editing = KVItem(self.focus_col, True, self.maxk, *self.lst[self.focus])
            self.editing = GridRow(self.focus_col, True, self.editor, self.lst[self.focus])
            self.editor.master.statusbar.update(footer_editing)
            self._modified()

    def stop_edit(self):
        if self.editing:
            self.lst[self.focus] = self.editing.get_kv()
            self.editor.master.statusbar.update(footer)
            self.lst[self.focus] = self.editing.get_value()
            self.editing = False
            self._modified()

    def left(self):
        self.focus_col = 0
        self.focus_col = max(self.focus_col - 1, 0)
        self._modified()

    def right(self):
        self.focus_col = 1
        self.focus_col = min(self.focus_col + 1, self.editor.columns-1)
        self._modified()

    def tab_next(self):
        self.stop_edit()
        if self.focus_col == 0:
            self.focus_col = 1
        if self.focus_col < self.editor.columns-1:
            self.focus_col += 1
        elif self.focus != len(self.lst)-1:
            self.focus_col = 0
            self.focus += 1
@@ -176,7 +202,7 @@ class KVWalker(urwid.ListWalker):
        if self.editing:
            return self.editing, self.focus
        elif self.lst:
            return KVItem(self.focus_col, False, self.maxk, *self.lst[self.focus]), self.focus
            return GridRow(self.focus_col, False, self.editor, self.lst[self.focus]), self.focus
        else:
            return None, None

@@ -187,30 +213,57 @@ class KVWalker(urwid.ListWalker):
    def get_next(self, pos):
        if pos+1 >= len(self.lst):
            return None, None
        return KVItem(None, False, self.maxk, *self.lst[pos+1]), pos+1
        return GridRow(None, False, self.editor, self.lst[pos+1]), pos+1

    def get_prev(self, pos):
        if pos-1 < 0:
            return None, None
        return KVItem(None, False, self.maxk, *self.lst[pos-1]), pos-1
        return GridRow(None, False, self.editor, self.lst[pos-1]), pos-1


class KVListBox(urwid.ListBox):
class GridListBox(urwid.ListBox):
    def __init__(self, lw):
        urwid.ListBox.__init__(self, lw)


class KVEditor(common.WWrap):
    def __init__(self, master, title, value, callback, *cb_args, **cb_kwargs):
FIRST_WIDTH_MAX = 40
FIRST_WIDTH_MIN = 20
class GridEditor(common.WWrap):
    def __init__(self, master, value, callback, *cb_args, **cb_kwargs):
        value = copy.deepcopy(value)
        self.master, self.title, self.value, self.callback = master, title, value, callback
        self.master, self.value, self.callback = master, value, callback
        self.cb_args, self.cb_kwargs = cb_args, cb_kwargs
        p = urwid.Text(title)
        p = urwid.Padding(p, align="left", width=("relative", 100))
        p = urwid.AttrWrap(p, "heading")
        self.walker = KVWalker(self.value, self)
        self.lb = KVListBox(self.walker)
        self.w = urwid.Frame(self.lb, header = p)

        first_width = 20
        if value:
            for r in value:
                assert len(r) == self.columns
                first_width = max(len(r), first_width)
        self.first_width = min(first_width, FIRST_WIDTH_MAX)

        title = urwid.Text(self.title)
        title = urwid.Padding(title, align="left", width=("relative", 100))
        title = urwid.AttrWrap(title, "heading")

        headings = []
        for i, h in enumerate(self.headings):
            c = urwid.Text(h)
            if i == 0:
                headings.append(("fixed", first_width + 2, c))
            else:
                headings.append(c)
        h = urwid.Columns(
            headings,
            dividechars = 2
        )
        h = urwid.AttrWrap(h, "heading")

        self.walker = GridWalker(self.value, self)
        self.lb = GridListBox(self.walker)
        self.w = urwid.Frame(
            self.lb,
            header = urwid.Pile([title, h])
        )
        self.master.statusbar.update("")
        self.show_empty_msg()
@@ -230,7 +283,7 @@ class KVEditor(common.WWrap):
|
||||
|
||||
def keypress(self, size, key):
|
||||
if self.walker.editing:
|
||||
if key in ["esc", "enter"]:
|
||||
if key in ["esc"]:
|
||||
self.walker.stop_edit()
|
||||
elif key == "tab":
|
||||
pf, pfc = self.walker.focus, self.walker.focus_col
|
||||
@@ -243,7 +296,11 @@ class KVEditor(common.WWrap):
|
||||
|
||||
key = common.shortcuts(key)
|
||||
if key in ["q", "esc"]:
|
||||
self.callback(self.walker.lst, *self.cb_args, **self.cb_kwargs)
|
||||
res = []
|
||||
for i in self.walker.lst:
|
||||
if any([x.strip() for x in i[0]]):
|
||||
res.append(i[0])
|
||||
self.callback(res, *self.cb_args, **self.cb_kwargs)
|
||||
self.master.pop_view()
|
||||
elif key in ["h", "left"]:
|
||||
self.walker.left()
|
||||
@@ -268,3 +325,41 @@ class KVEditor(common.WWrap):
|
||||
self.walker.start_edit()
|
||||
else:
|
||||
return self.w.keypress(size, key)
|
||||
|
||||
def is_error(self, col, val):
|
||||
return False
|
||||
|
||||
|
||||
class QueryEditor(GridEditor):
|
||||
title = "Editing query"
|
||||
columns = 2
|
||||
headings = ("Key", "Value")
|
||||
|
||||
|
||||
class HeaderEditor(GridEditor):
|
||||
title = "Editing headers"
|
||||
columns = 2
|
||||
headings = ("Key", "Value")
|
||||
|
||||
|
||||
class URLEncodedFormEditor(GridEditor):
|
||||
title = "Editing URL-encoded form"
|
||||
columns = 2
|
||||
headings = ("Key", "Value")
|
||||
|
||||
|
||||
class ReplaceEditor(GridEditor):
|
||||
title = "Editing replacement patterns"
|
||||
columns = 3
|
||||
headings = ("Filter", "Regex", "Replacement")
|
||||
def is_error(self, col, val):
|
||||
if col == 0:
|
||||
if not filt.parse(val):
|
||||
return True
|
||||
elif col == 1:
|
||||
try:
|
||||
re.compile(val)
|
||||
except re.error:
|
||||
return True
|
||||
return False
|
||||
|
||||
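A standalone sketch of the validation hook this diff introduces: the grid editor consults `is_error(col, val)` per cell, and `ReplaceEditor` overrides it to reject regexes that fail to compile. The free function below is hypothetical and mirrors only the regex column; the filter column in the real code additionally needs mitmproxy's `filt.parse`.

```python
import re

# Per-column validation in the style of ReplaceEditor.is_error: column 1
# holds a regular expression, which must compile. Returns True on error.
def is_error(col, val):
    if col == 1:
        try:
            re.compile(val)
        except re.error:
            return True
    return False
```

The editor uses the True/False result to pick an error attribute for the cell, so invalid input is highlighted rather than rejected outright.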
@@ -15,7 +15,12 @@

 import urwid
 import common
-from .. import filt
+from .. import filt, version
+
+footer = [
+    ("heading", 'mitmproxy v%s '%version.VERSION),
+    ('heading_key', "q"), ":back ",
+]

 class HelpView(urwid.ListBox):
     def __init__(self, master, help_context, state):

@@ -34,6 +39,8 @@ class HelpView(urwid.ListBox):
             self.master.header = self.state[2]
             self.master.make_view()
             return None
+        elif key == "?":
+            key = None
         return urwid.ListBox.keypress(self, size, key)

     def helptext(self):

@@ -55,28 +62,65 @@ class HelpView(urwid.ListBox):
         keys = [
             ("c", "client replay"),
             ("i", "set interception pattern"),

+            ("M", "change global default display mode"),
+                (None,
+                    common.highlight_key("automatic", "a") +
+                    [("text", ": automatic detection")]
+                ),
+                (None,
+                    common.highlight_key("hex", "h") +
+                    [("text", ": Hex")]
+                ),
+                (None,
+                    common.highlight_key("image", "i") +
+                    [("text", ": Image")]
+                ),
+                (None,
+                    common.highlight_key("javascript", "j") +
+                    [("text", ": JavaScript")]
+                ),
+                (None,
+                    common.highlight_key("json", "s") +
+                    [("text", ": JSON")]
+                ),
+                (None,
+                    common.highlight_key("urlencoded", "u") +
+                    [("text", ": URL-encoded data")]
+                ),
+                (None,
+                    common.highlight_key("raw", "r") +
+                    [("text", ": raw data")]
+                ),
+                (None,
+                    common.highlight_key("xml", "x") +
+                    [("text", ": XML")]
+                ),
             ("o", "toggle options:"),
-                (None,
-                    common.highlight_key("anticache", "a") +
-                    [("text", ": prevent cached responses")]
-                ),
-                (None,
-                    common.highlight_key("anticomp", "c") +
-                    [("text", ": prevent compressed responses")]
-                ),
-                (None,
-                    common.highlight_key("killextra", "k") +
-                    [("text", ": kill requests not part of server replay")]
-                ),
-                (None,
-                    common.highlight_key("norefresh", "n") +
-                    [("text", ": disable server replay response refresh")]
-                ),
+                (None,
+                    common.highlight_key("anticache", "a") +
+                    [("text", ": prevent cached responses")]
+                ),
+                (None,
+                    common.highlight_key("anticomp", "c") +
+                    [("text", ": prevent compressed responses")]
+                ),
+                (None,
+                    common.highlight_key("killextra", "k") +
+                    [("text", ": kill requests not part of server replay")]
+                ),
+                (None,
+                    common.highlight_key("norefresh", "n") +
+                    [("text", ": disable server replay response refresh")]
+                ),
+                (None,
+                    common.highlight_key("upstream certs", "u") +
+                    [("text", ": sniff cert info from upstream server")]
+                ),

             ("q", "quit / return to flow list"),
             ("Q", "quit without confirm prompt"),
-            ("R", "set reverse proxy mode"),
+            ("P", "set reverse proxy mode"),
+            ("R", "edit replacement patterns"),
             ("s", "set/unset script"),
             ("S", "server replay"),
             ("t", "set sticky cookie expression"),
@@ -33,8 +33,15 @@ dark = [
     # List and Connections
     ('method', 'dark cyan', 'default'),
     ('focus', 'yellow', 'default'),
-    ('goodcode', 'light green', 'default'),
+
+    ('code_200', 'light green', 'default'),
+    ('code_300', 'light blue', 'default'),
+    ('code_400', 'light red', 'default', None, "#f60", "default"),
+    ('code_500', 'light red', 'default'),
+    ('code_other', 'dark red', 'default'),
+
+    ('error', 'light red', 'default'),
+
     ('header', 'dark cyan', 'default'),
     ('highlight', 'white,bold', 'default'),
     ('intercept', 'brown', 'default', None, "#f60", "default"),

@@ -44,7 +51,9 @@ dark = [
     # Hex view
     ('offset', 'dark cyan', 'default'),

-    # KV Editor
+    # Grid Editor
     ('focusfield', 'black', 'light gray'),
+    ('focusfield_error', 'dark red', 'light gray'),
+    ('field_error', 'dark red', 'black'),
     ('editfield', 'black', 'light cyan'),
 ]
libmproxy/contrib/README (new file, 10 lines)
@@ -0,0 +1,10 @@

Contribs:

pyparsing 1.5.2, MIT license

jsbeautifier, git checkout 25/03/12, MIT license
 - Removed test directories
 - Disabled packers through a single-line modification (see "# CORTESI" comment)

libmproxy/contrib/jsbeautifier/__init__.py (new file, 1153 lines)

libmproxy/contrib/jsbeautifier/unpackers/README.specs.mkd (new file, 25 lines)
@@ -0,0 +1,25 @@
# UNPACKERS SPECIFICATIONS

Nothing very difficult: an unpacker is a submodule placed in the directory
where this file was found. Each unpacker must define three symbols:

* `PRIORITY` : integer number expressing the priority in applying this
               unpacker. Lower number means higher priority.
               Makes sense only if a source file has been packed with
               more than one packer.
* `detect(source)` : returns `True` if source is packed, otherwise, `False`.
* `unpack(source)` : takes a `source` string and unpacks it. Must always return
                     valid JavaScript. That is to say, your code should look
                     like:

```
if detect(source):
    return do_your_fancy_things_with(source)
else:
    return source
```

*You can safely define any other symbol in your module, as it will be ignored.*

`__init__` code will automatically load new unpackers, without any further step
to be accomplished. Simply drop it in this directory.
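The three-symbol contract described above can be satisfied by a module as small as this. The `PACKED:` prefix scheme is purely illustrative, not a real packer.

```python
# A minimal unpacker in the shape the spec requires: PRIORITY, detect()
# and unpack(), with unpack() always returning valid source.
PRIORITY = 2

def detect(source):
    # "Packed" here just means a marker prefix, for demonstration.
    return source.startswith("PACKED:")

def unpack(source):
    if detect(source):
        return source[len("PACKED:"):]
    return source
```

Dropped into the unpackers directory, a module like this is discovered and ordered automatically by the loader.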
libmproxy/contrib/jsbeautifier/unpackers/__init__.py (new file, 67 lines)
@@ -0,0 +1,67 @@
#
# General code for JSBeautifier unpackers infrastructure. See README.specs
# written by Stefano Sanfilippo <a.little.coder@gmail.com>
#

"""General code for JSBeautifier unpackers infrastructure."""

import pkgutil
import re
from jsbeautifier.unpackers import evalbased

# NOTE: AT THE MOMENT, IT IS DEACTIVATED FOR YOUR SECURITY: it runs js!
BLACKLIST = ['jsbeautifier.unpackers.evalbased']

class UnpackingError(Exception):
    """Badly packed source or general error. Argument is a
    meaningful description."""
    pass

def getunpackers():
    """Scans the unpackers dir, finds unpackers and add them to UNPACKERS list.
    An unpacker will be loaded only if it is a valid python module (name must
    adhere to naming conventions) and it is not blacklisted (i.e. inserted
    into BLACKLIST."""
    path = __path__
    prefix = __name__ + '.'
    unpackers = []
    interface = ['unpack', 'detect', 'PRIORITY']
    for _importer, modname, _ispkg in pkgutil.iter_modules(path, prefix):
        if 'tests' not in modname and modname not in BLACKLIST:
            try:
                module = __import__(modname, fromlist=interface)
            except ImportError:
                raise UnpackingError('Bad unpacker: %s' % modname)
            else:
                unpackers.append(module)

    return sorted(unpackers, key = lambda mod: mod.PRIORITY)

UNPACKERS = getunpackers()

def run(source, evalcode=False):
    """Runs the applicable unpackers and return unpacked source as a string."""
    for unpacker in [mod for mod in UNPACKERS if mod.detect(source)]:
        source = unpacker.unpack(source)
    if evalcode and evalbased.detect(source):
        source = evalbased.unpack(source)
    return source

def filtercomments(source):
    """NOT USED: strips trailing comments and put them at the top."""
    trailing_comments = []
    comment = True

    while comment:
        if re.search(r'^\s*\/\*', source):
            comment = source[0, source.index('*/') + 2]
        elif re.search(r'^\s*\/\/', source):
            comment = re.search(r'^\s*\/\/', source).group(0)
        else:
            comment = None

        if comment:
            source = re.sub(r'^\s+', '', source[len(comment):])
            trailing_comments.append(comment)

    return '\n'.join(trailing_comments) + source
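The `getunpackers`/`run` pair amounts to priority-ordered dispatch: sort modules by `PRIORITY` (lower runs first), then apply each unpacker whose `detect()` matches. Restated with stub modules (the `A:`/`B:` prefixes are hypothetical) so it runs without the package:

```python
from types import SimpleNamespace

# Stub "unpacker modules" exposing the same interface: PRIORITY, detect, unpack.
a = SimpleNamespace(PRIORITY=2, detect=lambda s: s.startswith("A:"),
                    unpack=lambda s: s[2:])
b = SimpleNamespace(PRIORITY=1, detect=lambda s: s.startswith("B:"),
                    unpack=lambda s: s[2:])

# Lower PRIORITY sorts first, exactly as in getunpackers().
UNPACKERS = sorted([a, b], key=lambda mod: mod.PRIORITY)

def run(source):
    # detect() is evaluated against the original source, as in the real run().
    for unpacker in [mod for mod in UNPACKERS if mod.detect(source)]:
        source = unpacker.unpack(source)
    return source
```

Note that because the matching unpackers are selected up front, an unpacker whose trigger only appears after an earlier unpacker has run will not fire in the same pass.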
libmproxy/contrib/jsbeautifier/unpackers/evalbased.py (new file, 39 lines)
@@ -0,0 +1,39 @@
#
# Unpacker for eval() based packers, a part of javascript beautifier
# by Einar Lielmanis <einar@jsbeautifier.org>
#
# written by Stefano Sanfilippo <a.little.coder@gmail.com>
#
# usage:
#
# if detect(some_string):
#     unpacked = unpack(some_string)
#

"""Unpacker for eval() based packers: runs JS code and returns result.
Works only if a JS interpreter (e.g. Mozilla's Rhino) is installed and
properly set up on host."""

from subprocess import PIPE, Popen

PRIORITY = 3

def detect(source):
    """Detects if source is likely to be eval() packed."""
    return source.strip().lower().startswith('eval(function(')

def unpack(source):
    """Runs source and return resulting code."""
    return jseval('print %s;' % source[4:]) if detect(source) else source

# In case of failure, we'll just return the original, without crashing on user.
def jseval(script):
    """Run code in the JS interpreter and return output."""
    try:
        interpreter = Popen(['js'], stdin=PIPE, stdout=PIPE)
    except OSError:
        return script
    except OSError:
        return script
    result, errors = interpreter.communicate(script)
    if interpreter.poll() or errors:
        return script
    return result
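`jseval`'s fail-soft subprocess handling can be isolated: a missing interpreter (OSError from `Popen`) or a non-zero exit hands back the input untouched. This sketch substitutes `cat` for the `js` binary so it runs on any POSIX system without a JS interpreter; the helper name is hypothetical.

```python
from subprocess import PIPE, Popen

def run_filter(argv, data):
    # Fail soft: if the binary is missing, return the input unchanged.
    try:
        proc = Popen(argv, stdin=PIPE, stdout=PIPE)
    except OSError:
        return data
    result, _ = proc.communicate(data)
    # Non-zero exit status also falls back to the original input.
    if proc.poll():
        return data
    return result
```

The same shape protects mitmproxy users from a crash when the optional interpreter is absent.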
libmproxy/contrib/jsbeautifier/unpackers/javascriptobfuscator.py (new file, 58 lines)
@@ -0,0 +1,58 @@
#
# simple unpacker/deobfuscator for scripts messed up with
# javascriptobfuscator.com
#
# written by Einar Lielmanis <einar@jsbeautifier.org>
# rewritten in Python by Stefano Sanfilippo <a.little.coder@gmail.com>
#
# Will always return valid javascript: if `detect()` is false, `code` is
# returned, unmodified.
#
# usage:
#
# if javascriptobfuscator.detect(some_string):
#     some_string = javascriptobfuscator.unpack(some_string)
#

"""deobfuscator for scripts messed up with JavascriptObfuscator.com"""

import re

PRIORITY = 1

def smartsplit(code):
    """Split `code` at " symbol, only if it is not escaped."""
    strings = []
    pos = 0
    while pos < len(code):
        if code[pos] == '"':
            word = '' # new word
            pos += 1
            while pos < len(code):
                if code[pos] == '"':
                    break
                if code[pos] == '\\':
                    word += '\\'
                    pos += 1
                word += code[pos]
                pos += 1
            strings.append('"%s"' % word)
        pos += 1
    return strings

def detect(code):
    """Detects if `code` is JavascriptObfuscator.com packed."""
    # prefer `is not` idiom, so that a true boolean is returned
    return (re.search(r'^var _0x[a-f0-9]+ ?\= ?\[', code) is not None)

def unpack(code):
    """Unpacks JavascriptObfuscator.com packed code."""
    if detect(code):
        matches = re.search(r'var (_0x[a-f\d]+) ?\= ?\[(.*?)\];', code)
        if matches:
            variable = matches.group(1)
            dictionary = smartsplit(matches.group(2))
            code = code[len(matches.group(0)):]
            for key, value in enumerate(dictionary):
                code = code.replace(r'%s[%s]' % (variable, key), value)
    return code
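A toy run of the substitution `unpack` performs: locate the `var _0x… = […]` dictionary with the same regex, drop it, and inline its entries. The sample avoids escaped quotes, so a plain comma split stands in for `smartsplit`.

```python
import re

code = 'var _0xab12= ["alert","Hi"]; _0xab12[0](_0xab12[1]);'

# Same dictionary-matching regex as the unpacker.
m = re.search(r'var (_0x[a-f\d]+) ?\= ?\[(.*?)\];', code)
variable = m.group(1)
entries = ['"%s"' % s.strip(' "') for s in m.group(2).split(",")]

# Strip the declaration, then inline each indexed lookup.
rest = code[len(m.group(0)):]
for key, value in enumerate(entries):
    rest = rest.replace('%s[%s]' % (variable, key), value)
print(rest.strip())  # "alert"("Hi");
```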
libmproxy/contrib/jsbeautifier/unpackers/myobfuscate.py (new file, 86 lines)
@@ -0,0 +1,86 @@
#
# deobfuscator for scripts messed up with myobfuscate.com
# by Einar Lielmanis <einar@jsbeautifier.org>
#
# written by Stefano Sanfilippo <a.little.coder@gmail.com>
#
# usage:
#
# if detect(some_string):
#     unpacked = unpack(some_string)
#

# CAVEAT by Einar Lielmanis
#
# You really don't want to obfuscate your scripts there: they're tracking
# your unpackings, your script gets turned into something like this,
# as of 2011-08-26:
#
#   var _escape = 'your_script_escaped';
#   var _111 = document.createElement('script');
#   _111.src = 'http://api.www.myobfuscate.com/?getsrc=ok' +
#              '&ref=' + encodeURIComponent(document.referrer) +
#              '&url=' + encodeURIComponent(document.URL);
#   var 000 = document.getElementsByTagName('head')[0];
#   000.appendChild(_111);
#   document.write(unescape(_escape));
#

"""Deobfuscator for scripts messed up with MyObfuscate.com"""

import re
import base64

# Python 2 retrocompatibility
# pylint: disable=F0401
# pylint: disable=E0611
try:
    from urllib import unquote
except ImportError:
    from urllib.parse import unquote

from jsbeautifier.unpackers import UnpackingError

PRIORITY = 1

CAVEAT = """//
// Unpacker warning: be careful when using myobfuscate.com for your projects:
// scripts obfuscated by the free online version call back home.
//

"""

SIGNATURE = (r'["\x41\x42\x43\x44\x45\x46\x47\x48\x49\x4A\x4B\x4C\x4D\x4E\x4F'
             r'\x50\x51\x52\x53\x54\x55\x56\x57\x58\x59\x5A\x61\x62\x63\x64\x65'
             r'\x66\x67\x68\x69\x6A\x6B\x6C\x6D\x6E\x6F\x70\x71\x72\x73\x74\x75'
             r'\x76\x77\x78\x79\x7A\x30\x31\x32\x33\x34\x35\x36\x37\x38\x39\x2B'
             r'\x2F\x3D","","\x63\x68\x61\x72\x41\x74","\x69\x6E\x64\x65\x78'
             r'\x4F\x66","\x66\x72\x6F\x6D\x43\x68\x61\x72\x43\x6F\x64\x65","'
             r'\x6C\x65\x6E\x67\x74\x68"]')

def detect(source):
    """Detects MyObfuscate.com packer."""
    return SIGNATURE in source

def unpack(source):
    """Unpacks js code packed with MyObfuscate.com"""
    if not detect(source):
        return source
    payload = unquote(_filter(source))
    match = re.search(r"^var _escape\='<script>(.*)<\/script>'",
                      payload, re.DOTALL)
    polished = match.group(1) if match else source
    return CAVEAT + polished

def _filter(source):
    """Extracts and decode payload (original file) from `source`"""
    try:
        varname = re.search(r'eval\(\w+\(\w+\((\w+)\)\)\);', source).group(1)
        reverse = re.search(r"var +%s *\= *'(.*)';" % varname, source).group(1)
    except AttributeError:
        raise UnpackingError('Malformed MyObfuscate data.')
    try:
        return base64.b64decode(reverse[::-1].encode('utf8')).decode('utf8')
    except TypeError:
        raise UnpackingError('MyObfuscate payload is not base64-encoded.')
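The decode step in `_filter` relies on the payload being stored reversed; round-tripping a small sample shows why `reverse[::-1]` must happen before `b64decode`:

```python
import base64

# Encode a sample the way the obfuscator stores it: base64, then reversed.
reverse = base64.b64encode(b"alert(1);").decode()[::-1]

# Decoding requires undoing the reversal first.
print(base64.b64decode(reverse[::-1]))  # b'alert(1);'
```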
libmproxy/contrib/jsbeautifier/unpackers/packer.py (new file, 104 lines)
@@ -0,0 +1,104 @@
#
# Unpacker for Dean Edward's p.a.c.k.e.r, a part of javascript beautifier
# by Einar Lielmanis <einar@jsbeautifier.org>
#
# written by Stefano Sanfilippo <a.little.coder@gmail.com>
#
# usage:
#
# if detect(some_string):
#     unpacked = unpack(some_string)
#

"""Unpacker for Dean Edward's p.a.c.k.e.r"""

import re
import string
from jsbeautifier.unpackers import UnpackingError

PRIORITY = 1

def detect(source):
    """Detects whether `source` is P.A.C.K.E.R. coded."""
    return source.replace(' ', '').startswith('eval(function(p,a,c,k,e,r')

def unpack(source):
    """Unpacks P.A.C.K.E.R. packed js code."""
    payload, symtab, radix, count = _filterargs(source)

    if count != len(symtab):
        raise UnpackingError('Malformed p.a.c.k.e.r. symtab.')

    try:
        unbase = Unbaser(radix)
    except TypeError:
        raise UnpackingError('Unknown p.a.c.k.e.r. encoding.')

    def lookup(match):
        """Look up symbols in the synthetic symtab."""
        word = match.group(0)
        return symtab[unbase(word)] or word

    source = re.sub(r'\b\w+\b', lookup, payload)
    return _replacestrings(source)

def _filterargs(source):
    """Juice from a source file the four args needed by decoder."""
    argsregex = (r"}\('(.*)', *(\d+), *(\d+), *'(.*)'\."
                 r"split\('\|'\), *(\d+), *(.*)\)\)")
    args = re.search(argsregex, source, re.DOTALL).groups()

    try:
        return args[0], args[3].split('|'), int(args[1]), int(args[2])
    except ValueError:
        raise UnpackingError('Corrupted p.a.c.k.e.r. data.')

def _replacestrings(source):
    """Strip string lookup table (list) and replace values in source."""
    match = re.search(r'var *(_\w+)\=\["(.*?)"\];', source, re.DOTALL)

    if match:
        varname, strings = match.groups()
        startpoint = len(match.group(0))
        lookup = strings.split('","')
        variable = '%s[%%d]' % varname
        for index, value in enumerate(lookup):
            source = source.replace(variable % index, '"%s"' % value)
        return source[startpoint:]
    return source


class Unbaser(object):
    """Functor for a given base. Will efficiently convert
    strings to natural numbers."""
    ALPHABET = {
        62 : '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ',
        95 : (' !"#$%&\'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ'
              '[\]^_`abcdefghijklmnopqrstuvwxyz{|}~')
    }

    def __init__(self, base):
        self.base = base

        # If base can be handled by int() builtin, let it do it for us
        if 2 <= base <= 36:
            self.unbase = lambda string: int(string, base)
        else:
            # Build conversion dictionary cache
            try:
                self.dictionary = dict((cipher, index) for
                    index, cipher in enumerate(self.ALPHABET[base]))
            except KeyError:
                raise TypeError('Unsupported base encoding.')

            self.unbase = self._dictunbaser

    def __call__(self, string):
        return self.unbase(string)

    def _dictunbaser(self, string):
        """Decodes a value to an integer."""
        ret = 0
        for index, cipher in enumerate(string[::-1]):
            ret += (self.base ** index) * self.dictionary[cipher]
        return ret
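`Unbaser._dictunbaser` is positional decoding against an alphabet, used when the radix exceeds what `int()` accepts (bases above 36). A base-62-only restatement of the same arithmetic:

```python
# Same digit alphabet as Unbaser.ALPHABET[62].
ALPHABET = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'
digit = {c: i for i, c in enumerate(ALPHABET)}

def unbase62(s):
    # Walk the string least-significant digit first, as _dictunbaser does.
    ret = 0
    for index, cipher in enumerate(s[::-1]):
        ret += (62 ** index) * digit[cipher]
    return ret
```

So "10" decodes to 62 and "A" (position 36 in the alphabet) to 36.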
libmproxy/contrib/jsbeautifier/unpackers/urlencode.py (new file, 34 lines)
@@ -0,0 +1,34 @@
#
# Trivial bookmarklet/escaped script detector for the javascript beautifier
# written by Einar Lielmanis <einar@jsbeautifier.org>
# rewritten in Python by Stefano Sanfilippo <a.little.coder@gmail.com>
#
# Will always return valid javascript: if `detect()` is false, `code` is
# returned, unmodified.
#
# usage:
#
# some_string = urlencode.unpack(some_string)
#

"""Bookmarklet/escaped script unpacker."""

# Python 2 retrocompatibility
# pylint: disable=F0401
# pylint: disable=E0611
try:
    from urllib import unquote_plus
except ImportError:
    from urllib.parse import unquote_plus

PRIORITY = 0

def detect(code):
    """Detects if a scriptlet is urlencoded."""
    # the fact that script doesn't contain any space, but has %20 instead
    # should be sufficient check for now.
    return ' ' not in code and ('%20' in code or code.count('%') > 3)

def unpack(code):
    """URL decode `code` source string."""
    return unquote_plus(code) if detect(code) else code
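The heuristic and the decode step in one runnable snippet (the sample scriptlet is illustrative); on Python 3 the import resolves to `urllib.parse`:

```python
from urllib.parse import unquote_plus

def detect(code):
    # No literal spaces but %20 (or several % escapes) suggests urlencoding.
    return ' ' not in code and ('%20' in code or code.count('%') > 3)

packed = "alert(%22hi%20there%22)%3B"
print(unquote_plus(packed) if detect(packed) else packed)  # alert("hi there");
```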
@@ -83,9 +83,7 @@ class Master:
     def run(self):
         global should_exit
         should_exit = False
         if self.server:
-            slave = Slave(self.masterq, self.server)
-            slave.start()
+            self.server.start_slave(Slave, self.masterq)
         while not should_exit:
             self.tick(self.masterq)
         self.shutdown()
@@ -28,7 +28,9 @@ class Options(object):
         "keepserving",
         "kill",
         "no_server",
+        "nopop",
         "refresh_server_playback",
+        "replacements",
         "rfile",
         "rheaders",
         "server_replay",

@@ -93,11 +95,16 @@ class DumpMaster(flow.FlowMaster):
         except IOError, v:
             raise DumpError(v.strerror)

+        if options.replacements:
+            for i in options.replacements:
+                self.replacehooks.add(*i)
+
         if options.server_replay:
             self.start_server_playback(
                 self._readflow(options.server_replay),
                 options.kill, options.rheaders,
-                not options.keepserving
+                not options.keepserving,
+                options.nopop
             )

         if options.client_replay:

@@ -139,7 +146,6 @@ class DumpMaster(flow.FlowMaster):

     def handle_request(self, r):
         f = flow.FlowMaster.handle_request(self, r)
-        self.add_event("Request: %s"%str_request(r))
         if f:
             r._ack()
         return f
@@ -40,7 +40,6 @@
 """
 import re, sys
 import contrib.pyparsing as pp
-import flow


 class _Token:

@@ -87,6 +86,7 @@ class _Rex(_Action):
         except:
             raise ValueError, "Cannot compile expression."


 def _check_content_type(expr, o):
     val = o.headers["content-type"]
     if val and re.search(expr, val[0]):
@@ -21,11 +21,68 @@ import hashlib, Cookie, cookielib, copy, re, urlparse
 import time
 import tnetstring, filt, script, utils, encoding, proxy
 from email.utils import parsedate_tz, formatdate, mktime_tz
-import controller, version
+import controller, version, certutils

 HDR_FORM_URLENCODED = "application/x-www-form-urlencoded"


+class ReplaceHooks:
+    def __init__(self):
+        self.lst = []
+
+    def add(self, fpatt, rex, s):
+        """
+            Add a replacement hook.
+
+            fpatt: A string specifying a filter pattern.
+            rex: A regular expression.
+            s: The replacement string
+
+            Returns True if hook was added, False if the pattern could not be
+            parsed.
+        """
+        cpatt = filt.parse(fpatt)
+        if not cpatt:
+            return False
+        try:
+            re.compile(rex)
+        except re.error:
+            return False
+        self.lst.append((fpatt, rex, s, cpatt))
+        return True
+
+    def remove(self, fpatt, rex, s):
+        """
+            Remove a hook.
+
+            patt: A string specifying a filter pattern.
+            func: Optional callable. If not specified, all hooks matching patt are removed.
+        """
+        for i in range(len(self.lst)-1, -1, -1):
+            if (fpatt, rex, s) == self.lst[i][:3]:
+                del self.lst[i]
+
+    def get_specs(self):
+        """
+            Retrieve the hook specifcations. Returns a list of (fpatt, rex, s) tuples.
+        """
+        return [i[:3] for i in self.lst]
+
+    def count(self):
+        return len(self.lst)
+
+    def run(self, f):
+        for _, rex, s, cpatt in self.lst:
+            if cpatt(f):
+                if f.response:
+                    f.response.replace(rex, s)
+                else:
+                    f.request.replace(rex, s)
+
+    def clear(self):
+        self.lst = []
+
+
 class ScriptContext:
     def __init__(self, master):
         self._master = master
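`ReplaceHooks.remove` walks the list backwards so `del` does not shift indices it has yet to visit. The idiom in isolation, on plain tuples:

```python
# Delete matching entries while iterating: reverse index order means a
# deletion never renumbers the entries still to be examined.
lst = [("~q", "foo", "bar"), ("~s", "baz", "qux"), ("~q", "foo", "bar")]
for i in range(len(lst) - 1, -1, -1):
    if lst[i] == ("~q", "foo", "bar"):
        del lst[i]
print(lst)  # [('~s', 'baz', 'qux')]
```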
@@ -75,7 +132,6 @@ class ODict:
     def __getitem__(self, k):
         """
             Returns a list of values matching key.
-
         """
         ret = []
         k = self._kconv(k)

@@ -125,6 +181,12 @@ class ODict:
     def add(self, key, value):
         self.lst.append([key, str(value)])

+    def get(self, k, d=None):
+        if k in self:
+            return self[k]
+        else:
+            return d
+
     def _get_state(self):
         return [tuple(i) for i in self.lst]

@@ -176,8 +238,11 @@ class ODict:

     def replace(self, pattern, repl, *args, **kwargs):
         """
-            Replaces a regular expression pattern with repl in both keys
-            and values. Returns the number of replacements made.
+            Replaces a regular expression pattern with repl in both keys and
+            values. Encoded content will be decoded before replacement, and
+            re-encoded afterwards.
+
+            Returns the number of replacements made.
         """
         nlst, count = [], 0
         for i in self.lst:

@@ -199,6 +264,34 @@ class ODictCaseless(ODict):
         return s.lower()


+class decoded(object):
+    """
+        A context manager that decodes a request, response or error, and then
+        re-encodes it with the same encoding after execution of the block.
+
+        Example:
+
+            with decoded(request):
+                request.content = request.content.replace("foo", "bar")
+    """
+    def __init__(self, o):
+        self.o = o
+        ce = o.headers["content-encoding"]
+        if ce and ce[0] in encoding.ENCODINGS:
+            self.ce = ce[0]
+        else:
+            self.ce = None
+
+    def __enter__(self):
+        if self.ce:
+            self.o.decode()
+
+    def __exit__(self, type, value, tb):
+        if self.ce:
+            self.o.encode(self.ce)
+
+
 class HTTPMsg(controller.Msg):
     def decode(self):
         """
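A self-contained analog of the `decoded` context manager this hunk adds: decode on entry, re-encode with the same coding on exit, so the body of the `with` block always sees plaintext. Here zlib stands in for mitmproxy's `encoding` module and the `Msg` class is hypothetical.

```python
import zlib

class Msg:
    """Stand-in for an HTTP message with optionally compressed content."""
    def __init__(self, content, encoded=True):
        self.content = zlib.compress(content) if encoded else content
        self.ce = "deflate" if encoded else None

    def decode(self):
        self.content = zlib.decompress(self.content)

    def encode(self, ce):
        self.content = zlib.compress(self.content)

class decoded:
    def __init__(self, o):
        self.o = o
        self.ce = o.ce

    def __enter__(self):
        if self.ce:
            self.o.decode()

    def __exit__(self, type, value, tb):
        if self.ce:
            self.o.encode(self.ce)

m = Msg(b"foo bar")
with decoded(m):
    # Inside the block the content is plaintext, regardless of encoding.
    m.content = m.content.replace(b"foo", b"baz")
assert zlib.decompress(m.content) == b"baz bar"
```

This is exactly why `Request.replace` and `Response.replace` below can run `re.subn` over the body without caring whether it was gzip- or deflate-encoded on the wire.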
@@ -231,7 +324,7 @@ class Request(HTTPMsg):

     Exposes the following attributes:

-        client_conn: ClientConnection object, or None if this is a replay.
+        client_conn: ClientConnect object, or None if this is a replay.
         headers: ODictCaseless object
         content: Content of the request, or None

@@ -447,10 +540,13 @@ class Request(HTTPMsg):
     def replace(self, pattern, repl, *args, **kwargs):
         """
             Replaces a regular expression pattern with repl in both the headers
-            and the body of the request. Returns the number of replacements
-            made.
+            and the body of the request. Encoded content will be decoded before
+            replacement, and re-encoded afterwards.
+
+            Returns the number of replacements made.
         """
-        self.content, c = re.subn(pattern, repl, self.content, *args, **kwargs)
+        with decoded(self):
+            self.content, c = re.subn(pattern, repl, self.content, *args, **kwargs)
         self.path, pc = re.subn(pattern, repl, self.path, *args, **kwargs)
         c += pc
         c += self.headers.replace(pattern, repl, *args, **kwargs)

@@ -470,11 +566,12 @@ class Response(HTTPMsg):
         content: Response content
         timestamp: Seconds since the epoch
     """
-    def __init__(self, request, code, msg, headers, content, timestamp=None):
+    def __init__(self, request, code, msg, headers, content, der_cert, timestamp=None):
         assert isinstance(headers, ODictCaseless)
         self.request = request
         self.code, self.msg = code, msg
         self.headers, self.content = headers, content
+        self.der_cert = der_cert
         self.timestamp = timestamp or utils.timestamp()
         controller.Msg.__init__(self)
         self.replay = False

@@ -544,6 +641,14 @@ class Response(HTTPMsg):
         self.headers = ODictCaseless._from_state(state["headers"])
         self.content = state["content"]
         self.timestamp = state["timestamp"]
+        self.der_cert = state["der_cert"]
+
+    def get_cert(self):
+        """
+            Returns a certutils.SSLCert object, or None.
+        """
+        if self.der_cert:
+            return certutils.SSLCert.from_der(self.der_cert)

     def _get_state(self):
         return dict(

@@ -551,6 +656,7 @@ class Response(HTTPMsg):
             msg = self.msg,
             headers = self.headers._get_state(),
             timestamp = self.timestamp,
+            der_cert = self.der_cert,
             content = self.content
         )

@@ -562,6 +668,7 @@ class Response(HTTPMsg):
             str(state["msg"]),
             ODictCaseless._from_state(state["headers"]),
             state["content"],
+            state.get("der_cert"),
             state["timestamp"],
         )

@@ -602,10 +709,13 @@ class Response(HTTPMsg):
     def replace(self, pattern, repl, *args, **kwargs):
         """
             Replaces a regular expression pattern with repl in both the headers
-            and the body of the response. Returns the number of replacements
-            made.
+            and the body of the response. Encoded content will be decoded
+            before replacement, and re-encoded afterwards.
+
+            Returns the number of replacements made.
         """
-        self.content, c = re.subn(pattern, repl, self.content, *args, **kwargs)
+        with decoded(self):
+            self.content, c = re.subn(pattern, repl, self.content, *args, **kwargs)
         c += self.headers.replace(pattern, repl, *args, **kwargs)
         return c
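`Request.replace` totals its count across body, path, and headers; `re.subn` returns a `(new_string, n)` pair and the per-part counts are summed. The accumulation in isolation, on plain strings:

```python
import re

# Each subn call yields the rewritten string plus its replacement count.
content, c = re.subn("foo", "bar", "foo foo")
path, pc = re.subn("foo", "bar", "/foo")
c += pc
print(content, path, c)  # bar bar /bar 3
```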
@@ -650,15 +760,21 @@ class ClientConnect(controller.Msg):
         return self._get_state() == other._get_state()

     def _load_state(self, state):
-        self.address = state
+        self.close = True
+        self.requestcount = state["requestcount"]

     def _get_state(self):
-        return list(self.address) if self.address else None
+        return dict(
+            address = list(self.address),
+            requestcount = self.requestcount,
+        )

     @classmethod
     def _from_state(klass, state):
         if state:
-            return klass(state)
+            k = klass(state["address"])
+            k._load_state(state)
+            return k
         else:
             return None

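The state format switches from a bare address list to a dict so the per-connection request count survives serialization. A reduced, runnable model of the round-trip (class body cut down for illustration):

```python
class ClientConnect:
    # Reduced sketch of the class as changed above: _get_state() emits a
    # dict carrying both the address and the request count, and
    # _from_state() rebuilds the object from it.
    def __init__(self, address):
        self.address = address
        self.requestcount = 0

    def _load_state(self, state):
        self.requestcount = state["requestcount"]

    def _get_state(self):
        return dict(
            address = list(self.address),
            requestcount = self.requestcount,
        )

    @classmethod
    def _from_state(klass, state):
        if state:
            k = klass(state["address"])
            k._load_state(state)
            return k
        else:
            return None

c = ClientConnect(("127.0.0.1", 4242))
c.requestcount = 3
restored = ClientConnect._from_state(c._get_state())
# restored.requestcount == 3, restored.address == ["127.0.0.1", 4242]
```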
@@ -725,6 +841,8 @@ class Error(controller.Msg):
             Replaces a regular expression pattern with repl in both the headers
             and the body of the request. Returns the number of replacements
             made.
+
+            FIXME: Is replace useful on an Error object??
         """
         self.msg, c = re.subn(pattern, repl, self.msg, *args, **kwargs)
         return c
@@ -768,12 +886,12 @@ class ClientPlaybackState:


 class ServerPlaybackState:
-    def __init__(self, headers, flows, exit):
+    def __init__(self, headers, flows, exit, nopop):
         """
             headers: Case-insensitive list of request headers that should be
            included in request-response matching.
         """
-        self.headers, self.exit = headers, exit
+        self.headers, self.exit, self.nopop = headers, exit, nopop
         self.fmap = {}
         for i in flows:
             if i.response:

@@ -815,7 +933,12 @@ class ServerPlaybackState:
         l = self.fmap.get(self._hash(request))
         if not l:
             return None
-        return l.pop(0)
+
+        if self.nopop:
+            return l[0]
+        else:
+            return l.pop(0)


 class StickyCookieState:
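The new `nopop` flag changes server playback from consume-once matching to serve-repeatedly matching. A reduced model of the idea, with mitmproxy's header-aware request hash replaced by a plain key for illustration:

```python
class PlaybackState:
    # Reduced model of ServerPlaybackState: recorded responses are
    # bucketed by a request hash; nopop peeks instead of popping, so the
    # same recorded response can answer the request any number of times.
    def __init__(self, flows, nopop):
        self.nopop = nopop
        self.fmap = {}
        for key, resp in flows:
            self.fmap.setdefault(key, []).append(resp)

    def next_flow(self, key):
        l = self.fmap.get(key)
        if not l:
            return None
        if self.nopop:
            return l[0]
        else:
            return l.pop(0)

once = PlaybackState([("GET /", "a"), ("GET /", "b")], nopop=False)
# serves "a", then "b", then None
forever = PlaybackState([("GET /", "a")], nopop=True)
# serves "a" every time
```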
@@ -925,21 +1048,16 @@ class Flow:
         f._load_state(state)
         return f

-    def _get_state(self, nobackup=False):
+    def _get_state(self):
         d = dict(
             request = self.request._get_state() if self.request else None,
             response = self.response._get_state() if self.response else None,
             error = self.error._get_state() if self.error else None,
             version = version.IVERSION
         )
-        if nobackup:
-            d["backup"] = None
-        else:
-            d["backup"] = self._backup
         return d

     def _load_state(self, state):
-        self._backup = state["backup"]
         if self.request:
             self.request._load_state(state["request"])
         else:
@@ -972,12 +1090,13 @@ class Flow:
         else:
             return False

-    def backup(self):
+    def backup(self, force=False):
         """
             Save a backup of this Flow, which can be reverted to using a
             call to .revert().
         """
-        self._backup = self._get_state(nobackup=True)
+        if not self._backup:
+            self._backup = self._get_state()

     def revert(self):
         """
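With the snapshot no longer embedded in the state dict, backup() now refuses to clobber an existing snapshot. A reduced model of the backup/revert cycle; the force branch and the revert body are outside the hunks shown here, so their exact shape below is an assumption:

```python
class Flow:
    # Reduced model: backup() snapshots state only the first time, so
    # edits made after the first backup cannot silently replace the
    # original snapshot. force=True overriding that check is an assumed
    # reading of the new signature above.
    def __init__(self, content):
        self.content = content
        self._backup = None

    def _get_state(self):
        return dict(content=self.content)

    def backup(self, force=False):
        if force or not self._backup:
            self._backup = self._get_state()

    def revert(self):
        if self._backup:
            self.content = self._backup["content"]
            self._backup = None

f = Flow("original")
f.backup()
f.content = "edited"
f.backup()          # no-op: a snapshot already exists
f.revert()
# f.content == "original"
```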
@@ -1029,7 +1148,10 @@ class Flow:
     def replace(self, pattern, repl, *args, **kwargs):
         """
             Replaces a regular expression pattern with repl in all parts of the
-            flow . Returns the number of replacements made.
+            flow. Encoded content will be decoded before replacement, and
+            re-encoded afterwards.
+
+            Returns the number of replacements made.
         """
         c = self.request.replace(pattern, repl, *args, **kwargs)
         if self.response:
@@ -1182,6 +1304,7 @@ class FlowMaster(controller.Master):
         self.anticache = False
         self.anticomp = False
         self.refresh_server_playback = False
+        self.replacehooks = ReplaceHooks()

     def add_event(self, e, level="info"):
         """

@@ -1251,12 +1374,12 @@ class FlowMaster(controller.Master):
     def stop_client_playback(self):
         self.client_playback = None

-    def start_server_playback(self, flows, kill, headers, exit):
+    def start_server_playback(self, flows, kill, headers, exit, nopop):
         """
             flows: List of flows.
             kill: Boolean, should we kill requests not part of the replay?
         """
-        self.server_playback = ServerPlaybackState(headers, flows, exit)
+        self.server_playback = ServerPlaybackState(headers, flows, exit, nopop)
         self.kill_nonreplay = kill

     def stop_server_playback(self):

@@ -1391,6 +1514,7 @@ class FlowMaster(controller.Master):

     def handle_error(self, r):
         f = self.state.add_error(r)
+        self.replacehooks.run(f)
         if f:
             self.run_script_hook("error", f)
         if self.client_playback:

@@ -1400,12 +1524,14 @@ class FlowMaster(controller.Master):

     def handle_request(self, r):
         f = self.state.add_request(r)
+        self.replacehooks.run(f)
         self.run_script_hook("request", f)
         self.process_new_request(f)
         return f

     def handle_response(self, r):
         f = self.state.add_response(r)
+        self.replacehooks.run(f)
         if f:
             self.run_script_hook("response", f)
         if self.client_playback:

@@ -21,7 +21,7 @@
 import sys, os, string, socket, time
 import shutil, tempfile, threading
 import optparse, SocketServer, ssl
-import utils, flow
+import utils, flow, certutils

 NAME = "mitmproxy"

@@ -35,12 +35,13 @@ class ProxyError(Exception):


 class ProxyConfig:
-    def __init__(self, certfile = None, ciphers = None, cacert = None, cert_wait_time=0, body_size_limit = None, reverse_proxy=None):
+    def __init__(self, certfile = None, ciphers = None, cacert = None, cert_wait_time=0, upstream_cert=False, body_size_limit = None, reverse_proxy=None):
         self.certfile = certfile
         self.ciphers = ciphers
         self.cacert = cacert
         self.certdir = None
         self.cert_wait_time = cert_wait_time
+        self.upstream_cert = upstream_cert
         self.body_size_limit = body_size_limit
         self.reverse_proxy = reverse_proxy

@@ -106,7 +107,7 @@ def read_chunked(fp, limit):

 def read_http_body(rfile, connection, headers, all, limit):
     if 'transfer-encoding' in headers:
-        if not ",".join(headers["transfer-encoding"]) == "chunked":
+        if not ",".join(headers["transfer-encoding"]).lower() == "chunked":
             raise IOError('Invalid transfer-encoding')
         content = read_chunked(rfile, limit)
     elif "content-length" in headers:
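The one-line fix makes the Transfer-Encoding comparison case-insensitive, since servers legally send values like `Chunked`. The check in isolation, with headers modeled as a plain dict of value lists:

```python
def is_chunked(headers):
    # Case-insensitive check mirroring the fixed line above: join the
    # header's values, lower-case, and compare against "chunked".
    te = headers.get("transfer-encoding")
    return bool(te) and ",".join(te).lower() == "chunked"

is_chunked({"transfer-encoding": ["Chunked"]})  # → True
is_chunked({"transfer-encoding": ["gzip"]})     # → False
```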
@@ -184,10 +185,14 @@ class FileLike:
             result += data
         return result

-    def readline(self):
+    def readline(self, size = None):
         result = ''
+        bytes_read = 0
         while True:
+            if size is not None and bytes_read >= size:
+                break
             ch = self.read(1)
+            bytes_read += 1
             if not ch:
                 break
             else:
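The new size argument caps how many bytes readline() will consume, protecting header parsing from unbounded lines. A self-contained copy of the revised loop, wrapping any object with a read() method (the tail of the else branch is outside the hunk, so its exact lines are reconstructed here):

```python
import io

class FileLike:
    # Self-contained copy of the size-capped readline() above;
    # self.o is the wrapped file-like object.
    def __init__(self, o):
        self.o = o

    def read(self, length):
        return self.o.read(length)

    def readline(self, size=None):
        result = ''
        bytes_read = 0
        while True:
            if size is not None and bytes_read >= size:
                break
            ch = self.read(1)
            bytes_read += 1
            if not ch:
                break
            else:
                result += ch
                if ch == '\n':
                    break
        return result

f = FileLike(io.StringIO("first line\nsecond"))
f.readline(5)   # → 'first'  (stops after 5 bytes, no newline needed)
```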
@@ -224,6 +229,7 @@ class ServerConnection:
         self.port = request.port
         self.scheme = request.scheme
         self.close = False
+        self.cert = None
         self.server, self.rfile, self.wfile = None, None, None
         self.connect()

@@ -234,6 +240,8 @@ class ServerConnection:
             if self.scheme == "https":
                 server = ssl.wrap_socket(server)
             server.connect((addr, self.port))
+            if self.scheme == "https":
+                self.cert = server.getpeercert(True)
         except socket.error, err:
             raise ProxyError(502, 'Error connecting to "%s": %s' % (self.host, err))
         self.server = server

@@ -270,7 +278,7 @@ class ServerConnection:
             content = ""
         else:
             content = read_http_body(self.rfile, self, headers, True, self.config.body_size_limit)
-        return flow.Response(self.request, code, msg, headers, content)
+        return flow.Response(self.request, code, msg, headers, content, self.cert)

     def terminate(self):
         try:
@@ -343,11 +351,16 @@ class ProxyHandler(SocketServer.StreamRequestHandler):
         if server:
             server.terminate()

-    def find_cert(self, host):
+    def find_cert(self, host, port):
         if self.config.certfile:
             return self.config.certfile
         else:
-            ret = utils.dummy_cert(self.config.certdir, self.config.cacert, host)
+            sans = []
+            if self.config.upstream_cert:
+                cert = certutils.get_remote_cert(host, port)
+                sans = cert.altnames
+                host = cert.cn
+            ret = certutils.dummy_cert(self.config.certdir, self.config.cacert, host, sans)
             time.sleep(self.config.cert_wait_time)
            if not ret:
                 raise ProxyError(502, "mitmproxy: Unable to generate dummy cert.")
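When upstream_cert is enabled, the dummy certificate's common name and subject alternative names are copied from the real server's certificate instead of the literal hostname the client sent. The decision logic in isolation, with the network fetch injected as a callable so the sketch runs offline (helper names are illustrative, not mitmproxy's API):

```python
def plan_dummy_cert(host, port, upstream_cert, get_remote_cert):
    # Mirrors the find_cert() branch above: with upstream_cert on, CN and
    # SANs come from the remote certificate; otherwise only the requested
    # host is used and the SAN list stays empty.
    sans = []
    if upstream_cert:
        cn, remote_sans = get_remote_cert(host, port)
        sans = remote_sans
        host = cn
    return host, sans

fake_fetch = lambda host, port: ("example.com", ["www.example.com"])
plan_dummy_cert("93.184.216.34", 443, True, fake_fetch)
# → ('example.com', ['www.example.com'])
```

Injecting the fetcher is just for testability here; the real code calls certutils.get_remote_cert directly, as the diff shows.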
@@ -374,7 +387,7 @@ class ProxyHandler(SocketServer.StreamRequestHandler):
             )
             self.wfile.flush()
             kwargs = dict(
-                certfile = self.find_cert(host),
+                certfile = self.find_cert(host, port),
                 keyfile = self.config.certfile or self.config.cacert,
                 server_side = True,
                 ssl_version = ssl.PROTOCOL_SSLv23,
@@ -468,6 +481,7 @@ ServerBase.daemon_threads = True        # Terminate workers when main thread terminates
 class ProxyServer(ServerBase):
     request_queue_size = 20
     allow_reuse_address = True
+    bound = True
     def __init__(self, config, port, address=''):
         """
             Raises ProxyServerError if there's a startup problem.
@@ -481,6 +495,10 @@ class ProxyServer(ServerBase):
         self.certdir = tempfile.mkdtemp(prefix="mitmproxy")
         config.certdir = self.certdir

+    def start_slave(self, klass, masterq):
+        slave = klass(masterq, self)
+        slave.start()
+
     def set_mqueue(self, q):
         self.masterq = q

@@ -495,6 +513,18 @@ class ProxyServer(ServerBase):
             pass


+class DummyServer:
+    bound = False
+    def __init__(self, config):
+        self.config = config
+
+    def start_slave(self, klass, masterq):
+        pass
+
+    def shutdown(self):
+        pass
+
+
 # Command-line utils
 def certificate_option_group(parser):
     group = optparse.OptionGroup(parser, "SSL")
@@ -520,7 +550,7 @@ def process_proxy_options(parser, options):
         cacert = os.path.join(options.confdir, "mitmproxy-ca.pem")
         cacert = os.path.expanduser(cacert)
         if not os.path.exists(cacert):
-            utils.dummy_ca(cacert)
+            certutils.dummy_ca(cacert)
     if getattr(options, "cache", None) is not None:
         options.cache = os.path.expanduser(options.cache)
     body_size_limit = utils.parse_size(options.body_size_limit)
@@ -538,5 +568,6 @@ def process_proxy_options(parser, options):
         ciphers = options.ciphers,
         cert_wait_time = options.cert_wait_time,
         body_size_limit = body_size_limit,
+        upstream_cert = options.upstream_cert,
         reverse_proxy = rp
     )

@@ -1,33 +0,0 @@
-[ req ]
-prompt = no
-distinguished_name = req_distinguished_name
-x509_extensions = v3_ca
-req_extensions = v3_ca_req
-
-[ req_distinguished_name ]
-organizationName = mitmproxy
-commonName = mitmproxy
-
-[ v3_ca ]
-basicConstraints = critical,CA:true
-keyUsage = cRLSign, keyCertSign
-extendedKeyUsage=serverAuth,clientAuth,emailProtection,timeStamping,msCodeInd,msCodeCom,msCTLSign,msSGC,msEFS,nsSGC
-nsCertType = sslCA
-
-[ v3_ca_req ]
-basicConstraints = critical,CA:true
-keyUsage = cRLSign, keyCertSign
-extendedKeyUsage=serverAuth,clientAuth,emailProtection,timeStamping,msCodeInd,msCodeCom,msCTLSign,msSGC,msEFS,nsSGC
-nsCertType = sslCA
-
-[ v3_cert ]
-basicConstraints = CA:false
-keyUsage = nonRepudiation, digitalSignature, keyEncipherment
-extendedKeyUsage=serverAuth,clientAuth,emailProtection,timeStamping,msCodeInd,msCodeCom,msCTLSign,msSGC,msEFS,nsSGC
-nsCertType = server
-
-[ v3_cert_req ]
-basicConstraints = CA:false
-keyUsage = nonRepudiation, digitalSignature, keyEncipherment
-extendedKeyUsage=serverAuth,clientAuth,emailProtection,timeStamping,msCodeInd,msCodeCom,msCTLSign,msSGC,msEFS,nsSGC
-nsCertType = server
@@ -1,30 +0,0 @@
-[ req ]
-prompt = no
-distinguished_name = req_distinguished_name
-x509_extensions = v3_cert
-req_extensions = v3_cert_req
-
-[ req_distinguished_name ]
-organizationName = mitmproxy
-commonName = %(commonname)s
-
-[ v3_ca ]
-basicConstraints = critical,CA:true
-keyUsage = cRLSign, keyCertSign
-nsCertType = sslCA
-
-[ v3_ca_req ]
-basicConstraints = critical,CA:true
-keyUsage = cRLSign, keyCertSign
-nsCertType = sslCA
-
-[ v3_cert ]
-basicConstraints = CA:false
-keyUsage = nonRepudiation, digitalSignature, keyEncipherment
-nsCertType = server
-
-[ v3_cert_req ]
-basicConstraints = CA:false
-keyUsage = nonRepudiation, digitalSignature, keyEncipherment
-nsCertType = server
@@ -57,12 +57,12 @@ functions. They're only here so you can use load() to read precisely one
 item from a file or socket without consuming any extra data.

 By default tnetstrings work only with byte strings, not unicode.  If you want
-unicode strings then pass an optional encoding to the various functions,
+unicode strings then pass an optional encoding to the various functions,
 like so::

     >>> print repr(tnetstring.loads("2:\\xce\\xb1,"))
     '\\xce\\xb1'
-    >>>
+    >>>
     >>> print repr(tnetstring.loads("2:\\xce\\xb1,","utf8"))
     u'\u03b1'

@@ -129,7 +129,7 @@ def _rdumpq(q,size,value,encoding=None):
         write("5:false!")
         return size + 8
     if isinstance(value,(int,long)):
-        data = str(value)
+        data = str(value)
         ldata = len(data)
         span = str(ldata)
         write("#")

@@ -142,7 +142,7 @@ def _rdumpq(q,size,value,encoding=None):
         #  It round-trips more accurately.
         #  Probably unnecessary in later python versions that
         #  use David Gay's ftoa routines.
-        data = repr(value)
+        data = repr(value)
         ldata = len(data)
         span = str(ldata)
         write("^")

@@ -207,13 +207,13 @@ def _gdumps(value,encoding):
     elif value is False:
         yield "5:false!"
     elif isinstance(value,(int,long)):
-        data = str(value)
+        data = str(value)
         yield str(len(data))
         yield ":"
         yield data
         yield "#"
     elif isinstance(value,(float,)):
-        data = repr(value)
+        data = repr(value)
         yield str(len(data))
         yield ":"
         yield data

@@ -334,7 +334,7 @@ def load(file,encoding=None):
             d[key] = val
         return d
     raise ValueError("unknown type tag")
-
+


 def pop(string,encoding=None):

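The hunks above are whitespace cleanups, but the docstring they touch describes the tnetstring framing itself. A minimal encoder for just ints and strings shows the `length:payload` plus type-tag layout the docstring refers to (a sketch, not the library's full dumps()):

```python
def tns_dumps(value):
    # A tnetstring is "length:payload" followed by a one-character type
    # tag: '#' for integers, ',' for plain strings. This sketch handles
    # only those two types.
    if isinstance(value, bool):
        raise TypeError("sketch handles only ints and strings")
    if isinstance(value, int):
        data, tag = str(value), "#"
    elif isinstance(value, str):
        data, tag = value, ","
    else:
        raise TypeError("sketch handles only ints and strings")
    return "%d:%s%s" % (len(data), data, tag)

tns_dumps(42)       # → '2:42#'
tns_dumps("hello")  # → '5:hello,'
```

The leading byte count is what lets load() read exactly one item from a socket without consuming any extra data, as the docstring notes.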
@@ -12,13 +12,10 @@
 #
 # You should have received a copy of the GNU General Public License
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
-import re, os, subprocess, datetime, urlparse, string, urllib
-import time, functools, cgi, textwrap, hashlib
+import os, datetime, urlparse, string, urllib
+import time, functools, cgi
 import json

-CERT_SLEEP_TIME = 1
-CERT_EXPIRY = str(365 * 3)
-
 def timestamp():
     """
         Returns a serializable UTC timestamp.
@@ -57,63 +54,24 @@ def isXML(s):
     return False


-def cleanBin(s):
+def cleanBin(s, fixspacing=False):
+    """
+        Cleans binary data to make it safe to display. If fixspacing is True,
+        tabs, newlines and so forth will be maintained, if not, they will be
+        replaced with a placeholder.
+    """
     parts = []
     for i in s:
         o = ord(i)
-        if o > 31 and o < 127:
+        if (o > 31 and o < 127):
             parts.append(i)
+        elif i in "\n\r\t" and not fixspacing:
+            parts.append(i)
         else:
-            if i not in "\n\r\t":
-                parts.append(".")
+            parts.append(".")
     return "".join(parts)


-TAG = r"""
-    <\s*
-    (?!\s*[!"])
-    (?P<close>\s*\/)?
-    (?P<name>\w+)
-    (
-        [^'"\t >]+ |
-        "[^\"]*"['\"]* |
-        '[^']*'['\"]* |
-        \s+
-    )*
-    (?P<selfcont>\s*\/\s*)?
-    \s*>
-"""
-UNI = set(["br", "hr", "img", "input", "area", "link"])
-INDENT = " "*4
-def pretty_xmlish(s):
-    """
-        A robust pretty-printer for XML-ish data.
-        Returns a list of lines.
-    """
-    s = cleanBin(s)
-    data, offset, indent, prev = [], 0, 0, None
-    for i in re.finditer(TAG, s, re.VERBOSE|re.MULTILINE):
-        start, end = i.span()
-        name = i.group("name")
-        if start > offset:
-            txt = []
-            for x in textwrap.dedent(s[offset:start]).split("\n"):
-                if x.strip():
-                    txt.append(indent*INDENT + x)
-            data.extend(txt)
-        if i.group("close") and not (name in UNI and name==prev):
-            indent = max(indent - 1, 0)
-        data.append(indent*INDENT + i.group().strip())
-        offset = end
-        if not any([i.group("close"), i.group("selfcont"), name in UNI]):
-            indent += 1
-        prev = name
-    trail = s[offset:]
-    if trail.strip():
-        data.append(s[offset:])
-    return data
-
-
 def pretty_json(s):
     try:
         p = json.loads(s)
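The revised cleanBin() is small enough to reproduce runnably. With fixspacing set, even tabs and newlines become placeholders, which keeps each hexdump row exactly one display line high:

```python
def cleanBin(s, fixspacing=False):
    # Printable ASCII passes through; \n, \r and \t survive unless
    # fixspacing is set; every other byte becomes a "." placeholder.
    parts = []
    for i in s:
        o = ord(i)
        if o > 31 and o < 127:
            parts.append(i)
        elif i in "\n\r\t" and not fixspacing:
            parts.append(i)
        else:
            parts.append(".")
    return "".join(parts)

cleanBin("a\tb\x02")                   # → 'a\tb.'
cleanBin("a\tb\x02", fixspacing=True)  # → 'a.b.'
```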
@@ -133,6 +91,7 @@ def urlencode(s):
     """
         Takes a list of (key, value) tuples and returns a urlencoded string.
     """
+    s = [tuple(i) for i in s]
     return urllib.urlencode(s, False)

@@ -150,7 +109,7 @@ def hexdump(s):
             x += " "
             x += " ".join(" " for i in range(16 - len(part)))
         parts.append(
-            (o, x, cleanBin(part))
+            (o, x, cleanBin(part, True))
         )
     return parts

@@ -197,166 +156,6 @@ class Data:
 pkg_data = Data(__name__)


-def dummy_ca(path):
-    """
-        Creates a dummy CA, and writes it to path.
-
-        This function also creates the necessary directories if they don't exist.
-
-        Returns True if operation succeeded, False if not.
-    """
-    dirname = os.path.dirname(path)
-    if not os.path.exists(dirname):
-        os.makedirs(dirname)
-
-    if path.endswith(".pem"):
-        basename, _ = os.path.splitext(path)
-    else:
-        basename = path
-
-    cmd = [
-        "openssl",
-        "req",
-        "-new",
-        "-x509",
-        "-config", pkg_data.path("resources/ca.cnf"),
-        "-nodes",
-        "-days", CERT_EXPIRY,
-        "-out", path,
-        "-newkey", "rsa:1024",
-        "-keyout", path,
-    ]
-    ret = subprocess.call(
-        cmd,
-        stderr=subprocess.PIPE,
-        stdout=subprocess.PIPE,
-        stdin=subprocess.PIPE
-    )
-    # begin nocover
-    if ret:
-        return False
-    # end nocover
-
-    cmd = [
-        "openssl",
-        "pkcs12",
-        "-export",
-        "-password", "pass:",
-        "-nokeys",
-        "-in", path,
-        "-out", os.path.join(dirname, basename + "-cert.p12")
-    ]
-    ret = subprocess.call(
-        cmd,
-        stderr=subprocess.PIPE,
-        stdout=subprocess.PIPE,
-        stdin=subprocess.PIPE
-    )
-    # begin nocover
-    if ret:
-        return False
-    # end nocover
-    cmd = [
-        "openssl",
-        "x509",
-        "-in", path,
-        "-out", os.path.join(dirname, basename + "-cert.pem")
-    ]
-    ret = subprocess.call(
-        cmd,
-        stderr=subprocess.PIPE,
-        stdout=subprocess.PIPE,
-        stdin=subprocess.PIPE
-    )
-    # begin nocover
-    if ret:
-        return False
-    # end nocover
-
-    return True
-
-
-def dummy_cert(certdir, ca, commonname):
-    """
-        certdir: Certificate directory.
-        ca: Path to the certificate authority file, or None.
-        commonname: Common name for the generated certificate.
-
-        Returns cert path if operation succeeded, None if not.
-    """
-    namehash = hashlib.sha256(commonname).hexdigest()
-    certpath = os.path.join(certdir, namehash + ".pem")
-    if os.path.exists(certpath):
-        return certpath
-
-    confpath = os.path.join(certdir, namehash + ".cnf")
-    reqpath = os.path.join(certdir, namehash + ".req")
-
-    template = open(pkg_data.path("resources/cert.cnf")).read()
-    f = open(confpath, "w")
-    f.write(template%(dict(commonname=commonname)))
-    f.close()
-
-    if ca:
-        # Create a dummy signed certificate. Uses same key as the signing CA
-        cmd = [
-            "openssl",
-            "req",
-            "-new",
-            "-config", confpath,
-            "-out", reqpath,
-            "-key", ca,
-        ]
-        ret = subprocess.call(
-            cmd,
-            stderr=subprocess.PIPE,
-            stdout=subprocess.PIPE,
-            stdin=subprocess.PIPE
-        )
-        if ret: return None
-        cmd = [
-            "openssl",
-            "x509",
-            "-req",
-            "-in", reqpath,
-            "-days", CERT_EXPIRY,
-            "-out", certpath,
-            "-CA", ca,
-            "-CAcreateserial",
-            "-extfile", confpath,
-            "-extensions", "v3_cert",
-        ]
-        ret = subprocess.call(
-            cmd,
-            stderr=subprocess.PIPE,
-            stdout=subprocess.PIPE,
-            stdin=subprocess.PIPE
-        )
-        if ret: return None
-    else:
-        # Create a new selfsigned certificate + key
-        cmd = [
-            "openssl",
-            "req",
-            "-new",
-            "-x509",
-            "-config", confpath,
-            "-nodes",
-            "-days", CERT_EXPIRY,
-            "-out", certpath,
-            "-newkey", "rsa:1024",
-            "-keyout", certpath,
-        ]
-        ret = subprocess.call(
-            cmd,
-            stderr=subprocess.PIPE,
-            stdout=subprocess.PIPE,
-            stdin=subprocess.PIPE
-        )
-        if ret: return None
-    return certpath
-
-
 class LRUCache:
     """
         A decorator that implements a self-expiring LRU cache for class
@@ -427,6 +226,33 @@ def parse_proxy_spec(url):
     return p[:3]


+def parse_content_type(c):
+    """
+        A simple parser for content-type values. Returns a (type, subtype,
+        parameters) tuple, where type and subtype are strings, and parameters
+        is a dict. If the string could not be parsed, return None.
+
+        E.g. the following string:
+
+            text/html; charset=UTF-8
+
+        Returns:
+
+            ("text", "html", {"charset": "UTF-8"})
+    """
+    parts = c.split(";", 1)
+    ts = parts[0].split("/", 1)
+    if len(ts) != 2:
+        return None
+    d = {}
+    if len(parts) == 2:
+        for i in parts[1].split(";"):
+            clause = i.split("=", 1)
+            if len(clause) == 2:
+                d[clause[0].strip()] = clause[1].strip()
+    return ts[0].lower(), ts[1].lower(), d
+
+
 def hostport(scheme, host, port):
     """
         Returns the host component, with a port specifcation if needed.
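The new parse_content_type() helper is fully self-contained, so it can be reproduced and checked against the example in its own docstring:

```python
def parse_content_type(c):
    # Same logic as the helper added above: split off the parameter tail,
    # then the type/subtype pair; parameter clauses become a dict.
    parts = c.split(";", 1)
    ts = parts[0].split("/", 1)
    if len(ts) != 2:
        return None
    d = {}
    if len(parts) == 2:
        for i in parts[1].split(";"):
            clause = i.split("=", 1)
            if len(clause) == 2:
                d[clause[0].strip()] = clause[1].strip()
    return ts[0].lower(), ts[1].lower(), d

parse_content_type("text/html; charset=UTF-8")
# → ('text', 'html', {'charset': 'UTF-8'})
parse_content_type("gibberish")  # → None
```

Note that type and subtype are lower-cased but parameter values are preserved as sent, which is why the charset stays `UTF-8`.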
@@ -1,2 +1,2 @@
-IVERSION = (0, 7)
+IVERSION = (0, 8, 1)
 VERSION = ".".join(str(i) for i in IVERSION)

mitmdump (7 changed lines)
@@ -40,7 +40,7 @@ if __name__ == '__main__':

     proxyconfig = proxy.process_proxy_options(parser, options)
     if options.no_server:
-        server = None
+        server = proxy.DummyServer(proxyconfig)
     else:
         try:
             server = proxy.ProxyServer(proxyconfig, options.port, options.addr)

@@ -49,7 +49,10 @@ if __name__ == '__main__':
             sys.exit(1)


-    dumpopts = dump.Options(**cmdline.get_common_options(options))
+    try:
+        dumpopts = dump.Options(**cmdline.get_common_options(options))
+    except cmdline.OptionException, v:
+        parser.error(v.message)
     dumpopts.keepserving = options.keepserving

     if args:

@@ -46,7 +46,7 @@ if __name__ == '__main__':
     config = proxy.process_proxy_options(parser, options)

     if options.no_server:
-        server = None
+        server = proxy.DummyServer(config)
     else:
         try:
             server = proxy.ProxyServer(config, options.port, options.addr)

@@ -54,7 +54,10 @@ if __name__ == '__main__':
         print >> sys.stderr, "mitmproxy:", v.args[0]
         sys.exit(1)

-    opts = console.Options(**cmdline.get_common_options(options))
+    try:
+        opts = console.Options(**cmdline.get_common_options(options))
+    except cmdline.OptionException, v:
+        parser.error(v.message)
     opts.intercept = options.intercept
     opts.debug = options.debug
     m = console.ConsoleMaster(server, opts)

setup.py (2 changed lines)
@@ -92,5 +92,5 @@ setup(
         "Topic :: Internet :: Proxy Servers",
         "Topic :: Software Development :: Testing"
     ],
-    install_requires=['urwid'],
+    install_requires=['urwid==1.0.1', 'pyasn1', 'pyopenssl>=0.12', "PIL", "lxml"],
 )

BIN  test/data/dercert (new file)
BIN  test/data/image-err1.jpg (new file, 81 KiB)
BIN  test/data/image.gif (new file, 2.3 KiB)
BIN  test/data/image.ico (new file, 11 KiB)
BIN  test/data/image.jpg (new file, 1.5 KiB)
BIN  test/data/image.png (new file, 9.1 KiB)

test/data/replace (new file, 1 line)
@@ -0,0 +1 @@
+replacecontents
test/data/text_cert (new file, 145 lines)
@@ -0,0 +1,145 @@
+-----BEGIN CERTIFICATE-----
+MIIadTCCGd6gAwIBAgIGR09PUAFtMA0GCSqGSIb3DQEBBQUAMEYxCzAJBgNVBAYT
+AlVTMRMwEQYDVQQKEwpHb29nbGUgSW5jMSIwIAYDVQQDExlHb29nbGUgSW50ZXJu
+ZXQgQXV0aG9yaXR5MB4XDTEyMDExNzEyNTUwNFoXDTEzMDExNzEyNTUwNFowTDEL
+MAkGA1UEBhMCVVMxEzARBgNVBAgTCkNhbGlmb3JuaWExEzARBgNVBAoTCkdvb2ds
+ZSBJbmMxEzARBgNVBAMTCmdvb2dsZS5jb20wgZ8wDQYJKoZIhvcNAQEBBQADgY0A
+MIGJAoGBALofcxR2fud5cyFIeld9pj2vGB5GH0y9tmAYa5t33xbJguKKX/el3tXA
+KMNiT1SZzu8ELJ1Ey0GcBAgHA9jVPQd0LGdbEtNIxjblAsWAD/FZlSt8X87h7C5w
+2JSefOani0qgQqU6sTdsaCUGZ+Eu7D0lBfT5/Vnl2vV+zI3YmDlpAgMBAAGjghhm
+MIIYYjAdBgNVHQ4EFgQUL3+JeC/oL9jZhTp3F550LautzV8wHwYDVR0jBBgwFoAU
+v8Aw6/VDET5nup6R+/xq2uNrEiQwWwYDVR0fBFQwUjBQoE6gTIZKaHR0cDovL3d3
+dy5nc3RhdGljLmNvbS9Hb29nbGVJbnRlcm5ldEF1dGhvcml0eS9Hb29nbGVJbnRl
+cm5ldEF1dGhvcml0eS5jcmwwZgYIKwYBBQUHAQEEWjBYMFYGCCsGAQUFBzAChkpo
+dHRwOi8vd3d3LmdzdGF0aWMuY29tL0dvb2dsZUludGVybmV0QXV0aG9yaXR5L0dv
+b2dsZUludGVybmV0QXV0aG9yaXR5LmNydDCCF1kGA1UdEQSCF1AwghdMggpnb29n
+bGUuY29tggwqLmdvb2dsZS5jb22CCyouZ29vZ2xlLmFjggsqLmdvb2dsZS5hZIIL
+Ki5nb29nbGUuYWWCCyouZ29vZ2xlLmFmggsqLmdvb2dsZS5hZ4ILKi5nb29nbGUu
+YW2CCyouZ29vZ2xlLmFzggsqLmdvb2dsZS5hdIILKi5nb29nbGUuYXqCCyouZ29v
+Z2xlLmJhggsqLmdvb2dsZS5iZYILKi5nb29nbGUuYmaCCyouZ29vZ2xlLmJnggsq
+Lmdvb2dsZS5iaYILKi5nb29nbGUuYmqCCyouZ29vZ2xlLmJzggsqLmdvb2dsZS5i
+eYILKi5nb29nbGUuY2GCDCouZ29vZ2xlLmNhdIILKi5nb29nbGUuY2OCCyouZ29v
+Z2xlLmNkggsqLmdvb2dsZS5jZoILKi5nb29nbGUuY2eCCyouZ29vZ2xlLmNoggsq
+Lmdvb2dsZS5jaYILKi5nb29nbGUuY2yCCyouZ29vZ2xlLmNtggsqLmdvb2dsZS5j
+boIOKi5nb29nbGUuY28uYW+CDiouZ29vZ2xlLmNvLmJ3gg4qLmdvb2dsZS5jby5j
+a4IOKi5nb29nbGUuY28uY3KCDiouZ29vZ2xlLmNvLmh1gg4qLmdvb2dsZS5jby5p
+ZIIOKi5nb29nbGUuY28uaWyCDiouZ29vZ2xlLmNvLmltgg4qLmdvb2dsZS5jby5p
+boIOKi5nb29nbGUuY28uamWCDiouZ29vZ2xlLmNvLmpwgg4qLmdvb2dsZS5jby5r
+ZYIOKi5nb29nbGUuY28ua3KCDiouZ29vZ2xlLmNvLmxzgg4qLmdvb2dsZS5jby5t
+YYIOKi5nb29nbGUuY28ubXqCDiouZ29vZ2xlLmNvLm56gg4qLmdvb2dsZS5jby50
+aIIOKi5nb29nbGUuY28udHqCDiouZ29vZ2xlLmNvLnVngg4qLmdvb2dsZS5jby51
+a4IOKi5nb29nbGUuY28udXqCDiouZ29vZ2xlLmNvLnZlgg4qLmdvb2dsZS5jby52
+aYIOKi5nb29nbGUuY28uemGCDiouZ29vZ2xlLmNvLnptgg4qLmdvb2dsZS5jby56
+d4IPKi5nb29nbGUuY29tLmFmgg8qLmdvb2dsZS5jb20uYWeCDyouZ29vZ2xlLmNv
+bS5haYIPKi5nb29nbGUuY29tLmFygg8qLmdvb2dsZS5jb20uYXWCDyouZ29vZ2xl
+LmNvbS5iZIIPKi5nb29nbGUuY29tLmJogg8qLmdvb2dsZS5jb20uYm6CDyouZ29v
+Z2xlLmNvbS5ib4IPKi5nb29nbGUuY29tLmJygg8qLmdvb2dsZS5jb20uYnmCDyou
+Z29vZ2xlLmNvbS5ieoIPKi5nb29nbGUuY29tLmNugg8qLmdvb2dsZS5jb20uY2+C
+DyouZ29vZ2xlLmNvbS5jdYIPKi5nb29nbGUuY29tLmN5gg8qLmdvb2dsZS5jb20u
+ZG+CDyouZ29vZ2xlLmNvbS5lY4IPKi5nb29nbGUuY29tLmVngg8qLmdvb2dsZS5j
+b20uZXSCDyouZ29vZ2xlLmNvbS5maoIPKi5nb29nbGUuY29tLmdlgg8qLmdvb2ds
+ZS5jb20uZ2iCDyouZ29vZ2xlLmNvbS5naYIPKi5nb29nbGUuY29tLmdygg8qLmdv
+b2dsZS5jb20uZ3SCDyouZ29vZ2xlLmNvbS5oa4IPKi5nb29nbGUuY29tLmlxgg8q
+Lmdvb2dsZS5jb20uam2CDyouZ29vZ2xlLmNvbS5qb4IPKi5nb29nbGUuY29tLmto
+gg8qLmdvb2dsZS5jb20ua3eCDyouZ29vZ2xlLmNvbS5sYoIPKi5nb29nbGUuY29t
+Lmx5gg8qLmdvb2dsZS5jb20ubXSCDyouZ29vZ2xlLmNvbS5teIIPKi5nb29nbGUu
+Y29tLm15gg8qLmdvb2dsZS5jb20ubmGCDyouZ29vZ2xlLmNvbS5uZoIPKi5nb29n
+bGUuY29tLm5ngg8qLmdvb2dsZS5jb20ubmmCDyouZ29vZ2xlLmNvbS5ucIIPKi5n
+b29nbGUuY29tLm5ygg8qLmdvb2dsZS5jb20ub22CDyouZ29vZ2xlLmNvbS5wYYIP
+Ki5nb29nbGUuY29tLnBlgg8qLmdvb2dsZS5jb20ucGiCDyouZ29vZ2xlLmNvbS5w
+a4IPKi5nb29nbGUuY29tLnBsgg8qLmdvb2dsZS5jb20ucHKCDyouZ29vZ2xlLmNv
+bS5weYIPKi5nb29nbGUuY29tLnFhgg8qLmdvb2dsZS5jb20ucnWCDyouZ29vZ2xl
+LmNvbS5zYYIPKi5nb29nbGUuY29tLnNigg8qLmdvb2dsZS5jb20uc2eCDyouZ29v
+Z2xlLmNvbS5zbIIPKi5nb29nbGUuY29tLnN2gg8qLmdvb2dsZS5jb20udGqCDyou
+Z29vZ2xlLmNvbS50boIPKi5nb29nbGUuY29tLnRygg8qLmdvb2dsZS5jb20udHeC
+DyouZ29vZ2xlLmNvbS51YYIPKi5nb29nbGUuY29tLnV5gg8qLmdvb2dsZS5jb20u
+dmOCDyouZ29vZ2xlLmNvbS52ZYIPKi5nb29nbGUuY29tLnZuggsqLmdvb2dsZS5j
+doILKi5nb29nbGUuY3qCCyouZ29vZ2xlLmRlggsqLmdvb2dsZS5kaoILKi5nb29n
+bGUuZGuCCyouZ29vZ2xlLmRtggsqLmdvb2dsZS5keoILKi5nb29nbGUuZWWCCyou
+Z29vZ2xlLmVzggsqLmdvb2dsZS5maYILKi5nb29nbGUuZm2CCyouZ29vZ2xlLmZy
+ggsqLmdvb2dsZS5nYYILKi5nb29nbGUuZ2WCCyouZ29vZ2xlLmdnggsqLmdvb2ds
+ZS5nbIILKi5nb29nbGUuZ22CCyouZ29vZ2xlLmdwggsqLmdvb2dsZS5ncoILKi5n
+b29nbGUuZ3mCCyouZ29vZ2xlLmhrggsqLmdvb2dsZS5oboILKi5nb29nbGUuaHKC
+CyouZ29vZ2xlLmh0ggsqLmdvb2dsZS5odYILKi5nb29nbGUuaWWCCyouZ29vZ2xl
+Lmltgg0qLmdvb2dsZS5pbmZvggsqLmdvb2dsZS5pcYILKi5nb29nbGUuaXOCCyou
+Z29vZ2xlLml0gg4qLmdvb2dsZS5pdC5hb4ILKi5nb29nbGUuamWCCyouZ29vZ2xl
+Lmpvgg0qLmdvb2dsZS5qb2JzggsqLmdvb2dsZS5qcIILKi5nb29nbGUua2eCCyou
+Z29vZ2xlLmtpggsqLmdvb2dsZS5reoILKi5nb29nbGUubGGCCyouZ29vZ2xlLmxp
+ggsqLmdvb2dsZS5sa4ILKi5nb29nbGUubHSCCyouZ29vZ2xlLmx1ggsqLmdvb2ds
+ZS5sdoILKi5nb29nbGUubWSCCyouZ29vZ2xlLm1lggsqLmdvb2dsZS5tZ4ILKi5n
+b29nbGUubWuCCyouZ29vZ2xlLm1sggsqLmdvb2dsZS5tboILKi5nb29nbGUubXOC
+CyouZ29vZ2xlLm11ggsqLmdvb2dsZS5tdoILKi5nb29nbGUubXeCCyouZ29vZ2xl
+Lm5lgg4qLmdvb2dsZS5uZS5qcIIMKi5nb29nbGUubmV0ggsqLmdvb2dsZS5ubIIL
+Ki5nb29nbGUubm+CCyouZ29vZ2xlLm5yggsqLmdvb2dsZS5udYIPKi5nb29nbGUu
+b2ZmLmFpggsqLmdvb2dsZS5wa4ILKi5nb29nbGUucGyCCyouZ29vZ2xlLnBuggsq
+Lmdvb2dsZS5wc4ILKi5nb29nbGUucHSCCyouZ29vZ2xlLnJvggsqLmdvb2dsZS5y
+c4ILKi5nb29nbGUucnWCCyouZ29vZ2xlLnJ3ggsqLmdvb2dsZS5zY4ILKi5nb29n
+bGUuc2WCCyouZ29vZ2xlLnNoggsqLmdvb2dsZS5zaYILKi5nb29nbGUuc2uCCyou
+Z29vZ2xlLnNtggsqLmdvb2dsZS5zboILKi5nb29nbGUuc2+CCyouZ29vZ2xlLnN0
+ggsqLmdvb2dsZS50ZIILKi5nb29nbGUudGeCCyouZ29vZ2xlLnRrggsqLmdvb2ds
+ZS50bIILKi5nb29nbGUudG2CCyouZ29vZ2xlLnRuggsqLmdvb2dsZS50b4ILKi5n
+b29nbGUudHCCCyouZ29vZ2xlLnR0ggsqLmdvb2dsZS51c4ILKi5nb29nbGUudXqC
+CyouZ29vZ2xlLnZnggsqLmdvb2dsZS52dYILKi5nb29nbGUud3OCCWdvb2dsZS5h
+Y4IJZ29vZ2xlLmFkgglnb29nbGUuYWWCCWdvb2dsZS5hZoIJZ29vZ2xlLmFnggln
+b29nbGUuYW2CCWdvb2dsZS5hc4IJZ29vZ2xlLmF0gglnb29nbGUuYXqCCWdvb2ds
+ZS5iYYIJZ29vZ2xlLmJlgglnb29nbGUuYmaCCWdvb2dsZS5iZ4IJZ29vZ2xlLmJp
+gglnb29nbGUuYmqCCWdvb2dsZS5ic4IJZ29vZ2xlLmJ5gglnb29nbGUuY2GCCmdv
+b2dsZS5jYXSCCWdvb2dsZS5jY4IJZ29vZ2xlLmNkgglnb29nbGUuY2aCCWdvb2ds
+ZS5jZ4IJZ29vZ2xlLmNogglnb29nbGUuY2mCCWdvb2dsZS5jbIIJZ29vZ2xlLmNt
+gglnb29nbGUuY26CDGdvb2dsZS5jby5hb4IMZ29vZ2xlLmNvLmJ3ggxnb29nbGUu
+Y28uY2uCDGdvb2dsZS5jby5jcoIMZ29vZ2xlLmNvLmh1ggxnb29nbGUuY28uaWSC
+DGdvb2dsZS5jby5pbIIMZ29vZ2xlLmNvLmltggxnb29nbGUuY28uaW6CDGdvb2ds
+ZS5jby5qZYIMZ29vZ2xlLmNvLmpwggxnb29nbGUuY28ua2WCDGdvb2dsZS5jby5r
+coIMZ29vZ2xlLmNvLmxzggxnb29nbGUuY28ubWGCDGdvb2dsZS5jby5teoIMZ29v
+Z2xlLmNvLm56ggxnb29nbGUuY28udGiCDGdvb2dsZS5jby50eoIMZ29vZ2xlLmNv
+LnVnggxnb29nbGUuY28udWuCDGdvb2dsZS5jby51eoIMZ29vZ2xlLmNvLnZlggxn
+b29nbGUuY28udmmCDGdvb2dsZS5jby56YYIMZ29vZ2xlLmNvLnptggxnb29nbGUu
+Y28ueneCDWdvb2dsZS5jb20uYWaCDWdvb2dsZS5jb20uYWeCDWdvb2dsZS5jb20u
+YWmCDWdvb2dsZS5jb20uYXKCDWdvb2dsZS5jb20uYXWCDWdvb2dsZS5jb20uYmSC
+DWdvb2dsZS5jb20uYmiCDWdvb2dsZS5jb20uYm6CDWdvb2dsZS5jb20uYm+CDWdv
+b2dsZS5jb20uYnKCDWdvb2dsZS5jb20uYnmCDWdvb2dsZS5jb20uYnqCDWdvb2ds
+ZS5jb20uY26CDWdvb2dsZS5jb20uY2+CDWdvb2dsZS5jb20uY3WCDWdvb2dsZS5j
+b20uY3mCDWdvb2dsZS5jb20uZG+CDWdvb2dsZS5jb20uZWOCDWdvb2dsZS5jb20u
+ZWeCDWdvb2dsZS5jb20uZXSCDWdvb2dsZS5jb20uZmqCDWdvb2dsZS5jb20uZ2WC
+DWdvb2dsZS5jb20uZ2iCDWdvb2dsZS5jb20uZ2mCDWdvb2dsZS5jb20uZ3KCDWdv
+b2dsZS5jb20uZ3SCDWdvb2dsZS5jb20uaGuCDWdvb2dsZS5jb20uaXGCDWdvb2ds
+ZS5jb20uam2CDWdvb2dsZS5jb20uam+CDWdvb2dsZS5jb20ua2iCDWdvb2dsZS5j
+b20ua3eCDWdvb2dsZS5jb20ubGKCDWdvb2dsZS5jb20ubHmCDWdvb2dsZS5jb20u
+bXSCDWdvb2dsZS5jb20ubXiCDWdvb2dsZS5jb20ubXmCDWdvb2dsZS5jb20ubmGC
+DWdvb2dsZS5jb20ubmaCDWdvb2dsZS5jb20ubmeCDWdvb2dsZS5jb20ubmmCDWdv
+b2dsZS5jb20ubnCCDWdvb2dsZS5jb20ubnKCDWdvb2dsZS5jb20ub22CDWdvb2ds
+ZS5jb20ucGGCDWdvb2dsZS5jb20ucGWCDWdvb2dsZS5jb20ucGiCDWdvb2dsZS5j
+b20ucGuCDWdvb2dsZS5jb20ucGyCDWdvb2dsZS5jb20ucHKCDWdvb2dsZS5jb20u
+cHmCDWdvb2dsZS5jb20ucWGCDWdvb2dsZS5jb20ucnWCDWdvb2dsZS5jb20uc2GC
+DWdvb2dsZS5jb20uc2KCDWdvb2dsZS5jb20uc2eCDWdvb2dsZS5jb20uc2yCDWdv
+b2dsZS5jb20uc3aCDWdvb2dsZS5jb20udGqCDWdvb2dsZS5jb20udG6CDWdvb2ds
+ZS5jb20udHKCDWdvb2dsZS5jb20udHeCDWdvb2dsZS5jb20udWGCDWdvb2dsZS5j
+b20udXmCDWdvb2dsZS5jb20udmOCDWdvb2dsZS5jb20udmWCDWdvb2dsZS5jb20u
+dm6CCWdvb2dsZS5jdoIJZ29vZ2xlLmN6gglnb29nbGUuZGWCCWdvb2dsZS5kaoIJ
+Z29vZ2xlLmRrgglnb29nbGUuZG2CCWdvb2dsZS5keoIJZ29vZ2xlLmVlgglnb29n
+bGUuZXOCCWdvb2dsZS5maYIJZ29vZ2xlLmZtgglnb29nbGUuZnKCCWdvb2dsZS5n
+YYIJZ29vZ2xlLmdlgglnb29nbGUuZ2eCCWdvb2dsZS5nbIIJZ29vZ2xlLmdtggln
+b29nbGUuZ3CCCWdvb2dsZS5ncoIJZ29vZ2xlLmd5gglnb29nbGUuaGuCCWdvb2ds
+ZS5oboIJZ29vZ2xlLmhygglnb29nbGUuaHSCCWdvb2dsZS5odYIJZ29vZ2xlLmll
|
||||
gglnb29nbGUuaW2CC2dvb2dsZS5pbmZvgglnb29nbGUuaXGCCWdvb2dsZS5pc4IJ
|
||||
Z29vZ2xlLml0ggxnb29nbGUuaXQuYW+CCWdvb2dsZS5qZYIJZ29vZ2xlLmpvggtn
|
||||
b29nbGUuam9ic4IJZ29vZ2xlLmpwgglnb29nbGUua2eCCWdvb2dsZS5raYIJZ29v
|
||||
Z2xlLmt6gglnb29nbGUubGGCCWdvb2dsZS5saYIJZ29vZ2xlLmxrgglnb29nbGUu
|
||||
bHSCCWdvb2dsZS5sdYIJZ29vZ2xlLmx2gglnb29nbGUubWSCCWdvb2dsZS5tZYIJ
|
||||
Z29vZ2xlLm1ngglnb29nbGUubWuCCWdvb2dsZS5tbIIJZ29vZ2xlLm1ugglnb29n
|
||||
bGUubXOCCWdvb2dsZS5tdYIJZ29vZ2xlLm12gglnb29nbGUubXeCCWdvb2dsZS5u
|
||||
ZYIMZ29vZ2xlLm5lLmpwggpnb29nbGUubmV0gglnb29nbGUubmyCCWdvb2dsZS5u
|
||||
b4IJZ29vZ2xlLm5ygglnb29nbGUubnWCDWdvb2dsZS5vZmYuYWmCCWdvb2dsZS5w
|
||||
a4IJZ29vZ2xlLnBsgglnb29nbGUucG6CCWdvb2dsZS5wc4IJZ29vZ2xlLnB0ggln
|
||||
b29nbGUucm+CCWdvb2dsZS5yc4IJZ29vZ2xlLnJ1gglnb29nbGUucneCCWdvb2ds
|
||||
ZS5zY4IJZ29vZ2xlLnNlgglnb29nbGUuc2iCCWdvb2dsZS5zaYIJZ29vZ2xlLnNr
|
||||
gglnb29nbGUuc22CCWdvb2dsZS5zboIJZ29vZ2xlLnNvgglnb29nbGUuc3SCCWdv
|
||||
b2dsZS50ZIIJZ29vZ2xlLnRngglnb29nbGUudGuCCWdvb2dsZS50bIIJZ29vZ2xl
|
||||
LnRtgglnb29nbGUudG6CCWdvb2dsZS50b4IJZ29vZ2xlLnRwgglnb29nbGUudHSC
|
||||
CWdvb2dsZS51c4IJZ29vZ2xlLnV6gglnb29nbGUudmeCCWdvb2dsZS52dYIJZ29v
|
||||
Z2xlLndzMA0GCSqGSIb3DQEBBQUAA4GBAJmZ9RyqpUzrP0UcJnHXoLu/AjIEsIvZ
|
||||
Y9hq/9bLry8InfmvERYHr4hNetkOYlW0FeDZtCpWxdPUgJjmWgKAK6j0goOFavTV
|
||||
GptkL8gha4p1QUsdLkd36/cvBXeBYSle787veo46N1k4V6Uv2gaDVkre786CNsHv
|
||||
Q6MYZ5ClQ+kS
|
||||
-----END CERTIFICATE-----
|
||||
|
||||
39
test/data/text_cert_2
Normal file
@@ -0,0 +1,39 @@
-----BEGIN CERTIFICATE-----
MIIGujCCBaKgAwIBAgIDAQlEMA0GCSqGSIb3DQEBBQUAMIGMMQswCQYDVQQGEwJJ
TDEWMBQGA1UEChMNU3RhcnRDb20gTHRkLjErMCkGA1UECxMiU2VjdXJlIERpZ2l0
YWwgQ2VydGlmaWNhdGUgU2lnbmluZzE4MDYGA1UEAxMvU3RhcnRDb20gQ2xhc3Mg
MSBQcmltYXJ5IEludGVybWVkaWF0ZSBTZXJ2ZXIgQ0EwHhcNMTAwMTExMTkyNzM2
WhcNMTEwMTEyMDkxNDU1WjCBtDEgMB4GA1UEDRMXMTI2ODMyLU1DeExzWTZUbjFn
bTdvOTAxCzAJBgNVBAYTAk5aMR4wHAYDVQQKExVQZXJzb25hIE5vdCBWYWxpZGF0
ZWQxKTAnBgNVBAsTIFN0YXJ0Q29tIEZyZWUgQ2VydGlmaWNhdGUgTWVtYmVyMRgw
FgYDVQQDEw93d3cuaW5vZGUuY28ubnoxHjAcBgkqhkiG9w0BCQEWD2ppbUBpbm9k
ZS5jby5uejCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAL6ghWlGhqg+
V0P58R3SvLRiO9OrdekDxzmQbKwQcc05frnF5Z9vT6ga7YOuXVeXxhYCAo0nr6KI
+y/Lx+QHvP5W0nKbs+svzUQErq2ZZFwhh1e1LbVccrNwkHUzKOq0TTaVdU4k8kDQ
zzYF9tTZb+G5Hv1BJjpwYwe8P4cAiPJPrFFOKTySzHqiYsXlx+vR1l1e3zKavhd+
LVSoLWWXb13yKODq6vnuiHjUJXl8CfVlBhoGotXU4JR5cbuGoW/8+rkwEdX+YoCv
VCqgdx9IkRFB6uWfN6ocUiFvhA0eknO+ewuVfRLiIaSDB8pNyUWVqu4ngFWtWO1O
YZg0I/32BkcCAwEAAaOCAvkwggL1MAkGA1UdEwQCMAAwCwYDVR0PBAQDAgOoMBMG
A1UdJQQMMAoGCCsGAQUFBwMBMB0GA1UdDgQWBBQfaL2Rj6r8iRlBTgppgE7ZZ5WT
UzAfBgNVHSMEGDAWgBTrQjTQmLCrn/Qbawj3zGQu7w4sRTAnBgNVHREEIDAegg93
d3cuaW5vZGUuY28ubnqCC2lub2RlLmNvLm56MIIBQgYDVR0gBIIBOTCCATUwggEx
BgsrBgEEAYG1NwECATCCASAwLgYIKwYBBQUHAgEWImh0dHA6Ly93d3cuc3RhcnRz
c2wuY29tL3BvbGljeS5wZGYwNAYIKwYBBQUHAgEWKGh0dHA6Ly93d3cuc3RhcnRz
c2wuY29tL2ludGVybWVkaWF0ZS5wZGYwgbcGCCsGAQUFBwICMIGqMBQWDVN0YXJ0
Q29tIEx0ZC4wAwIBARqBkUxpbWl0ZWQgTGlhYmlsaXR5LCBzZWUgc2VjdGlvbiAq
TGVnYWwgTGltaXRhdGlvbnMqIG9mIHRoZSBTdGFydENvbSBDZXJ0aWZpY2F0aW9u
IEF1dGhvcml0eSBQb2xpY3kgYXZhaWxhYmxlIGF0IGh0dHA6Ly93d3cuc3RhcnRz
c2wuY29tL3BvbGljeS5wZGYwYQYDVR0fBFowWDAqoCigJoYkaHR0cDovL3d3dy5z
dGFydHNzbC5jb20vY3J0MS1jcmwuY3JsMCqgKKAmhiRodHRwOi8vY3JsLnN0YXJ0
c3NsLmNvbS9jcnQxLWNybC5jcmwwgY4GCCsGAQUFBwEBBIGBMH8wOQYIKwYBBQUH
MAGGLWh0dHA6Ly9vY3NwLnN0YXJ0c3NsLmNvbS9zdWIvY2xhc3MxL3NlcnZlci9j
YTBCBggrBgEFBQcwAoY2aHR0cDovL3d3dy5zdGFydHNzbC5jb20vY2VydHMvc3Vi
LmNsYXNzMS5zZXJ2ZXIuY2EuY3J0MCMGA1UdEgQcMBqGGGh0dHA6Ly93d3cuc3Rh
cnRzc2wuY29tLzANBgkqhkiG9w0BAQUFAAOCAQEAivWID0KT8q1EzWzy+BecsFry
hQhuLFfAsPkHqpNd9OfkRStGBuJlLX+9DQ9TzjqutdY2buNBuDn71buZK+Y5fmjr
28rAT6+WMd+KnCl5WLT5IOS6Z9s3cec5TFQbmOGlepSS9Q6Ts9KsXOHHQvDkQeDq
OV2UqdgXIAyFm5efSL9JXPXntRausNu2s8F2B2rRJe4jPfnUy2LvY8OW1YvjUA++
vpdWRdfUbJQp55mRfaYMPRnyUm30lAI27QaxgQPFOqDeZUm5llb5eFG/B3f87uhg
+Y1oEykbEvZrIFN4hithioQ0tb+57FKkkG2sW3uemNiQw2qrEo/GAMb1cI50Rg==
-----END CERTIFICATE-----
80
test/test_certutils.py
Normal file
@@ -0,0 +1,80 @@
import os
import libpry
from libmproxy import certutils


class udummy_ca(libpry.AutoTree):
    def test_all(self):
        d = self.tmpdir()
        path = os.path.join(d, "foo/cert.cnf")
        assert certutils.dummy_ca(path)
        assert os.path.exists(path)

        path = os.path.join(d, "foo/cert2.pem")
        assert certutils.dummy_ca(path)
        assert os.path.exists(path)
        assert os.path.exists(os.path.join(d, "foo/cert2-cert.pem"))
        assert os.path.exists(os.path.join(d, "foo/cert2-cert.p12"))


class udummy_cert(libpry.AutoTree):
    def test_with_ca(self):
        d = self.tmpdir()
        cacert = os.path.join(d, "foo/cert.cnf")
        assert certutils.dummy_ca(cacert)
        p = certutils.dummy_cert(
            os.path.join(d, "foo"),
            cacert,
            "foo.com",
            ["one.com", "two.com", "*.three.com"]
        )
        assert os.path.exists(p)

        # Short-circuit
        assert certutils.dummy_cert(
            os.path.join(d, "foo"),
            cacert,
            "foo.com",
            []
        )

    def test_no_ca(self):
        d = self.tmpdir()
        p = certutils.dummy_cert(
            d,
            None,
            "foo.com",
            []
        )
        assert os.path.exists(p)


class uSSLCert(libpry.AutoTree):
    def test_simple(self):
        c = certutils.SSLCert(file("data/text_cert", "r").read())
        assert c.cn == "google.com"
        assert len(c.altnames) == 436

        c = certutils.SSLCert(file("data/text_cert_2", "r").read())
        assert c.cn == "www.inode.co.nz"
        assert len(c.altnames) == 2
        assert c.digest("sha1")
        assert c.notbefore
        assert c.notafter
        assert c.subject
        assert c.keyinfo == ("RSA", 2048)
        assert c.serial
        assert c.issuer
        c.has_expired

    def test_der(self):
        d = file("data/dercert").read()
        s = certutils.SSLCert.from_der(d)
        assert s.cn


tests = [
    udummy_ca(),
    udummy_cert(),
    uSSLCert(),
]
@@ -4,6 +4,37 @@ from libmproxy import cmdline


class uAll(libpry.AutoTree):
    def test_parse_replace_hook(self):
        x = cmdline.parse_replace_hook("/foo/bar/voing")
        assert x == ("foo", "bar", "voing")

        x = cmdline.parse_replace_hook("/foo/bar/vo/ing/")
        assert x == ("foo", "bar", "vo/ing/")

        x = cmdline.parse_replace_hook("/bar/voing")
        assert x == (".*", "bar", "voing")

        libpry.raises(
            cmdline.ParseReplaceException,
            cmdline.parse_replace_hook,
            "/foo"
        )
        libpry.raises(
            "replacement regex",
            cmdline.parse_replace_hook,
            "patt/[/rep"
        )
        libpry.raises(
            "filter pattern",
            cmdline.parse_replace_hook,
            "/~/foo/rep"
        )
        libpry.raises(
            "empty replacement regex",
            cmdline.parse_replace_hook,
            "//"
        )

    def test_common(self):
        parser = optparse.OptionParser()
        cmdline.common_options(parser)
@@ -25,6 +56,36 @@ class uAll(libpry.AutoTree):
        assert v["stickycookie"] == "foo"
        assert v["stickyauth"] == "foo"

        opts.replace = ["/foo/bar/voing"]
        v = cmdline.get_common_options(opts)
        assert v["replacements"] == [("foo", "bar", "voing")]

        opts.replace = ["//"]
        libpry.raises(
            "empty replacement regex",
            cmdline.get_common_options,
            opts
        )

        opts.replace = []
        opts.replace_file = [("/foo/bar/nonexistent")]
        libpry.raises(
            "could not read replace file",
            cmdline.get_common_options,
            opts
        )

        opts.replace_file = [("/~/bar/nonexistent")]
        libpry.raises(
            "filter pattern",
            cmdline.get_common_options,
            opts
        )

        opts.replace_file = [("/foo/bar/./data/replace")]
        v = cmdline.get_common_options(opts)["replacements"]
        assert len(v) == 1
        assert v[0][2].strip() == "replacecontents"
@@ -4,7 +4,7 @@ import tutils
import libpry


class uState(libpry.AutoTree):
class uConsoleState(libpry.AutoTree):
    def test_flow(self):
        """
            normal flow:
@@ -76,6 +76,18 @@ class uState(libpry.AutoTree):
        assert len(c.view) == 3
        assert c.focus == 0

    def test_settings(self):
        c = console.ConsoleState()
        f = self._add_request(c)
        c.add_flow_setting(f, "foo", "bar")
        assert c.get_flow_setting(f, "foo") == "bar"
        assert c.get_flow_setting(f, "oink") == None
        assert c.get_flow_setting(f, "oink", "foo") == "foo"
        assert len(c.flowsettings) == 1
        c.delete_flow(f)
        del f
        assert len(c.flowsettings) == 0


class uformat_keyvals(libpry.AutoTree):
    def test_simple(self):
@@ -101,7 +113,8 @@ class uPathCompleter(libpry.AutoTree):
        c.reset()
        assert c.complete("./completion/aaa") == "./completion/aaa"
        assert c.complete("./completion/aaa") == "./completion/aaa"

        c.reset()
        assert c.complete("./completion") == "./completion/aaa"

    def test_completion(self):
        c = console._PathCompleter(True)
@@ -139,7 +152,7 @@ class uOptions(libpry.AutoTree):

tests = [
    uformat_keyvals(),
    uState(),
    uConsoleState(),
    uPathCompleter(),
    uOptions()
]
184
test/test_console_contentview.py
Normal file
@@ -0,0 +1,184 @@
import sys
import libpry
import libmproxy.console.contentview as cv
from libmproxy import utils, flow, encoding

class uContentView(libpry.AutoTree):
    def test_trailer(self):
        txt = []
        cv.trailer(5, txt, 1000)
        assert not txt
        cv.trailer(cv.VIEW_CUTOFF + 10, txt, cv.VIEW_CUTOFF)
        assert txt

    def test_get_view_func(self):
        f = cv.get_view_func(
            cv.VIEW_HEX,
            flow.ODictCaseless(),
            "foo"
        )
        assert f is cv.view_hex

        f = cv.get_view_func(
            cv.VIEW_AUTO,
            flow.ODictCaseless(),
            "foo"
        )
        assert f is cv.view_raw

        f = cv.get_view_func(
            cv.VIEW_AUTO,
            flow.ODictCaseless(
                [["content-type", "text/html"]],
            ),
            "foo"
        )
        assert f is cv.view_html

        f = cv.get_view_func(
            cv.VIEW_AUTO,
            flow.ODictCaseless(
                [["content-type", "text/flibble"]],
            ),
            "foo"
        )
        assert f is cv.view_raw

        f = cv.get_view_func(
            cv.VIEW_AUTO,
            flow.ODictCaseless(
                [["content-type", "text/flibble"]],
            ),
            "<xml></xml>"
        )
        assert f is cv.view_xml

    def test_view_urlencoded(self):
        d = utils.urlencode([("one", "two"), ("three", "four")])
        assert cv.view_urlencoded([], d, 100)
        assert not cv.view_urlencoded([], "foo", 100)

    def test_view_html(self):
        s = "<html><br><br></br><p>one</p></html>"
        assert cv.view_html([], s, 1000)

        s = "gobbledygook"
        assert not cv.view_html([], s, 1000)

    def test_view_json(self):
        cv.VIEW_CUTOFF = 100
        assert cv.view_json([], "{}", 1000)
        assert not cv.view_json([], "{", 1000)
        assert cv.view_json([], "[" + ",".join(["0"]*cv.VIEW_CUTOFF) + "]", 1000)
        assert cv.view_json([], "[1, 2, 3, 4, 5]", 5)

    def test_view_xml(self):
        assert cv.view_xml([], "<foo></foo>", 1000)
        assert not cv.view_xml([], "<foo>", 1000)
        s = """<?xml version="1.0" encoding="UTF-8"?>
            <?xml-stylesheet title="XSL_formatting"?>
            <rss
                xmlns:media="http://search.yahoo.com/mrss/"
                xmlns:atom="http://www.w3.org/2005/Atom"
                version="2.0">
            </rss>
        """
        assert cv.view_xml([], s, 1000)

    def test_view_raw(self):
        assert cv.view_raw([], "foo", 1000)

    def test_view_javascript(self):
        assert cv.view_javascript([], "[1, 2, 3]", 100)
        assert cv.view_javascript([], "[1, 2, 3", 100)
        assert cv.view_javascript([], "function(a){[1, 2, 3]}", 100)

    def test_view_hex(self):
        assert cv.view_hex([], "foo", 1000)

    def test_view_image(self):
        assert cv.view_image([], file("data/image.png").read(), sys.maxint)
        assert cv.view_image([], file("data/image.gif").read(), sys.maxint)
        assert cv.view_image([], file("data/image-err1.jpg").read(), sys.maxint)
        assert cv.view_image([], file("data/image.ico").read(), sys.maxint)
        assert not cv.view_image([], "flibble", sys.maxint)

    def test_view_multipart(self):
        v = """
--AaB03x
Content-Disposition: form-data; name="submit-name"

Larry
--AaB03x
        """.strip()
        h = flow.ODictCaseless(
            [("Content-Type", "multipart/form-data; boundary=AaB03x")]
        )
        assert cv.view_multipart(h, v, 1000)

        h = flow.ODictCaseless()
        assert not cv.view_multipart(h, v, 1000)

        h = flow.ODictCaseless(
            [("Content-Type", "multipart/form-data")]
        )
        assert not cv.view_multipart(h, v, 1000)

        h = flow.ODictCaseless(
            [("Content-Type", "unparseable")]
        )
        assert not cv.view_multipart(h, v, 1000)

    def test_get_content_view(self):
        r = cv.get_content_view(
            cv.VIEW_RAW,
            [["content-type", "application/json"]],
            "[1, 2, 3]",
            1000
        )
        assert "Raw" in r[0]

        r = cv.get_content_view(
            cv.VIEW_AUTO,
            [["content-type", "application/json"]],
            "[1, 2, 3]",
            1000
        )
        assert r[0] == "JSON"

        r = cv.get_content_view(
            cv.VIEW_AUTO,
            [["content-type", "application/json"]],
            "[1, 2",
            1000
        )
        assert "Raw" in r[0]

        r = cv.get_content_view(
            cv.VIEW_AUTO,
            [
                ["content-type", "application/json"],
                ["content-encoding", "gzip"]
            ],
            encoding.encode('gzip', "[1, 2, 3]"),
            1000
        )
        assert "decoded gzip" in r[0]
        assert "JSON" in r[0]

        r = cv.get_content_view(
            cv.VIEW_XML,
            [
                ["content-type", "application/json"],
                ["content-encoding", "gzip"]
            ],
            encoding.encode('gzip', "[1, 2, 3]"),
            1000
        )
        assert "decoded gzip" in r[0]
        assert "Raw" in r[0]


tests = [
    uContentView()
]
@@ -27,8 +27,9 @@ class uDumpMaster(libpry.AutoTree):
        resp.content = content
        m.handle_clientconnect(cc)
        m.handle_request(req)
        m.handle_response(resp)
        f = m.handle_response(resp)
        m.handle_clientdisconnect(flow.ClientDisconnect(cc))
        return f

    def _dummy_cycle(self, n, filt, content, **options):
        cs = StringIO()
@@ -93,6 +94,12 @@ class uDumpMaster(libpry.AutoTree):
    def test_filter(self):
        assert not "GET" in self._dummy_cycle(1, "~u foo", "", verbosity=1)

    def test_replacements(self):
        o = dump.Options(replacements=[(".*", "content", "foo")])
        m = dump.DumpMaster(None, o, None)
        f = self._cycle(m, "content")
        assert f.request.content == "foo"

    def test_basic(self):
        for i in (1, 2, 3):
            assert "GET" in self._dummy_cycle(1, "~s", "", verbosity=i, eventlog=True)
@@ -98,7 +98,8 @@ class uMatching(libpry.AutoTree):
            200,
            "message",
            headers,
            "content_response"
            "content_response",
            None
        )
        return f
@@ -87,7 +87,7 @@ class uClientPlaybackState(libpry.AutoTree):

class uServerPlaybackState(libpry.AutoTree):
    def test_hash(self):
        s = flow.ServerPlaybackState(None, [], False)
        s = flow.ServerPlaybackState(None, [], False, False)
        r = tutils.tflow()
        r2 = tutils.tflow()

@@ -99,7 +99,7 @@ class uServerPlaybackState(libpry.AutoTree):
        assert s._hash(r) != s._hash(r2)

    def test_headers(self):
        s = flow.ServerPlaybackState(["foo"], [], False)
        s = flow.ServerPlaybackState(["foo"], [], False, False)
        r = tutils.tflow_full()
        r.request.headers["foo"] = ["bar"]
        r2 = tutils.tflow_full()
@@ -120,7 +120,7 @@ class uServerPlaybackState(libpry.AutoTree):
        r2 = tutils.tflow_full()
        r2.request.headers["key"] = ["two"]

        s = flow.ServerPlaybackState(None, [r, r2], False)
        s = flow.ServerPlaybackState(None, [r, r2], False, False)
        assert s.count() == 2
        assert len(s.fmap.keys()) == 1
@@ -134,6 +134,18 @@ class uServerPlaybackState(libpry.AutoTree):

        assert not s.next_flow(r)

    def test_load_with_nopop(self):
        r = tutils.tflow_full()
        r.request.headers["key"] = ["one"]

        r2 = tutils.tflow_full()
        r2.request.headers["key"] = ["two"]

        s = flow.ServerPlaybackState(None, [r, r2], False, True)

        assert s.count() == 2
        n = s.next_flow(r)
        assert s.count() == 2

class uFlow(libpry.AutoTree):
    def test_copy(self):
@@ -266,6 +278,24 @@ class uFlow(libpry.AutoTree):
        f.replace("error", "bar")
        assert f.error.msg == "bar"

    def test_replace_encoded(self):
        f = tutils.tflow_full()
        f.request.content = "afoob"
        f.request.encode("gzip")
        f.response.content = "afoob"
        f.response.encode("gzip")

        f.replace("foo", "bar")

        assert f.request.content != "abarb"
        f.request.decode()
        assert f.request.content == "abarb"

        assert f.response.content != "abarb"
        f.response.decode()
        assert f.response.content == "abarb"


class uState(libpry.AutoTree):
    def test_backup(self):
@@ -547,7 +577,7 @@ class uFlowMaster(libpry.AutoTree):
        f = tutils.tflow_full()
        pb = [tutils.tflow_full(), f]
        fm = flow.FlowMaster(None, s)
        assert not fm.start_server_playback(pb, False, [], False)
        assert not fm.start_server_playback(pb, False, [], False, False)
        assert not fm.start_client_playback(pb, False)

        q = Queue.Queue()
@@ -568,10 +598,10 @@ class uFlowMaster(libpry.AutoTree):
        fm.refresh_server_playback = True
        assert not fm.do_server_playback(tutils.tflow())

        fm.start_server_playback(pb, False, [], False)
        fm.start_server_playback(pb, False, [], False, False)
        assert fm.do_server_playback(tutils.tflow())

        fm.start_server_playback(pb, False, [], True)
        fm.start_server_playback(pb, False, [], True, False)
        r = tutils.tflow()
        r.request.content = "gibble"
        assert not fm.do_server_playback(r)
@@ -766,7 +796,7 @@ class uResponse(libpry.AutoTree):
        h["test"] = ["test"]
        c = flow.ClientConnect(("addr", 2222))
        req = flow.Request(c, "host", 22, "https", "GET", "/", h, "content")
        resp = flow.Response(req, 200, "msg", h.copy(), "content")
        resp = flow.Response(req, 200, "msg", h.copy(), "content", None)
        assert resp._assemble()

        resp2 = resp.copy()
@@ -805,18 +835,25 @@ class uResponse(libpry.AutoTree):
        c = "MOO=BAR; Expires=Tue, 08-Mar-2011 00:20:38 GMT; Path=foo.com; Secure"
        assert "00:21:38" in r._refresh_cookie(c, 60)

    def test_get_cert(self):
        req = tutils.treq()
        resp = flow.Response(req, 200, "msg", flow.ODictCaseless(), "content", file("data/dercert").read())
        assert resp.get_cert()

        resp = tutils.tresp()
        assert not resp.get_cert()

    def test_getset_state(self):
        h = flow.ODictCaseless()
        h["test"] = ["test"]
        c = flow.ClientConnect(("addr", 2222))
        req = flow.Request(c, "host", 22, "https", "GET", "/", h, "content")
        resp = flow.Response(req, 200, "msg", h.copy(), "content")
        resp = flow.Response(req, 200, "msg", h.copy(), "content", None)

        state = resp._get_state()
        assert flow.Response._from_state(req, state) == resp

        resp2 = flow.Response(req, 220, "foo", h.copy(), "test")
        resp2 = flow.Response(req, 220, "foo", h.copy(), "test", None)
        assert not resp == resp2
        resp._load_state(resp2._get_state())
        assert resp == resp2
@@ -886,8 +923,9 @@ class uClientConnect(libpry.AutoTree):
        c2 = flow.ClientConnect(("a", 25))
        assert not c == c2

        c2.requestcount = 99
        c._load_state(c2._get_state())
        assert c == c2
        assert c.requestcount == 99

        c3 = c.copy()
        assert c3 == c
@@ -971,6 +1009,11 @@ class uODict(libpry.AutoTree):
            ["two", "vun"],
        ]

    def test_get(self):
        self.od.add("one", "two")
        assert self.od.get("one") == ["two"]
        assert self.od.get("two") == None


class uODictCaseless(libpry.AutoTree):
    def setUp(self):
@@ -984,8 +1027,76 @@ class uODictCaseless(libpry.AutoTree):
        assert len(self.od) == 1


class udecoded(libpry.AutoTree):
    def test_del(self):
        r = tutils.treq()
        assert r.content == "content"
        assert not r.headers["content-encoding"]
        r.encode("gzip")
        assert r.headers["content-encoding"]
        assert r.content != "content"
        with flow.decoded(r):
            assert not r.headers["content-encoding"]
            assert r.content == "content"
        assert r.headers["content-encoding"]
        assert r.content != "content"

        with flow.decoded(r):
            r.content = "foo"

        assert r.content != "foo"
        r.decode()
        assert r.content == "foo"


class uReplaceHooks(libpry.AutoTree):
    def test_add_remove(self):
        h = flow.ReplaceHooks()
        h.add("~q", "foo", "bar")
        assert h.lst
        h.remove("~q", "foo", "bar")
        assert not h.lst

        h.add("~q", "foo", "bar")
        h.add("~s", "foo", "bar")

        v = h.get_specs()
        assert v == [('~q', 'foo', 'bar'), ('~s', 'foo', 'bar')]
        assert h.count() == 2
        h.remove("~q", "foo", "bar")
        assert h.count() == 1
        h.remove("~q", "foo", "bar")
        assert h.count() == 1
        h.clear()
        assert h.count() == 0

        f = tutils.tflow()
        f.request.content = "foo"
        h.add("~s", "foo", "bar")
        h.run(f)
        assert f.request.content == "foo"

        f = tutils.tflow_full()
        f.request.content = "foo"
        f.response.content = "foo"
        h.run(f)
        assert f.response.content == "bar"
        assert f.request.content == "foo"

        f = tutils.tflow()
        h.clear()
        h.add("~q", "foo", "bar")
        f.request.content = "foo"
        h.run(f)
        assert f.request.content == "bar"

        assert not h.add("~", "foo", "bar")
        assert not h.add("foo", "*", "bar")


tests = [
    uReplaceHooks(),
    uStickyCookieState(),
    uStickyAuthState(),
    uServerPlaybackState(),
@@ -1000,4 +1111,5 @@ tests = [
    uClientConnect(),
    uODict(),
    uODictCaseless(),
    udecoded()
]
@@ -1,4 +1,4 @@
import textwrap, os, re, json
import textwrap, re, json
import libpry
from libmproxy import utils
@@ -54,66 +54,10 @@ class upretty_size(libpry.AutoTree):

class uData(libpry.AutoTree):
    def test_nonexistent(self):
        assert utils.pkg_data.path("console")
        libpry.raises("does not exist", utils.pkg_data.path, "nonexistent")


class upretty_xmlish(libpry.AutoTree):
    def test_tagre(self):
        def f(s):
            return re.search(utils.TAG, s, re.VERBOSE|re.MULTILINE)
        assert f(r"<body>")
        assert f(r"<body/>")
        assert f(r"< body/>")
        assert f(r"< body/ >")
        assert f(r"< body / >")
        assert f(r"<foo a=b>")
        assert f(r"<foo a='b'>")
        assert f(r"<foo a='b\"'>")
        assert f(r'<a b=(a.b) href="foo">')
        assert f('<td width=25%>')
        assert f('<form name="search" action="/search.php" method="get" accept-charset="utf-8" class="search">')
        assert f('<img src="gif" width="125" height="16" alt="" />')

    def test_all(self):
        def isbalanced(ret):
            # The last tag should have no indent
            assert ret[-1].strip() == ret[-1]

        s = "<html><br><br></br><p>one</p></html>"
        ret = utils.pretty_xmlish(s)
        isbalanced(ret)

        s = r"""
            <body bgcolor=#ffffff text=#000000 link=#0000cc vlink=#551a8b alink=#ff0000 onload="document.f.q.focus();if(document.images)new Image().src='/images/srpr/nav_logo27.png'" ><textarea id=csi style=display:none></textarea></body>
        """
        isbalanced(utils.pretty_xmlish(textwrap.dedent(s)))

        s = r"""
            <a href="http://foo.com" target="">
                <img src="http://foo.gif" alt="bar" height="25" width="132">
            </a>
        """
        isbalanced(utils.pretty_xmlish(textwrap.dedent(s)))

        s = r"""
            <!DOCTYPE html PUBLIC \"-//W3C//DTD XHTML 1.0 Strict//EN\"
            \"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd\">
            <html></html>
        """
        ret = utils.pretty_xmlish(textwrap.dedent(s))
        isbalanced(ret)

        s = "<html><br/><p>one</p></html>"
        ret = utils.pretty_xmlish(s)
        assert len(ret) == 6
        isbalanced(ret)

        s = "gobbledygook"
        assert utils.pretty_xmlish(s) == ["gobbledygook"]


class upretty_json(libpry.AutoTree):
    def test_one(self):
        s = json.dumps({"foo": 1})
@@ -127,48 +71,6 @@ class u_urldecode(libpry.AutoTree):
        assert len(utils.urldecode(s)) == 2


class udummy_ca(libpry.AutoTree):
    def test_all(self):
        d = self.tmpdir()
        path = os.path.join(d, "foo/cert.cnf")
        assert utils.dummy_ca(path)
        assert os.path.exists(path)

        path = os.path.join(d, "foo/cert2.pem")
        assert utils.dummy_ca(path)
        assert os.path.exists(path)
        assert os.path.exists(os.path.join(d, "foo/cert2-cert.pem"))
        assert os.path.exists(os.path.join(d, "foo/cert2-cert.p12"))


class udummy_cert(libpry.AutoTree):
    def test_with_ca(self):
        d = self.tmpdir()
        cacert = os.path.join(d, "foo/cert.cnf")
        assert utils.dummy_ca(cacert)
        p = utils.dummy_cert(
            os.path.join(d, "foo"),
            cacert,
            "foo.com"
        )
        assert os.path.exists(p)
        # Short-circuit
        assert utils.dummy_cert(
            os.path.join(d, "foo"),
            cacert,
            "foo.com"
        )

    def test_no_ca(self):
        d = self.tmpdir()
        p = utils.dummy_cert(
            d,
            None,
            "foo.com"
        )
        assert os.path.exists(p)


class uLRUCache(libpry.AutoTree):
    def test_one(self):
        class Foo:
@@ -255,19 +157,37 @@ class u_parse_size(libpry.AutoTree):
        libpry.raises(ValueError, utils.parse_size, "ak")


class u_parse_content_type(libpry.AutoTree):
    def test_simple(self):
        p = utils.parse_content_type
        assert p("text/html") == ("text", "html", {})
        assert p("text") == None

        v = p("text/html; charset=UTF-8")
        assert v == ('text', 'html', {'charset': 'UTF-8'})


class u_cleanBin(libpry.AutoTree):
    def test_simple(self):
        assert utils.cleanBin("one") == "one"
        assert utils.cleanBin("\00ne") == ".ne"
        assert utils.cleanBin("\nne") == "\nne"
        assert utils.cleanBin("\nne", True) == ".ne"


tests = [
    u_cleanBin(),
    u_parse_content_type(),
    uformat_timestamp(),
    uisBin(),
    uisXML(),
    uhexdump(),
    upretty_size(),
    uData(),
    upretty_xmlish(),
    upretty_json(),
    u_urldecode(),
    udel_all(),
    udummy_ca(),
    udummy_cert(),
    uLRUCache(),
    u_parse_url(),
    u_parse_proxy_spec(),
9
test/tools/getcert
Executable file
@@ -0,0 +1,9 @@
#!/usr/bin/env python
import sys
sys.path.insert(0, "../..")
import socket, tempfile, ssl, subprocess

addr = socket.gethostbyname(sys.argv[1])
print ssl.get_server_certificate((addr, 443))
17
test/tools/getcn
Executable file
@@ -0,0 +1,17 @@
#!/usr/bin/env python
import sys
sys.path.insert(0, "../../")
from libmproxy import certutils

if len(sys.argv) > 2:
    port = int(sys.argv[2])
else:
    port = 443

cn, san = certutils.get_remote_cn(sys.argv[1], port)
print cn
if san:
    for i in san:
        print "\t", i
@@ -17,7 +17,7 @@ def tresp(req=None):
        req = treq()
    headers = flow.ODictCaseless()
    headers["header_response"] = ["svalue"]
    return flow.Response(req, 200, "message", headers, "content_response")
    return flow.Response(req, 200, "message", headers, "content_response", None)


def tflow():
13
todo
@@ -2,20 +2,21 @@ This is a loose collection of todo items, in case someone else wants to start
hacking on mitmproxy. Drop me a line (aldo@corte.si) if you want to tackle any
of these and need some pointers.

Targeted for 0.8:
Targeted for 0.9:
- White-background colorscheme
- Extra content view modules: CSS indenter, Flash SWF info extractor
- Upstream proxy support.
- Improve worst-case performance problem with XML-ish indenter
- Follow mode to keep most recent flow in view
- Rewrite the core to be asynchronous. I've done some research, and
  although it's a bit of a bloated monster, it looks like Twisted is the way
  to go.
- Verbose view to show timestamps
- Search within requests/responses
- Transparent proxy support
- Ordering a-la mutt's "o" shortcut


Further ideas:
Future:
- Rewrite the core to be asynchronous. I've done some research, and
  although it's a bit of a bloated monster, it looks like Twisted is the way
  to go.
- Add some "workspace" features to mitmproxy:
  - Flow comments
  - Copying/duplicating flows