Compare commits


8 Commits

Author SHA1 Message Date
Maximilian Hils      14b33dca5d  bump version                                     2017-04-28 14:21:18 +02:00
Maximilian Hils      160a225218  update CHANGELOG                                 2017-04-28 14:21:11 +02:00
Thomas Kriechbaumer  2ba3f41b04  http2: normalize headers before sending          2017-04-28 14:19:28 +02:00
Maximilian Hils      c1743e169b  fix mitmweb's content security policy            2017-04-28 14:16:17 +02:00
Thomas Kriechbaumer  5e99691e2c  bump version to 2.0.1                            2017-03-15 19:44:08 +01:00
Thomas Kriechbaumer  c3fa3acd95  http2: fix h2 header normalization test (#2080)  2017-03-15 19:24:08 +01:00
Thomas Kriechbaumer  c6c3b8f447  bump cryptography                                2017-03-15 19:05:11 +01:00
Thomas Kriechbaumer  ac871b5874  bump pyparsing                                   2017-03-15 19:04:59 +01:00
365 changed files with 25179 additions and 31964 deletions

.gitignore

@@ -11,7 +11,6 @@ MANIFEST
.cache/
.tox*/
build/
mitmproxy/contrib/kaitaistruct/*.ksy
# UI
@@ -22,4 +21,3 @@ sslkeylogfile.log
.tox/
.python-version
coverage.xml
web/coverage/


@@ -47,19 +47,6 @@ matrix:
env: TOXENV=individual_coverage
- python: 3.5
env: TOXENV=docs
- language: node_js
node_js: "node"
before_install:
- curl -o- -L https://yarnpkg.com/install.sh | bash
- export PATH=$HOME/.yarn/bin:$PATH
install:
- cd web && yarn
- yarn global add codecov
script: npm test && codecov
cache:
yarn: true
directories:
- web/node_modules
install:
- |


@@ -62,7 +62,7 @@ Development Setup
To get started hacking on mitmproxy, please follow the `advanced installation`_ steps to install mitmproxy from source, but stop right before running ``pip3 install mitmproxy``. Instead, do the following:
.. code-block:: bash
.. code-block:: text
git clone https://github.com/mitmproxy/mitmproxy.git
cd mitmproxy
@@ -80,7 +80,7 @@ The main executables for the project - ``mitmdump``, ``mitmproxy``,
virtualenv. After activating the virtualenv, they will be on your $PATH, and
you can run them like any other command:
.. code-block:: bash
.. code-block:: text
. venv/bin/activate # "venv\Scripts\activate" on Windows
mitmdump --version
@@ -91,13 +91,13 @@ Testing
If you've followed the procedure above, you already have all the development
requirements installed, and you can run the full test suite (including tests for code style and documentation) with tox_:
.. code-block:: bash
.. code-block:: text
tox
For speedier testing, we recommend you run `pytest`_ directly on individual test files or folders:
.. code-block:: bash
.. code-block:: text
cd test/mitmproxy/addons
pytest --cov mitmproxy.addons.anticache --looponfail test_anticache.py
@@ -114,7 +114,7 @@ The mitmproxy documentation is build using Sphinx_, which is installed
automatically if you set up a development environment as described above. After
installation, you can render the documentation like this:
.. code-block:: bash
.. code-block:: text
cd docs
make clean
@@ -136,7 +136,7 @@ This is automatically enforced on every PR. If we detect a linting error, the
PR checks will fail and block merging. You can run our lint checks yourself
with the following command:
.. code-block:: bash
.. code-block:: text
tox -e lint


@@ -24,9 +24,6 @@ something like this:
Click on the relevant icon, follow the setup instructions for the platform
you're on and you are good to go.
For iOS version 10.3 or up, you need to make sure ``mitmproxy`` is enabled in
``Certificate Trust Settings``, you can check it by going to
``Settings > General > About > Certificate Trust Settings``.
Installing the mitmproxy CA certificate manually
------------------------------------------------
@@ -45,7 +42,7 @@ iOS
See http://jasdev.me/intercepting-ios-traffic
and https://web.archive.org/web/20150920082614/http://kb.mit.edu/confluence/pages/viewpage.action?pageId=152600377
and http://web.archive.org/web/20150920082614/http://kb.mit.edu/confluence/pages/viewpage.action?pageId=152600377
iOS Simulator
^^^^^^^^^^^^^
@@ -55,7 +52,7 @@ See https://github.com/ADVTOOLS/ADVTrustStore#how-to-use-advtruststore
Java
^^^^
See https://docs.oracle.com/cd/E19906-01/820-4916/geygn/index.html
See http://docs.oracle.com/cd/E19906-01/820-4916/geygn/index.html
Android/Android Simulator
^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -65,7 +62,7 @@ See http://wiki.cacert.org/FAQ/ImportRootCert#Android_Phones_.26_Tablets
Windows
^^^^^^^
See https://web.archive.org/web/20160612045445/http://windows.microsoft.com/en-ca/windows/import-export-certificates-private-keys#1TC=windows-7
See http://windows.microsoft.com/en-ca/windows/import-export-certificates-private-keys#1TC=windows-7
Windows (automated)
^^^^^^^^^^^^^^^^^^^
@@ -82,7 +79,7 @@ See https://support.apple.com/kb/PH7297?locale=en_US
Ubuntu/Debian
^^^^^^^^^^^^^
See https://askubuntu.com/questions/73287/how-do-i-install-a-root-certificate/94861#94861
See http://askubuntu.com/questions/73287/how-do-i-install-a-root-certificate/94861#94861
Mozilla Firefox
^^^^^^^^^^^^^^^
@@ -92,7 +89,7 @@ See https://wiki.mozilla.org/MozillaRootCertificate#Mozilla_Firefox
Chrome on Linux
^^^^^^^^^^^^^^^
See https://stackoverflow.com/a/15076602/198996
See https://code.google.com/p/chromium/wiki/LinuxCertManagement
The mitmproxy certificate authority
@@ -135,7 +132,7 @@ mitmproxy-ca-cert.cer Same file as .pem, but with an extension expected by some
Using a custom certificate
--------------------------
You can use your own (leaf) certificate by passing the ``--cert [domain=]path_to_certificate`` option to
You can use your own certificate by passing the ``--cert [domain=]path_to_certificate`` option to
mitmproxy. Mitmproxy then uses the provided certificate for interception of the
specified domain instead of generating a certificate signed by its own CA.
@@ -208,4 +205,4 @@ directory and uses this as the client cert.
.. _Certificate Pinning: https://security.stackexchange.com/questions/29988/what-is-certificate-pinning/
.. _Certificate Pinning: http://security.stackexchange.com/questions/29988/what-is-certificate-pinning/


@@ -11,5 +11,5 @@ sure you capture an HTTP exchange in its totality. It's also often used during
================== ======================
command-line ``--anticache``
mitmproxy shortcut :kbd:`O` then :kbd:`a`
mitmproxy shortcut :kbd:`o` then :kbd:`a`
================== ======================


@@ -23,7 +23,7 @@ How it works
================== ======================
command-line ``--ignore regex``
mitmproxy shortcut :kbd:`O` then :kbd:`I`
mitmproxy shortcut :kbd:`o` then :kbd:`I`
================== ======================


@@ -48,24 +48,25 @@ In practice, it's pretty common for the replacement literal to be long and
complex. For instance, it might be an XSS exploit that weighs in at hundreds or
thousands of characters. To cope with this, there's a variation of the
replacement hook specifier that lets you load the replacement text from a file.
To specify a file as replacement, prefix the file path with ``@``.
You might start **mitmdump** as follows:
So, you might start **mitmdump** as follows:
>>> mitmdump --replacements :~q:foo:@~/xss-exploit
>>> mitmdump --replace-from-file :~q:foo:~/xss-exploit
This will load the replacement text from the file ``~/xss-exploit``.
The ``--replacements`` flag can be passed multiple times.
Both the ``--replace`` and ``--replace-from-file`` flags can be passed multiple
times.
Interactively
-------------
The :kbd:`R` shortcut key in the mitmproxy options menu (:kbd:`O`) lets you add and edit
The :kbd:`R` shortcut key in the mitmproxy options menu (:kbd:`o`) lets you add and edit
replacement hooks using a built-in editor. The context-sensitive help (:kbd:`?`) has
complete usage information.
================== =======================
command-line ``--replacements``
mitmproxy shortcut :kbd:`O` then :kbd:`R`
command-line ``--replace``,
``--replace-from-file``
mitmproxy shortcut :kbd:`o` then :kbd:`R`
================== =======================
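Both forms of the pattern above are three fields joined by a separator character that the pattern itself begins with (``:`` in the examples). As a rough illustration only, here is a minimal sketch of how such a pattern could be split; the function name and the ``@`` file convention follow the older ``--replacements`` form shown above, and this is not mitmproxy's actual parser:

```python
def parse_replacement(spec):
    """Split a replacement pattern of the form SEPfilterSEPregexSEPreplacement,
    where SEP is whatever character the pattern starts with."""
    sep = spec[0]
    parts = spec[1:].split(sep)
    if len(parts) != 3:
        raise ValueError("expected three %r-separated fields" % sep)
    flow_filter, regex, replacement = parts
    # In the older --replacements form, a replacement loaded from a
    # file is marked with a leading "@".
    from_file = replacement.startswith("@")
    return flow_filter, regex, replacement, from_file
```

For example, ``parse_replacement(":~q:foo:@~/xss-exploit")`` yields the filter ``~q``, the regex ``foo``, and a file-backed replacement.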


@@ -31,8 +31,7 @@ Host Header
In reverse proxy mode, mitmproxy automatically rewrites the Host header to match the
upstream server. This allows mitmproxy to easily connect to existing endpoints on the
open web (e.g. ``mitmproxy -R https://example.com``). You can disable this behaviour
by passing ``--keep-host-header`` on the console.
open web (e.g. ``mitmproxy -R https://example.com``).
However, keep in mind that absolute URLs within the returned document or HTTP redirects will
NOT be rewritten by mitmproxy. This means that if you click on a link for "http://example.com"
@@ -40,4 +39,4 @@ in the returned web page, you will be taken directly to that URL, bypassing mitm
One possible way to address this is to modify the hosts file of your OS so that "example.com"
resolves to your proxy's IP, and then access the proxy by going directly to example.com.
Make sure that your proxy can still resolve the original IP, or specify an IP in mitmproxy.
Make sure that your proxy can still resolve the original IP, or specify an IP in mitmproxy.
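The default Host rewrite described above can be sketched as follows. This is a simplified model, not mitmproxy's internal API; the ``keep_host_header`` flag here is illustrative, mirroring the ``--keep-host-header`` option mentioned in the diff:

```python
from urllib.parse import urlparse

def outgoing_host_header(upstream_url, client_host_header, keep_host_header=False):
    """Pick the Host header for the upstream request in reverse proxy mode.

    By default the header is rewritten to the upstream server's host;
    the (illustrative) keep_host_header switch preserves the client's value.
    """
    if keep_host_header and client_host_header:
        return client_host_header
    return urlparse(upstream_url).netloc
```

So a client request carrying ``Host: localhost:8080`` is forwarded with ``Host: example.com`` when proxying for ``https://example.com``, unless rewriting is disabled.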


@@ -31,7 +31,7 @@ in the past at the time of replay, and vice versa. Cookie expiry times are
updated in a similar way.
You can turn off response refreshing using the ``--norefresh`` argument, or using
the :kbd:`O` options shortcut within :program:`mitmproxy`.
the :kbd:`o` options shortcut within :program:`mitmproxy`.
Replaying a session recorded in Reverse-proxy Mode


@@ -15,5 +15,5 @@ Example: Set the **Host** header to "example.com" for all requests.
================== =======================
command-line ``--setheader PATTERN``
mitmproxy shortcut :kbd:`O` then :kbd:`H`
mitmproxy shortcut :kbd:`o` then :kbd:`H`
================== =======================


@@ -22,7 +22,7 @@ to interact with the secured resources.
================== ======================
command-line ``-t FILTER``
mitmproxy shortcut :kbd:`O` then :kbd:`t`
mitmproxy shortcut :kbd:`o` then :kbd:`t`
================== ======================
@@ -37,5 +37,5 @@ replay of HTTP Digest authentication.
================== ======================
command-line ``-u FILTER``
mitmproxy shortcut :kbd:`O` then :kbd:`A`
mitmproxy shortcut :kbd:`o` then :kbd:`A`
================== ======================


@@ -19,7 +19,7 @@ How it works
================== ======================
command-line ``--tcp HOST``
mitmproxy shortcut :kbd:`O` then :kbd:`T`
mitmproxy shortcut :kbd:`o` then :kbd:`T`
================== ======================
For a detailed description of how the hostname pattern works, please look at the :ref:`passthrough`


@@ -19,5 +19,5 @@ Upstream cert sniffing is on by default, and can optionally be turned off.
================== ======================
command-line ``--no-upstream-cert``
mitmproxy shortcut :kbd:`O` then :kbd:`U`
mitmproxy shortcut :kbd:`o` then :kbd:`U`
================== ======================


@@ -43,7 +43,7 @@ client connects to the proxy and makes a request that looks like this:
CONNECT example.com:443 HTTP/1.1
A conventional proxy can neither view nor manipulate a TLS-encrypted data
A conventional proxy can neither view nor manipulate an TLS-encrypted data
stream, so a CONNECT request simply asks the proxy to open a pipe between the
client and server. The proxy here is just a facilitator - it blindly forwards
data in both directions without knowing anything about the contents. The
@@ -63,7 +63,7 @@ exactly this attack, by allowing a trusted third-party to cryptographically sign
a server's certificates to verify that they are legit. If this signature doesn't
match or is from a non-trusted party, a secure client will simply drop the
connection and refuse to proceed. Despite the many shortcomings of the CA system
as it exists today, this is usually fatal to attempts to MITM a TLS connection
as it exists today, this is usually fatal to attempts to MITM an TLS connection
for analysis. Our answer to this conundrum is to become a trusted Certificate
Authority ourselves. Mitmproxy includes a full CA implementation that generates
interception certificates on the fly. To get the client to trust these
@@ -143,7 +143,7 @@ Lets put all of this together into the complete explicitly proxied HTTPS flow.
2. Mitmproxy responds with a ``200 Connection Established``, as if it has set up the CONNECT pipe.
3. The client believes it's talking to the remote server, and initiates the TLS connection.
It uses SNI to indicate the hostname it is connecting to.
4. Mitmproxy connects to the server, and establishes a TLS connection using the SNI hostname
4. Mitmproxy connects to the server, and establishes an TLS connection using the SNI hostname
indicated by the client.
5. The server responds with the matching certificate, which contains the CN and SAN values
needed to generate the interception certificate.
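Step 5 is where the interception certificate gets its identity: the generated certificate must cover the upstream CN and SANs as well as the SNI hostname the client asked for in step 3. A toy sketch of collecting those names (illustrative only, not mitmproxy's certificate code):

```python
def interception_cert_names(sni, upstream_cn, upstream_sans):
    """Collect the names an on-the-fly interception certificate should
    cover: the upstream certificate's CN and SANs, plus the SNI hostname
    the client indicated. Order-preserving de-duplication."""
    names = []
    for name in [upstream_cn] + list(upstream_sans) + [sni]:
        if name and name not in names:
            names.append(name)
    return names
```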
@@ -217,7 +217,7 @@ explicit HTTPS connections to establish the CN and SANs, and cope with SNI.
destination was.
3. The client believes it's talking to the remote server, and initiates the TLS connection.
It uses SNI to indicate the hostname it is connecting to.
4. Mitmproxy connects to the server, and establishes a TLS connection using the SNI hostname
4. Mitmproxy connects to the server, and establishes an TLS connection using the SNI hostname
indicated by the client.
5. The server responds with the matching certificate, which contains the CN and SAN values
needed to generate the interception certificate.


@@ -110,25 +110,11 @@ libraries. This was tested on a fully patched installation of Fedora 24.
Make sure to have an up-to-date version of pip by running ``pip3 install -U pip``.
.. _install-source-opensuse:
Installation from Source on openSUSE
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This was tested on a fully patched installation of openSUSE Tumbleweed.
Please note that openSUSE Leap 42.2 only comes with Python 3.4.x, whereas mitmproxy requires Python 3.5 or above.
You can check your Python version by running ``python3 --version``.
.. code:: bash
sudo zypper install python3-pip python3-devel libffi-devel openssl-devel gcc-c++
sudo pip3 install mitmproxy
.. _install-source-windows:
Installation from Source on Windows
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
🐱💻 Installation from Source on Windows
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.. note::
Mitmproxy's console interface is not supported on Windows, but you can use


@@ -66,7 +66,7 @@ At the moment, the Grid Editor is used in four parts of mitmproxy:
- Editing request or response headers (:kbd:`e` for edit, then :kbd:`h` for headers in flow view)
- Editing a query string (:kbd:`e` for edit, then :kbd:`q` for query in flow view)
- Editing a URL-encoded form (:kbd:`e` for edit, then :kbd:`f` for form in flow view)
- Editing replacement patterns (:kbd:`O` for options, then :kbd:`R` for Replacement Patterns)
- Editing replacement patterns (:kbd:`o` for options, then :kbd:`R` for Replacement Patterns)
If there is no data, an empty editor will be started to let you add some.
Here is the editor showing the headers from a request:


@@ -23,7 +23,7 @@ HTTP Request
mode if the server responds correctly. Apart from that, websocket
requests are just like any other, and all aspects of the request
can be overridden.
* - h\ :ref:`VALUE`\ =\ :ref:`VALUE`\
* - h\:\ :ref:`VALUE`\ =\ :ref:`VALUE`\
- Set a header.
* - r
- Set the **raw** flag on this response. Pathod will not calculate a
@@ -73,7 +73,7 @@ HTTP Response
* - m\ :ref:`VALUE`
- HTTP Reason message. Automatically chosen according to the response
code if not specified. (HTTP/1 only)
* - h\ :ref:`VALUE`\ =\ :ref:`VALUE`\
* - h\:\ :ref:`VALUE`\ =\ :ref:`VALUE`\
- Set a header.
* - r
- Set the **raw** flag on this response. Pathod will not calculate a


@@ -29,12 +29,6 @@ will be added to all responses passing through the proxy:
>>> mitmdump -s add_header.py
Examples
--------
A collection of addons that demonstrate popular features can be found at :src:`examples/simple`.
Using classes
-------------
@@ -60,8 +54,24 @@ and is replaced by the class instance.
Handling arguments
------------------
Scripts can handle their own command-line arguments, just like any other Python
program. Let's build on the example above to do something slightly more
sophisticated - replace one value with another in all responses. Mitmproxy's
`HTTPRequest <api.html#mitmproxy.models.http.HTTPRequest>`_ and `HTTPResponse
<api.html#mitmproxy.models.http.HTTPResponse>`_ objects have a handy `replace
<api.html#mitmproxy.models.http.HTTPResponse.replace>`_ method that takes care
of all the details for us.
FIXME
.. literalinclude:: ../../examples/simple/script_arguments.py
:caption: :src:`examples/simple/script_arguments.py`
:language: python
We can now call this script on the command-line like this:
>>> mitmdump -dd -s "./script_arguments.py html faketml"
Whenever a handler is called, mitmproxy rewrites the script environment so that
it sees its own arguments as if it were invoked from the command-line.
Logging and the context


@@ -5,6 +5,7 @@
| change_upstream_proxy.py | Dynamically change the upstream proxy. |
| dns_spoofing.py | Use mitmproxy in a DNS spoofing scenario. |
| dup_and_replay.py | Duplicates each request, changes it, and then replays the modified request. |
| flowbasic.py | Basic use of mitmproxy's FlowMaster directly. |
| full_transparency_shim.c | Setuid wrapper that can be used to run mitmproxy in full transparency mode, as a normal user. |
| har_dump.py | Dump flows as HAR files. |
| mitmproxywrapper.py | Bracket mitmproxy run with proxy enable/disable on OS X |
@@ -15,4 +16,3 @@
| stream_modify.py | Modify a streamed response body. |
| tcp_message.py | Modify a raw TCP connection |
| tls_passthrough.py | Use conditional TLS interception based on a user-defined strategy. |
| xss_scanner.py | Scan all visited webpages. |


@@ -1,12 +1,11 @@
"""
This script makes it possible to use mitmproxy in scenarios where IP spoofing
has been used to redirect connections to mitmproxy. The way this works is that
we rely on either the TLS Server Name Indication (SNI) or the Host header of the
HTTP request. Of course, this is not foolproof - if an HTTPS connection comes
without SNI, we don't know the actual target and cannot construct a certificate
that looks valid. Similarly, if there's no Host header or a spoofed Host header,
we're out of luck as well. Using transparent mode is the better option most of
the time.
This script makes it possible to use mitmproxy in scenarios where IP spoofing has been used to redirect
connections to mitmproxy. The way this works is that we rely on either the TLS Server Name Indication (SNI) or the
Host header of the HTTP request.
Of course, this is not foolproof - if an HTTPS connection comes without SNI, we don't
know the actual target and cannot construct a certificate that looks valid.
Similarly, if there's no Host header or a spoofed Host header, we're out of luck as well.
Using transparent mode is the better option most of the time.
Usage:
mitmproxy
@@ -14,8 +13,6 @@ Usage:
-s dns_spoofing.py
# Used as the target location if neither SNI nor host header are present.
-R http://example.com/
# To avoid auto rewriting of host header by the reverse proxy target.
--keep-host-header
mitmdump
-p 80
-R http://localhost:443/
@@ -32,6 +29,13 @@ parse_host_header = re.compile(r"^(?P<host>[^:]+|\[.+\])(?::(?P<port>\d+))?$")
class Rerouter:
def requestheaders(self, flow):
"""
The original host header is retrieved early
before flow.request is replaced by mitmproxy new outgoing request
"""
flow.metadata["original_host"] = flow.request.host_header
def request(self, flow):
if flow.client_conn.ssl_established:
flow.request.scheme = "https"
@@ -42,7 +46,7 @@ class Rerouter:
sni = None
port = 80
host_header = flow.request.host_header
host_header = flow.metadata["original_host"]
m = parse_host_header.match(host_header)
if m:
host_header = m.group("host").strip("[]")
@@ -54,4 +58,5 @@ class Rerouter:
flow.request.port = port
addons = [Rerouter()]
def start():
return Rerouter()
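The host-header parsing that drives this script can be exercised on its own. This standalone sketch reuses the same regex from the script; ``split_host_header`` is an illustrative helper, not part of the actual example:

```python
import re

# The same host-header pattern the script compiles: a bare host or a
# bracketed IPv6 literal, optionally followed by ":port".
parse_host_header = re.compile(r"^(?P<host>[^:]+|\[.+\])(?::(?P<port>\d+))?$")

def split_host_header(value, default_port=80):
    """Split a Host header value into (host, port), stripping IPv6
    brackets and falling back to a default port."""
    m = parse_host_header.match(value)
    if not m:
        return None, default_port
    host = m.group("host").strip("[]")
    port = int(m.group("port")) if m.group("port") else default_port
    return host, port
```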


@@ -0,0 +1,43 @@
#!/usr/bin/env python3
"""
This example shows how to build a proxy based on mitmproxy's Flow
primitives.
Heads Up: In the majority of cases, you want to use inline scripts.
Note that request and response messages are not automatically replied to,
so we need to implement handlers to do this.
"""
from mitmproxy import controller, options, master
from mitmproxy.proxy import ProxyServer, ProxyConfig
class MyMaster(master.Master):
def run(self):
try:
master.Master.run(self)
except KeyboardInterrupt:
self.shutdown()
@controller.handler
def request(self, f):
print("request", f)
@controller.handler
def response(self, f):
print("response", f)
@controller.handler
def error(self, f):
print("error", f)
@controller.handler
def log(self, l):
print("log", l.msg)
opts = options.Options(cadir="~/.mitmproxy/")
config = ProxyConfig(opts)
server = ProxyServer(config)
m = MyMaster(opts, server)
m.run()


@@ -4,17 +4,17 @@ This inline script can be used to dump flows as HAR files.
import json
import sys
import base64
import zlib
import os
from datetime import datetime
from datetime import timezone
import pytz
import mitmproxy
from mitmproxy import version
from mitmproxy import ctx
from mitmproxy.utils import strutils
from mitmproxy.net.http import cookies
@@ -25,13 +25,17 @@ HAR = {}
SERVERS_SEEN = set()
def load(l):
l.add_option(
"hardump", str, "", "HAR dump path.",
)
def start():
"""
Called once on script startup before any other events.
"""
if len(sys.argv) != 2:
raise ValueError(
'Usage: -s "har_dump.py filename" '
'(- will output to stdout, filenames ending with .zhar '
'will result in compressed har)'
)
def configure(updated):
HAR.update({
"log": {
"version": "1.2",
@@ -85,7 +89,7 @@ def response(flow):
# Timings set to -1 will be ignored as per spec.
full_time = sum(v for v in timings.values() if v > -1)
started_date_time = datetime.fromtimestamp(flow.request.timestamp_start, timezone.utc).isoformat()
started_date_time = format_datetime(datetime.utcfromtimestamp(flow.request.timestamp_start))
# Response body size and encoding
response_body_size = len(flow.response.raw_content)
@@ -143,7 +147,7 @@ def response(flow):
}
if flow.server_conn.connected():
entry["serverIPAddress"] = str(flow.server_conn.ip_address[0])
entry["serverIPAddress"] = str(flow.server_conn.ip_address.address[0])
HAR["log"]["entries"].append(entry)
@@ -152,20 +156,25 @@ def done():
"""
Called once on script shutdown, after any other events.
"""
if ctx.options.hardump:
json_dump = json.dumps(HAR, indent=2) # type: str
dump_file = sys.argv[1]
if ctx.options.hardump == '-':
mitmproxy.ctx.log(json_dump)
else:
raw = json_dump.encode() # type: bytes
if ctx.options.hardump.endswith('.zhar'):
raw = zlib.compress(raw, 9)
json_dump = json.dumps(HAR, indent=2) # type: str
with open(os.path.expanduser(ctx.options.hardump), "wb") as f:
f.write(raw)
if dump_file == '-':
mitmproxy.ctx.log(json_dump)
else:
raw = json_dump.encode() # type: bytes
if dump_file.endswith('.zhar'):
raw = zlib.compress(raw, 9)
mitmproxy.ctx.log("HAR dump finished (wrote %s bytes to file)" % len(json_dump))
with open(os.path.expanduser(dump_file), "wb") as f:
f.write(raw)
mitmproxy.ctx.log("HAR dump finished (wrote %s bytes to file)" % len(json_dump))
def format_datetime(dt):
return dt.replace(tzinfo=pytz.timezone("UTC")).isoformat()
def format_cookies(cookie_list):
@@ -189,7 +198,7 @@ def format_cookies(cookie_list):
# Expiration time needs to be formatted
expire_ts = cookies.get_expiration_ts(attrs)
if expire_ts is not None:
cookie_har["expires"] = datetime.fromtimestamp(expire_ts, timezone.utc).isoformat()
cookie_har["expires"] = format_datetime(datetime.fromtimestamp(expire_ts))
rv.append(cookie_har)
@@ -201,7 +210,7 @@ def format_request_cookies(fields):
def format_response_cookies(fields):
return format_cookies((c[0], c[1][0], c[1][1]) for c in fields)
return format_cookies((c[0], c[1].value, c[1].attrs) for c in fields)
def name_value(obj):

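One of the changes in this diff replaces the ``pytz``-based ``format_datetime`` helper with a direct timezone-aware ``datetime`` call. The new behaviour can be shown in isolation with the standard library alone (``har_datetime`` is an illustrative name, not from the script):

```python
from datetime import datetime, timezone

def har_datetime(ts):
    """Render a UNIX timestamp as ISO 8601 with an explicit UTC offset,
    the format HAR consumers expect for startedDateTime fields; this
    mirrors datetime.fromtimestamp(ts, timezone.utc).isoformat()
    as used in the diff."""
    return datetime.fromtimestamp(ts, timezone.utc).isoformat()
```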

@@ -14,6 +14,6 @@ Usage:
"""
def load(l):
def start():
import pydevd
pydevd.settrace("localhost", port=5678, stdoutToServer=True, stderrToServer=True)


@@ -23,10 +23,10 @@ Authors: Maximilian Hils, Matthew Tuusberg
import collections
import random
import sys
from enum import Enum
import mitmproxy
from mitmproxy import ctx
from mitmproxy.exceptions import TlsProtocolException
from mitmproxy.proxy.protocol import TlsLayer, RawTCPLayer
@@ -112,16 +112,10 @@ class TlsFeedback(TlsLayer):
tls_strategy = None
def load(l):
l.add_option(
"tlsstrat", int, 0, "TLS passthrough strategy (0-100)",
)
def configure(updated):
def start():
global tls_strategy
if ctx.options.tlsstrat > 0:
tls_strategy = ProbabilisticStrategy(float(ctx.options.tlsstrat) / 100.0)
if len(sys.argv) == 2:
tls_strategy = ProbabilisticStrategy(float(sys.argv[1]))
else:
tls_strategy = ConservativeStrategy()
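The two strategies referenced above can be sketched in miniature. This is a simplified stand-in: the script's real ``ConservativeStrategy`` tracks past TLS failures per host, while this toy version just never intercepts; ``tlsstrat`` is the option from the diff, expressed here as a probability:

```python
import random

class ConservativeStrategy:
    """Simplified stand-in: never intercept. (The real strategy retries
    interception and records hosts where TLS handshakes failed.)"""
    def should_intercept(self, server_address):
        return False

class ProbabilisticStrategy:
    """Intercept a fixed fraction of connections, as configured by the
    tlsstrat percentage option in the diff above."""
    def __init__(self, p):
        self.p = p
    def should_intercept(self, server_address):
        return random.uniform(0, 1) < self.p
```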


@@ -1,407 +0,0 @@
"""
__ __ _____ _____ _____
\ \ / // ____/ ____| / ____|
\ V /| (___| (___ | (___ ___ __ _ _ __ _ __ ___ _ __
> < \___ \\___ \ \___ \ / __/ _` | '_ \| '_ \ / _ \ '__|
/ . \ ____) |___) | ____) | (_| (_| | | | | | | | __/ |
/_/ \_\_____/_____/ |_____/ \___\__,_|_| |_|_| |_|\___|_|
This script automatically scans all visited webpages for XSS and SQLi vulnerabilities.
Usage: mitmproxy -s xss_scanner.py
This script scans for vulnerabilities by injecting a fuzzing payload (see PAYLOAD below) into 4 different places
and examining the HTML to look for XSS and SQLi injection vulnerabilities. The XSS scanning functionality works by
looking to see whether it is possible to inject HTML based off of where the payload appears in the page and what
characters are escaped. In addition, it also looks for any script tags that load javascript from unclaimed domains.
The SQLi scanning functionality works by using regular expressions to look for errors from a number of different
common databases. Since it is only looking for errors, it will not find blind SQLi vulnerabilities.
The 4 places it injects the payload into are:
1. URLs (e.g. https://example.com/ -> https://example.com/PAYLOAD/)
2. Queries (e.g. https://example.com/index.html?a=b -> https://example.com/index.html?a=PAYLOAD)
3. Referers (e.g. The referer changes from https://example.com to PAYLOAD)
4. User Agents (e.g. The UA changes from Chrome to PAYLOAD)
Reports from this script show up in the event log (viewable by pressing e) and formatted like:
===== XSS Found ====
XSS URL: http://daviddworken.com/vulnerableUA.php
Injection Point: User Agent
Suggested Exploit: <script>alert(0)</script>
Line: 1029zxcs'd"ao<ac>so[sb]po(pc)se;sl/bsl\eq=3847asd
"""
from mitmproxy import ctx
from socket import gaierror, gethostbyname
from urllib.parse import urlparse
import requests
import re
from html.parser import HTMLParser
from mitmproxy import http
from typing import Dict, Union, Tuple, Optional, List, NamedTuple
# The actual payload is put between a frontWall and a backWall to make it easy
# to locate the payload with regular expressions
FRONT_WALL = b"1029zxc"
BACK_WALL = b"3847asd"
PAYLOAD = b"""s'd"ao<ac>so[sb]po(pc)se;sl/bsl\\eq="""
FULL_PAYLOAD = FRONT_WALL + PAYLOAD + BACK_WALL
# A XSSData is a named tuple with the following fields:
# - url -> str
# - injection_point -> str
# - exploit -> str
# - line -> str
XSSData = NamedTuple('XSSData', [('url', str),
('injection_point', str),
('exploit', str),
('line', str)])
# A SQLiData is a named tuple with the following fields:
# - url -> str
# - injection_point -> str
# - regex -> str
# - dbms -> str
SQLiData = NamedTuple('SQLiData', [('url', str),
('injection_point', str),
('regex', str),
('dbms', str)])
VulnData = Tuple[Optional[XSSData], Optional[SQLiData]]
Cookies = Dict[str, str]
def get_cookies(flow: http.HTTPFlow) -> Cookies:
""" Return a dict going from cookie names to cookie values
- Note that it includes both the cookies sent in the original request and
the cookies sent by the server """
return {name: value for name, value in flow.request.cookies.fields}
def find_unclaimed_URLs(body: Union[str, bytes], requestUrl: bytes) -> None:
""" Look for unclaimed URLs in script tags and log them if found"""
class ScriptURLExtractor(HTMLParser):
script_URLs = []
def handle_starttag(self, tag, attrs):
if tag == "script" and "src" in [name for name, value in attrs]:
for name, value in attrs:
if name == "src":
self.script_URLs.append(value)
parser = ScriptURLExtractor()
try:
parser.feed(body)
except TypeError:
parser.feed(body.decode('utf-8'))
for url in parser.script_URLs:
parser = urlparse(url)
domain = parser.netloc
try:
gethostbyname(domain)
except gaierror:
ctx.log.error("XSS found in %s due to unclaimed URL \"%s\" in script tag." % (requestUrl, url))
def test_end_of_URL_injection(original_body: str, request_URL: str, cookies: Cookies) -> VulnData:
""" Test the given URL for XSS via injection onto the end of the URL and
log the XSS if found """
parsed_URL = urlparse(request_URL)
path = parsed_URL.path
if path != "" and path[-1] != "/": # ensure the path ends in a /
path += "/"
path += FULL_PAYLOAD.decode('utf-8') # the path must be a string while the payload is bytes
url = parsed_URL._replace(path=path).geturl()
body = requests.get(url, cookies=cookies).text.lower()
xss_info = get_XSS_data(body, url, "End of URL")
sqli_info = get_SQLi_data(body, original_body, url, "End of URL")
return xss_info, sqli_info
def test_referer_injection(original_body: str, request_URL: str, cookies: Cookies) -> VulnData:
""" Test the given URL for XSS via injection into the referer and
log the XSS if found """
body = requests.get(request_URL, headers={'referer': FULL_PAYLOAD}, cookies=cookies).text.lower()
xss_info = get_XSS_data(body, request_URL, "Referer")
sqli_info = get_SQLi_data(body, original_body, request_URL, "Referer")
return xss_info, sqli_info
def test_user_agent_injection(original_body: str, request_URL: str, cookies: Cookies) -> VulnData:
""" Test the given URL for XSS via injection into the user agent and
log the XSS if found """
body = requests.get(request_URL, headers={'User-Agent': FULL_PAYLOAD}, cookies=cookies).text.lower()
xss_info = get_XSS_data(body, request_URL, "User Agent")
sqli_info = get_SQLi_data(body, original_body, request_URL, "User Agent")
return xss_info, sqli_info
def test_query_injection(original_body: str, request_URL: str, cookies: Cookies):
""" Test the given URL for XSS via injection into URL queries and
log the XSS if found """
parsed_URL = urlparse(request_URL)
query_string = parsed_URL.query
# queries is a list of parameters where each parameter is set to the payload
queries = [query.split("=")[0] + "=" + FULL_PAYLOAD.decode('utf-8') for query in query_string.split("&")]
new_query_string = "&".join(queries)
new_URL = parsed_URL._replace(query=new_query_string).geturl()
body = requests.get(new_URL, cookies=cookies).text.lower()
xss_info = get_XSS_data(body, new_URL, "Query")
sqli_info = get_SQLi_data(body, original_body, new_URL, "Query")
return xss_info, sqli_info
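The query-rewriting step of ``test_query_injection`` can be isolated as a pure function over the URL. A minimal sketch (``inject_into_query`` is an illustrative helper; the payload here is a stand-in for ``FULL_PAYLOAD``):

```python
from urllib.parse import urlparse

def inject_into_query(request_url, payload):
    """Rebuild the query string with every parameter's value replaced
    by the payload, the same way test_query_injection does."""
    parsed = urlparse(request_url)
    queries = [q.split("=")[0] + "=" + payload for q in parsed.query.split("&")]
    return parsed._replace(query="&".join(queries)).geturl()
```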
def log_XSS_data(xss_info: Optional[XSSData]) -> None:
""" Log information about the given XSS to mitmproxy """
# If it is None, then there is no info to log
if not xss_info:
return
ctx.log.error("===== XSS Found ====")
ctx.log.error("XSS URL: %s" % xss_info.url)
ctx.log.error("Injection Point: %s" % xss_info.injection_point)
ctx.log.error("Suggested Exploit: %s" % xss_info.exploit)
ctx.log.error("Line: %s" % xss_info.line)
def log_SQLi_data(sqli_info: Optional[SQLiData]) -> None:
""" Log information about the given SQLi to mitmproxy """
if not sqli_info:
return
ctx.log.error("===== SQLi Found =====")
ctx.log.error("SQLi URL: %s" % sqli_info.url.decode('utf-8'))
ctx.log.error("Injection Point: %s" % sqli_info.injection_point.decode('utf-8'))
ctx.log.error("Regex used: %s" % sqli_info.regex.decode('utf-8'))
ctx.log.error("Suspected DBMS: %s" % sqli_info.dbms.decode('utf-8'))
def get_SQLi_data(new_body: str, original_body: str, request_URL: str, injection_point: str) -> Optional[SQLiData]:
    """ Return a SQLiData if there is a SQLi, otherwise return None
        String String URL String -> (SQLiData or None) """
# Regexes taken from Damn Small SQLi Scanner: https://github.com/stamparm/DSSS/blob/master/dsss.py#L17
DBMS_ERRORS = {
"MySQL": (r"SQL syntax.*MySQL", r"Warning.*mysql_.*", r"valid MySQL result", r"MySqlClient\."),
"PostgreSQL": (r"PostgreSQL.*ERROR", r"Warning.*\Wpg_.*", r"valid PostgreSQL result", r"Npgsql\."),
"Microsoft SQL Server": (r"Driver.* SQL[\-\_\ ]*Server", r"OLE DB.* SQL Server", r"(\W|\A)SQL Server.*Driver",
r"Warning.*mssql_.*", r"(\W|\A)SQL Server.*[0-9a-fA-F]{8}",
r"(?s)Exception.*\WSystem\.Data\.SqlClient\.", r"(?s)Exception.*\WRoadhouse\.Cms\."),
"Microsoft Access": (r"Microsoft Access Driver", r"JET Database Engine", r"Access Database Engine"),
"Oracle": (r"\bORA-[0-9][0-9][0-9][0-9]", r"Oracle error", r"Oracle.*Driver", r"Warning.*\Woci_.*", r"Warning.*\Wora_.*"),
"IBM DB2": (r"CLI Driver.*DB2", r"DB2 SQL error", r"\bdb2_\w+\("),
"SQLite": (r"SQLite/JDBCDriver", r"SQLite.Exception", r"System.Data.SQLite.SQLiteException", r"Warning.*sqlite_.*",
r"Warning.*SQLite3::", r"\[SQLITE_ERROR\]"),
"Sybase": (r"(?i)Warning.*sybase.*", r"Sybase message", r"Sybase.*Server message.*"),
}
for dbms, regexes in DBMS_ERRORS.items():
for regex in regexes:
if re.search(regex, new_body) and not re.search(regex, original_body):
return SQLiData(request_URL,
injection_point,
regex,
dbms)
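The baseline comparison above can be sanity-checked in isolation. The sketch below (a trimmed-down table with just two entries; the full regex set is in DBMS_ERRORS) shows why an error signature only counts when it appears in the injected response but not in the original one:

```python
import re

# Trimmed-down copy of two entries from the DBMS_ERRORS table.
ERRORS = {
    "MySQL": (r"SQL syntax.*MySQL", r"Warning.*mysql_.*"),
    "SQLite": (r"Warning.*sqlite_.*", r"\[SQLITE_ERROR\]"),
}


def detect_dbms(new_body: str, original_body: str):
    # An error only counts if the injection introduced it: it must match
    # the new body and NOT the original (baseline) body.
    for dbms, regexes in ERRORS.items():
        for regex in regexes:
            if re.search(regex, new_body) and not re.search(regex, original_body):
                return dbms
    return None


baseline = "<html>Welcome!</html>"
injected = "<html>You have an error in your SQL syntax; check the MySQL manual</html>"
print(detect_dbms(injected, baseline))  # MySQL
print(detect_dbms(baseline, baseline))  # None: no new error introduced
```

The second call demonstrates the point of passing original_body: a page that always mentions SQL errors does not produce a false positive.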
# A qc is either ' or "
def inside_quote(qc: str, substring: bytes, text_index: int, body: bytes) -> bool:
    """ Whether the text_index-th occurrence of substring in body falls
    inside quotes of the type given by qc """
    substring = substring.decode('utf-8')
    body = body.decode('utf-8')
    num_substrings_found = 0
    in_quote = False
    for index, char in enumerate(body):
        # Whether the next chunk of len(substring) chars is the substring
        next_part_is_substring = (
            (not (index + len(substring) > len(body))) and
            (body[index:index + len(substring)] == substring)
        )
        # Whether this char is NOT escaped with a preceding backslash
        is_not_escaped = (
            index == 0 or
            body[index - 1] != "\\"
        )
        if char == qc and is_not_escaped:
            in_quote = not in_quote
        if next_part_is_substring:
            if num_substrings_found == text_index:
                return in_quote
            num_substrings_found += 1
    return False
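To make the quote tracking concrete, here is a hypothetical standalone equivalent of inside_quote (operating directly on str for brevity) together with its behaviour on two small bodies:

```python
def nth_occurrence_in_quotes(qc: str, substring: str, n: int, body: str) -> bool:
    # Walk the body, toggling the quote state at each unescaped quote
    # character, and report the state at the nth occurrence of substring.
    in_quote = False
    found = 0
    for i, ch in enumerate(body):
        starts_substring = body[i:i + len(substring)] == substring
        escaped = i > 0 and body[i - 1] == "\\"
        if ch == qc and not escaped:
            in_quote = not in_quote
        if starts_substring:
            if found == n:
                return in_quote
            found += 1
    return False


print(nth_occurrence_in_quotes("'", "PAYLOAD", 0, "var t = 'PAYLOAD';"))  # True
print(nth_occurrence_in_quotes("'", "PAYLOAD", 0, "<div>PAYLOAD</div>"))  # False
```

The distinction matters because a payload reflected inside a quoted attribute or string needs a quote-breaking exploit, while an unquoted reflection does not.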
def paths_to_text(html: str, text: str) -> List[str]:
    """ Return list of tag paths to the given text in the given HTML tree
    - paths are collected in document order """
    def remove_last_occurrence_of_substring(string: str, substr: str) -> str:
        """ Delete the last occurrence of substr from string """
        index = string.rfind(substr)
        return string[:index] + string[index + len(substr):]

    class PathHTMLParser(HTMLParser):
        currentPath = ""
        paths = []

        def handle_starttag(self, tag, attrs):
            self.currentPath += ("/" + tag)

        def handle_endtag(self, tag):
            self.currentPath = remove_last_occurrence_of_substring(self.currentPath, "/" + tag)

        def handle_data(self, data):
            if text in data:
                self.paths.append(self.currentPath)

    parser = PathHTMLParser()
    parser.feed(html)
    return parser.paths
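A self-contained sketch of the same path-building idea (the names here are illustrative, not part of the script) shows the kind of paths handle_data records:

```python
from html.parser import HTMLParser


def tag_paths_to_text(html: str, needle: str):
    # Track the current tag path while parsing; record it whenever the
    # needle appears inside a text node.
    class PathParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.path = ""
            self.paths = []

        def handle_starttag(self, tag, attrs):
            self.path += "/" + tag

        def handle_endtag(self, tag):
            suffix = "/" + tag
            i = self.path.rfind(suffix)
            if i != -1:
                self.path = self.path[:i] + self.path[i + len(suffix):]

        def handle_data(self, data):
            if needle in data:
                self.paths.append(self.path)

    parser = PathParser()
    parser.feed(html)
    return parser.paths


print(tag_paths_to_text("<html><body><script>var x = 'needle';</script></body></html>", "needle"))
# ['/html/body/script']
```

This is what lets the scanner tell a payload reflected inside a &lt;script&gt; block apart from one reflected in plain HTML.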
def get_XSS_data(body: str, request_URL: str, injection_point: str) -> Optional[XSSData]:
    """ Return an XSSData if an XSS is detected, otherwise return None """
    def in_script(text: bytes, index: int, body: bytes) -> bool:
        """ Whether the index-th occurrence of text in body is inside
        a script tag """
        paths = paths_to_text(body.decode('utf-8'), text.decode("utf-8"))
        try:
            path = paths[index]
            return "script" in path
        except IndexError:
            return False

    def in_HTML(text: bytes, index: int, body: bytes) -> bool:
        """ Whether the index-th occurrence of text in body is inside the
        HTML but not inside a script tag or part of an HTML attribute """
        # if there is a < then the parser will interpret that as a tag, so only search for the stuff before it
        text = text.split(b"<")[0]
        paths = paths_to_text(body.decode('utf-8'), text.decode("utf-8"))
        try:
            path = paths[index]
            return "script" not in path
        except IndexError:
            return False

    def inject_javascript_handler(html: str) -> bool:
        """ Whether you can inject a Javascript:alert(0) as a link """
        class injectJSHandlerHTMLParser(HTMLParser):
            injectJSHandler = False

            def handle_starttag(self, tag, attrs):
                for name, value in attrs:
                    if name == "href" and value.startswith(FRONT_WALL.decode('utf-8')):
                        self.injectJSHandler = True

        parser = injectJSHandlerHTMLParser()
        parser.feed(html)
        return parser.injectJSHandler

    # Only convert the body to bytes if needed
    if isinstance(body, str):
        body = bytes(body, 'utf-8')
    # Regex for between 24 and 72 (aka 24*3) characters encapsulated by the walls
    regex = re.compile(b"""%s.{24,72}?%s""" % (FRONT_WALL, BACK_WALL))
    matches = regex.findall(body)
    for index, match in enumerate(matches):
        # Where the string is injected into the HTML
        # (fresh names so that the helper functions above are not shadowed)
        in_script_val = in_script(match, index, body)
        in_HTML_val = in_HTML(match, index, body)
        in_tag = not in_script_val and not in_HTML_val
        in_single_quotes = inside_quote("'", match, index, body)
        in_double_quotes = inside_quote('"', match, index, body)
        # Whether you can inject:
        inject_open_angle = b"ao<ac" in match  # open angle brackets
        inject_close_angle = b"ac>so" in match  # close angle brackets
        inject_single_quotes = b"s'd" in match  # single quotes
        inject_double_quotes = b'd"ao' in match  # double quotes
        inject_slash = b"sl/bsl" in match  # forward slashes
        inject_semi = b"se;sl" in match  # semicolons
        inject_equals = b"eq=" in match  # equals sign
        if in_script_val and inject_slash and inject_open_angle and inject_close_angle:  # e.g. <script>PAYLOAD</script>
            return XSSData(request_URL,
                           injection_point,
                           '</script><script>alert(0)</script><script>',
                           match.decode('utf-8'))
        elif in_script_val and in_single_quotes and inject_single_quotes and inject_semi:  # e.g. <script>t='PAYLOAD';</script>
            return XSSData(request_URL,
                           injection_point,
                           "';alert(0);g='",
                           match.decode('utf-8'))
        elif in_script_val and in_double_quotes and inject_double_quotes and inject_semi:  # e.g. <script>t="PAYLOAD";</script>
            return XSSData(request_URL,
                           injection_point,
                           '";alert(0);g="',
                           match.decode('utf-8'))
        elif in_tag and in_single_quotes and inject_single_quotes and inject_open_angle and inject_close_angle and inject_slash:
            # e.g. <a href='PAYLOAD'>Test</a>
            return XSSData(request_URL,
                           injection_point,
                           "'><script>alert(0)</script>",
                           match.decode('utf-8'))
        elif in_tag and in_double_quotes and inject_double_quotes and inject_open_angle and inject_close_angle and inject_slash:
            # e.g. <a href="PAYLOAD">Test</a>
            return XSSData(request_URL,
                           injection_point,
                           '"><script>alert(0)</script>',
                           match.decode('utf-8'))
        elif in_tag and not in_double_quotes and not in_single_quotes and inject_open_angle and inject_close_angle and inject_slash:
            # e.g. <a href=PAYLOAD>Test</a>
            return XSSData(request_URL,
                           injection_point,
                           '><script>alert(0)</script>',
                           match.decode('utf-8'))
        elif inject_javascript_handler(body.decode('utf-8')):  # e.g. <html><a href=PAYLOAD>Test</a>
            return XSSData(request_URL,
                           injection_point,
                           'Javascript:alert(0)',
                           match.decode('utf-8'))
        elif in_tag and in_double_quotes and inject_double_quotes and inject_equals:  # e.g. <a href="PAYLOAD">Test</a>
            return XSSData(request_URL,
                           injection_point,
                           '" onmouseover="alert(0)" t="',
                           match.decode('utf-8'))
        elif in_tag and in_single_quotes and inject_single_quotes and inject_equals:  # e.g. <a href='PAYLOAD'>Test</a>
            return XSSData(request_URL,
                           injection_point,
                           "' onmouseover='alert(0)' t='",
                           match.decode('utf-8'))
        elif in_tag and not in_single_quotes and not in_double_quotes and inject_equals:  # e.g. <a href=PAYLOAD>Test</a>
            return XSSData(request_URL,
                           injection_point,
                           " onmouseover=alert(0) t=",
                           match.decode('utf-8'))
        elif in_HTML_val and not in_script_val and inject_open_angle and inject_close_angle and inject_slash:  # e.g. <html>PAYLOAD</html>
            return XSSData(request_URL,
                           injection_point,
                           '<script>alert(0)</script>',
                           match.decode('utf-8'))
    return None
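The wall-delimited payload recovery above can be checked in isolation. In the sketch below the wall values are placeholders (the script defines the real FRONT_WALL/BACK_WALL constants near its top):

```python
import re

# Placeholder wall markers standing in for the script's real constants.
FRONT_WALL = b"969696fw"
BACK_WALL = b"969696bw"
# A payload-like filler between 24 and 72 bytes long, as the regex expects.
payload = b"s'd\"ao<ac>so[sb]sl/bslse;sleq=xx"

body = b"<html><a href=" + FRONT_WALL + payload + BACK_WALL + b">x</a></html>"
regex = re.compile(b"%s.{24,72}?%s" % (FRONT_WALL, BACK_WALL))
matches = regex.findall(body)
print(len(matches))       # 1: the reflected payload was recovered
match = matches[0]
# The substring probes then test which special characters survived filtering:
print(b"ao<ac" in match)  # True: open angle bracket was reflected intact
print(b'd"ao' in match)   # True: double quote was reflected intact
```

Each probe substring sandwiches one special character between known letters, so a server that strips or encodes that character breaks the probe and the corresponding inject_* flag stays False.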
# response is mitmproxy's entry point
def response(flow: http.HTTPFlow) -> None:
    cookiesDict = get_cookies(flow)
    # Example: http://xss.guru/unclaimedScriptTag.html
    find_unclaimed_URLs(flow.response.content, flow.request.url)
    results = test_end_of_URL_injection(flow.response.content.decode('utf-8'), flow.request.url, cookiesDict)
    log_XSS_data(results[0])
    log_SQLi_data(results[1])
    # Example: https://daviddworken.com/vulnerableReferer.php
    results = test_referer_injection(flow.response.content.decode('utf-8'), flow.request.url, cookiesDict)
    log_XSS_data(results[0])
    log_SQLi_data(results[1])
    # Example: https://daviddworken.com/vulnerableUA.php
    results = test_user_agent_injection(flow.response.content.decode('utf-8'), flow.request.url, cookiesDict)
    log_XSS_data(results[0])
    log_SQLi_data(results[1])
    if "?" in flow.request.url:
        # Example: https://daviddworken.com/vulnerable.php?name=
        results = test_query_injection(flow.response.content.decode('utf-8'), flow.request.url, cookiesDict)
        log_XSS_data(results[0])
        log_SQLi_data(results[1])
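Since response is the hook mitmproxy calls for every proxied response, the whole scanner is loaded as a script addon. A typical invocation (the file name is whatever path this script was saved under) looks like:

```shell
# Run mitmproxy's dump tool with the scanner loaded as a script addon;
# every response passing through the proxy is then tested for XSS and SQLi.
mitmdump -s xss_scanner.py
```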

View File

@@ -1,5 +1,2 @@
from mitmproxy import http
def response(flow: http.HTTPFlow) -> None:
def response(flow):
flow.response.headers["newheader"] = "foo"

View File

@@ -1,9 +1,7 @@
from mitmproxy import http
class AddHeader:
def response(self, flow: http.HTTPFlow) -> None:
def response(self, flow):
flow.response.headers["newheader"] = "foo"
addons = [AddHeader()]
def start():
return AddHeader()

View File

@@ -3,10 +3,6 @@ This example shows how one can add a custom contentview to mitmproxy.
The content view API is explained in the mitmproxy.contentviews module.
"""
from mitmproxy import contentviews
import typing
CVIEWSWAPCASE = typing.Tuple[str, typing.Iterable[typing.List[typing.Tuple[str, typing.AnyStr]]]]
class ViewSwapCase(contentviews.View):
@@ -17,14 +13,14 @@ class ViewSwapCase(contentviews.View):
prompt = ("swap case text", "z")
content_types = ["text/plain"]
def __call__(self, data: typing.AnyStr, **metadata) -> CVIEWSWAPCASE:
def __call__(self, data: bytes, **metadata):
return "case-swapped text", contentviews.format_text(data.swapcase())
view = ViewSwapCase()
def load(l):
def start():
contentviews.add(view)

View File

@@ -1,11 +0,0 @@
from mitmproxy import ctx
def load(l):
ctx.log.info("Registering option 'custom'")
l.add_option("custom", bool, False, "A custom option")
def configure(updated):
if "custom" in updated:
ctx.log.info("custom option value: %s" % ctx.options.custom)

View File

@@ -1,26 +1,23 @@
"""
This scripts demonstrates how to use mitmproxy's filter pattern in scripts.
Usage:
mitmdump -s "flowfilter.py FILTER"
"""
import sys
from mitmproxy import flowfilter
from mitmproxy import ctx, http
class Filter:
def __init__(self):
self.filter = None # type: flowfilter.TFilter
def __init__(self, spec):
self.filter = flowfilter.parse(spec)
def configure(self, updated):
self.filter = flowfilter.parse(ctx.options.flowfilter)
def load(self, l):
l.add_option(
"flowfilter", str, "", "Check that flow matches filter."
)
def response(self, flow: http.HTTPFlow) -> None:
def response(self, flow):
if flowfilter.match(self.filter, flow):
print("Flow matches filter:")
print(flow)
addons = [Filter()]
def start():
if len(sys.argv) != 2:
raise ValueError("Usage: -s 'filt.py FILTER'")
return Filter(sys.argv[1])

View File

@@ -1,15 +1,13 @@
#!/usr/bin/env python
# type: ignore
#
# Simple script showing how to read a mitmproxy dump file
#
from mitmproxy import io
from mitmproxy.exceptions import FlowReadException
import pprint
import sys
with open(sys.argv[1], "rb") as logfile:
freader = io.FlowReader(logfile)
pp = pprint.PrettyPrinter(indent=4)

View File

@@ -7,21 +7,23 @@ to multiple files in parallel.
"""
import random
import sys
from mitmproxy import io, http
import typing # noqa
from mitmproxy import io
class Writer:
def __init__(self, path: str) -> None:
def __init__(self, path):
if path == "-":
f = sys.stdout # type: typing.IO[typing.Any]
f = sys.stdout
else:
f = open(path, "wb")
self.w = io.FlowWriter(f)
def response(self, flow: http.HTTPFlow) -> None:
def response(self, flow):
if random.choice([True, False]):
self.w.add(flow)
addons = [Writer(sys.argv[1])]
def start():
if len(sys.argv) != 2:
raise ValueError('Usage: -s "flowriter.py filename"')
return Writer(sys.argv[1])

View File

@@ -7,6 +7,6 @@ If you want to help us out: https://github.com/mitmproxy/mitmproxy/issues/1530 :
from mitmproxy import ctx
def load(l):
def start():
ctx.log.info("This is some informative text.")
ctx.log.error("This is an error.")

View File

@@ -1,26 +1,29 @@
# Usage: mitmdump -s "iframe_injector.py url"
# (this script works best with --anticache)
import sys
from bs4 import BeautifulSoup
from mitmproxy import ctx, http
class Injector:
def load(self, loader):
loader.add_option(
"iframe", str, "", "IFrame to inject"
)
def __init__(self, iframe_url):
self.iframe_url = iframe_url
def response(self, flow: http.HTTPFlow) -> None:
if ctx.options.iframe:
html = BeautifulSoup(flow.response.content, "html.parser")
if html.body:
iframe = html.new_tag(
"iframe",
src=ctx.options.iframe,
frameborder=0,
height=0,
width=0)
html.body.insert(0, iframe)
flow.response.content = str(html).encode("utf8")
def response(self, flow):
if flow.request.host in self.iframe_url:
return
html = BeautifulSoup(flow.response.content, "html.parser")
if html.body:
iframe = html.new_tag(
"iframe",
src=self.iframe_url,
frameborder=0,
height=0,
width=0)
html.body.insert(0, iframe)
flow.response.content = str(html).encode("utf8")
addons = [Injector()]
def start():
if len(sys.argv) != 2:
raise ValueError('Usage: -s "iframe_injector.py url"')
return Injector(sys.argv[1])

View File

@@ -1,7 +1,4 @@
from mitmproxy import http
def request(flow: http.HTTPFlow) -> None:
def request(flow):
if flow.request.urlencoded_form:
# If there's already a form, one can just add items to the dict:
flow.request.urlencoded_form["mitmproxy"] = "rocks"

View File

@@ -1,5 +1,2 @@
from mitmproxy import http
def request(flow: http.HTTPFlow) -> None:
def request(flow):
flow.request.query["mitmproxy"] = "rocks"

View File

@@ -1,10 +1,9 @@
"""
This example shows two ways to redirect flows to another server.
"""
from mitmproxy import http
def request(flow: http.HTTPFlow) -> None:
def request(flow):
# pretty_host takes the "Host" header of the request into account,
# which is useful in transparent mode where we usually only have the IP
# otherwise.

View File

@@ -0,0 +1,17 @@
import argparse
class Replacer:
def __init__(self, src, dst):
self.src, self.dst = src, dst
def response(self, flow):
flow.response.replace(self.src, self.dst)
def start():
parser = argparse.ArgumentParser()
parser.add_argument("src", type=str)
parser.add_argument("dst", type=str)
args = parser.parse_args()
return Replacer(args.src, args.dst)

View File

@@ -5,7 +5,7 @@ without sending any data to the remote server.
from mitmproxy import http
def request(flow: http.HTTPFlow) -> None:
def request(flow):
# pretty_url takes the "Host" header of the request into account, which
# is useful in transparent mode where we usually only have the IP otherwise.

View File

@@ -2,11 +2,11 @@
This script rotates all images passing through the proxy by 180 degrees.
"""
import io
from PIL import Image
from mitmproxy import http
def response(flow: http.HTTPFlow) -> None:
def response(flow):
if flow.response.headers.get("content-type", "").startswith("image"):
s = io.BytesIO(flow.response.content)
img = Image.open(s).rotate(180)

View File

@@ -10,14 +10,14 @@ app = Flask("proxapp")
@app.route('/')
def hello_world() -> str:
def hello_world():
return 'Hello World!'
def load(l):
# Host app at the magic domain "proxapp.local" on port 80. Requests to this
def start():
# Host app at the magic domain "proxapp" on port 80. Requests to this
# domain and port combination will now be routed to the WSGI app instance.
return wsgiapp.WSGIApp(app, "proxapp.local", 80)
return wsgiapp.WSGIApp(app, "proxapp", 80)
# SSL works too, but the magic domain needs to be resolvable from the mitmproxy machine due to mitmproxy's design.
# mitmproxy will connect to said domain and use serve its certificate (unless --no-upstream-cert is set)

View File

@@ -1,13 +1,4 @@
import typing
import traceback
import contextlib
import sys
from mitmproxy import exceptions
from mitmproxy import eventsequence
from mitmproxy import controller
from mitmproxy import flow
from . import ctx
import pprint
@@ -15,115 +6,18 @@ def _get_name(itm):
return getattr(itm, "name", itm.__class__.__name__.lower())
def cut_traceback(tb, func_name):
"""
Cut off a traceback at the function with the given name.
The func_name's frame is excluded.
Args:
tb: traceback object, as returned by sys.exc_info()[2]
func_name: function name
Returns:
Reduced traceback.
"""
tb_orig = tb
for _, _, fname, _ in traceback.extract_tb(tb):
tb = tb.tb_next
if fname == func_name:
break
return tb or tb_orig
class StreamLog:
"""
A class for redirecting output using contextlib.
"""
def __init__(self, log):
self.log = log
def write(self, buf):
if buf.strip():
self.log(buf)
def flush(self): # pragma: no cover
# Click uses flush sometimes, so we dummy it up
pass
@contextlib.contextmanager
def safecall():
stdout_replacement = StreamLog(ctx.log.warn)
try:
with contextlib.redirect_stdout(stdout_replacement):
yield
except (exceptions.AddonHalt, exceptions.OptionsError):
raise
except Exception as e:
etype, value, tb = sys.exc_info()
tb = cut_traceback(tb, "invoke_addon").tb_next
ctx.log.error(
"Addon error: %s" % "".join(
traceback.format_exception(etype, value, tb)
)
)
class Loader:
"""
A loader object is passed to the load() event when addons start up.
"""
def __init__(self, master):
self.master = master
def add_option(
self,
name: str,
typespec: type,
default: typing.Any,
help: str,
choices: typing.Optional[typing.Sequence[str]] = None
) -> None:
if name in self.master.options:
ctx.log.warn("Over-riding existing option %s" % name)
self.master.options.add_option(
name,
typespec,
default,
help,
choices
)
def add_command(self, path: str, func: typing.Callable) -> None:
self.master.commands.add(path, func)
def traverse(chain):
"""
Recursively traverse an addon chain.
"""
for a in chain:
yield a
if hasattr(a, "addons"):
yield from traverse(a.addons)
class AddonManager:
def __init__(self, master):
self.lookup = {}
self.chain = []
self.master = master
master.options.changed.connect(self._configure_all)
def _configure_all(self, options, updated):
self.trigger("configure", updated)
master.options.changed.connect(self._options_update)
def clear(self):
"""
Remove all addons.
"""
for i in self.chain:
self.remove(i)
self.done()
self.chain = []
def get(self, name):
"""
@@ -131,58 +25,45 @@ class AddonManager:
attribute on the instance, or the lower case class name if that
does not exist.
"""
return self.lookup.get(name, None)
for i in self.chain:
if name == _get_name(i):
return i
def register(self, addon):
"""
Register an addon, call its load event, and then register all its
sub-addons. This should be used by addons that dynamically manage
addons.
def _options_update(self, options, updated):
for i in self.chain:
with self.master.handlecontext():
self.invoke_with_context(i, "configure", options, updated)
If the calling addon is already running, it should follow with
running and configure events. Must be called within a current
context.
def startup(self, s):
"""
for a in traverse([addon]):
name = _get_name(a)
if name in self.lookup:
raise exceptions.AddonManagerError(
"An addon called '%s' already exists." % name
)
l = Loader(self.master)
self.invoke_addon(addon, "load", l)
for a in traverse([addon]):
name = _get_name(a)
self.lookup[name] = a
for a in traverse([addon]):
self.master.commands.collect_commands(a)
return addon
Run startup events on addon.
"""
self.invoke_with_context(s, "start")
self.invoke_with_context(
s,
"configure",
self.master.options,
self.master.options.keys()
)
def add(self, *addons):
"""
Add addons to the end of the chain, and run their load event.
If any addon has sub-addons, they are registered.
Add addons to the end of the chain, and run their startup events.
"""
with self.master.handlecontext():
for i in addons:
self.chain.append(self.register(i))
self.chain.extend(addons)
for i in addons:
self.startup(i)
def remove(self, addon):
"""
Remove an addon and all its sub-addons.
If the addon is not in the chain - that is, if it's managed by a
parent addon - it's the parent's responsibility to remove it from
its own addons attribute.
Remove an addon from the chain, and run its done events.
"""
for a in traverse([addon]):
n = _get_name(a)
if n not in self.lookup:
raise exceptions.AddonManagerError("No such addon: %s" % n)
self.chain = [i for i in self.chain if i is not a]
del self.lookup[_get_name(a)]
with self.master.handlecontext():
self.invoke_addon(a, "done")
self.chain = [i for i in self.chain if i is not addon]
self.invoke_with_context(addon, "done")
def done(self):
for i in self.chain:
self.invoke_with_context(i, "done")
def __len__(self):
return len(self.chain)
@@ -190,59 +71,22 @@ class AddonManager:
def __str__(self):
return pprint.pformat([str(i) for i in self.chain])
def handle_lifecycle(self, name, message):
"""
Handle a lifecycle event.
"""
if not hasattr(message, "reply"): # pragma: no cover
raise exceptions.ControlException(
"Message %s has no reply attribute" % message
)
# We can use DummyReply objects multiple times. We only clear them up on
# the next handler so that we can access value and state in the
# meantime.
if isinstance(message.reply, controller.DummyReply):
message.reply.reset()
self.trigger(name, message)
if message.reply.state != "taken":
message.reply.take()
if not message.reply.has_message:
message.reply.ack()
message.reply.commit()
if isinstance(message.reply, controller.DummyReply):
message.reply.mark_reset()
if isinstance(message, flow.Flow):
self.trigger("update", [message])
def invoke_addon(self, addon, name, *args, **kwargs):
"""
Invoke an event on an addon and all its children. This method must
run within an established handler context.
"""
if name not in eventsequence.Events:
name = "event_" + name
for a in traverse([addon]):
func = getattr(a, name, None)
if func:
if not callable(func):
raise exceptions.AddonManagerError(
"Addon handler %s not callable" % name
)
func(*args, **kwargs)
def trigger(self, name, *args, **kwargs):
"""
Establish a handler context and trigger an event across all addons
"""
def invoke_with_context(self, addon, name, *args, **kwargs):
with self.master.handlecontext():
for i in self.chain:
try:
with safecall():
self.invoke_addon(i, name, *args, **kwargs)
except exceptions.AddonHalt:
return
self.invoke(addon, name, *args, **kwargs)
def invoke(self, addon, name, *args, **kwargs):
func = getattr(addon, name, None)
if func:
if not callable(func):
raise exceptions.AddonError(
"Addon handler %s not callable" % name
)
func(*args, **kwargs)
def __call__(self, name, *args, **kwargs):
for i in self.chain:
try:
self.invoke(i, name, *args, **kwargs)
except exceptions.AddonHalt:
return

View File

@@ -3,11 +3,7 @@ from mitmproxy.addons import anticomp
from mitmproxy.addons import check_alpn
from mitmproxy.addons import check_ca
from mitmproxy.addons import clientplayback
from mitmproxy.addons import core_option_validation
from mitmproxy.addons import core
from mitmproxy.addons import cut
from mitmproxy.addons import disable_h2c
from mitmproxy.addons import export
from mitmproxy.addons import disable_h2c_upgrade
from mitmproxy.addons import onboarding
from mitmproxy.addons import proxyauth
from mitmproxy.addons import replace
@@ -17,31 +13,28 @@ from mitmproxy.addons import setheaders
from mitmproxy.addons import stickyauth
from mitmproxy.addons import stickycookie
from mitmproxy.addons import streambodies
from mitmproxy.addons import save
from mitmproxy.addons import streamfile
from mitmproxy.addons import upstream_auth
def default_addons():
return [
core.Core(),
core_option_validation.CoreOptionValidation(),
anticache.AntiCache(),
anticomp.AntiComp(),
check_alpn.CheckALPN(),
check_ca.CheckCA(),
clientplayback.ClientPlayback(),
cut.Cut(),
disable_h2c.DisableH2C(),
export.Export(),
disable_h2c_upgrade.DisableH2CleartextUpgrade(),
onboarding.Onboarding(),
proxyauth.ProxyAuth(),
replace.Replace(),
replace.ReplaceFile(),
script.ScriptLoader(),
serverplayback.ServerPlayback(),
setheaders.SetHeaders(),
stickyauth.StickyAuth(),
stickycookie.StickyCookie(),
streambodies.StreamBodies(),
save.Save(),
streamfile.StreamFile(),
upstream_auth.UpstreamAuth(),
]

View File

@@ -1,7 +1,10 @@
from mitmproxy import ctx
class AntiCache:
def __init__(self):
self.enabled = False
def configure(self, options, updated):
self.enabled = options.anticache
def request(self, flow):
if ctx.options.anticache:
if self.enabled:
flow.request.anticache()

View File

@@ -1,7 +1,10 @@
from mitmproxy import ctx
class AntiComp:
def __init__(self):
self.enabled = False
def configure(self, options, updated):
self.enabled = options.anticomp
def request(self, flow):
if ctx.options.anticomp:
if self.enabled:
flow.request.anticomp()

View File

@@ -1,17 +1,17 @@
import mitmproxy
from mitmproxy.net import tcp
from mitmproxy import ctx
class CheckALPN:
def __init__(self):
self.failed = False
def configure(self, updated):
def configure(self, options, updated):
self.failed = mitmproxy.ctx.master.options.http2 and not tcp.HAS_ALPN
if self.failed:
ctx.log.warn(
mitmproxy.ctx.master.add_log(
"HTTP/2 is disabled because ALPN support missing!\n"
"OpenSSL 1.0.2+ required to support HTTP/2 connections.\n"
"Use --no-http2 to silence this warning."
"Use --no-http2 to silence this warning.",
"warn",
)

View File

@@ -5,7 +5,7 @@ class CheckCA:
def __init__(self):
self.failed = False
def configure(self, updated):
def configure(self, options, updated):
has_ca = (
mitmproxy.ctx.master.server and
mitmproxy.ctx.master.server.config and

View File

@@ -2,7 +2,6 @@ from mitmproxy import exceptions
from mitmproxy import ctx
from mitmproxy import io
from mitmproxy import flow
from mitmproxy import command
import typing
@@ -11,56 +10,36 @@ class ClientPlayback:
def __init__(self):
self.flows = None
self.current_thread = None
self.keepserving = False
self.has_replayed = False
self.configured = False
def count(self) -> int:
if self.flows:
return len(self.flows)
return 0
@command.command("replay.client.stop")
def stop_replay(self) -> None:
"""
Stop client replay.
"""
self.flows = []
ctx.master.addons.trigger("update", [])
@command.command("replay.client")
def start_replay(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Replay requests from flows.
"""
self.flows = flows
ctx.master.addons.trigger("update", [])
@command.command("replay.client.file")
def load_file(self, path: str) -> None:
try:
flows = io.read_flows_from_paths([path])
except exceptions.FlowReadException as e:
raise exceptions.CommandError(str(e))
def load(self, flows: typing.Sequence[flow.Flow]):
self.flows = flows
def configure(self, updated):
if not self.configured and ctx.options.client_replay:
self.configured = True
ctx.log.info("Client Replay: {}".format(ctx.options.client_replay))
try:
flows = io.read_flows_from_paths(ctx.options.client_replay)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(str(e))
self.start_replay(flows)
def configure(self, options, updated):
if "client_replay" in updated:
if options.client_replay:
ctx.log.info("Client Replay: {}".format(options.client_replay))
try:
flows = io.read_flows_from_paths(options.client_replay)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(str(e))
self.load(flows)
else:
self.flows = None
self.keepserving = options.keepserving
def tick(self):
if self.current_thread and not self.current_thread.is_alive():
self.current_thread = None
if self.flows and not self.current_thread:
f = self.flows.pop(0)
self.current_thread = ctx.master.replay_request(f)
ctx.master.addons.trigger("update", [f])
self.current_thread = ctx.master.replay_request(self.flows.pop(0))
self.has_replayed = True
if self.has_replayed:
if not self.flows and not self.current_thread:
ctx.master.addons.trigger("processing_complete")
if not self.flows and not self.current_thread and not self.keepserving:
ctx.master.shutdown()

View File

@@ -1,259 +0,0 @@
import typing
from mitmproxy import ctx
from mitmproxy import exceptions
from mitmproxy import command
from mitmproxy import flow
from mitmproxy import optmanager
from mitmproxy.net.http import status_codes
class Core:
@command.command("set")
def set(self, spec: str) -> None:
"""
Set an option of the form "key[=value]". When the value is omitted,
booleans are set to true, strings and integers are set to None (if
permitted), and sequences are emptied. Boolean values can be true,
false or toggle.
"""
try:
ctx.options.set(spec)
except exceptions.OptionsError as e:
raise exceptions.CommandError(e) from e
@command.command("flow.resume")
def resume(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Resume flows if they are intercepted.
"""
intercepted = [i for i in flows if i.intercepted]
for f in intercepted:
f.resume()
ctx.master.addons.trigger("update", intercepted)
# FIXME: this will become view.mark later
@command.command("flow.mark")
def mark(self, flows: typing.Sequence[flow.Flow], val: bool) -> None:
"""
Mark flows.
"""
updated = []
for i in flows:
if i.marked != val:
i.marked = val
updated.append(i)
ctx.master.addons.trigger("update", updated)
# FIXME: this will become view.mark.toggle later
@command.command("flow.mark.toggle")
def mark_toggle(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Toggle mark for flows.
"""
for i in flows:
i.marked = not i.marked
ctx.master.addons.trigger("update", flows)
@command.command("flow.kill")
def kill(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Kill running flows.
"""
updated = []
for f in flows:
if f.killable:
f.kill()
updated.append(f)
ctx.log.alert("Killed %s flows." % len(updated))
ctx.master.addons.trigger("update", updated)
# FIXME: this will become view.revert later
@command.command("flow.revert")
def revert(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Revert flow changes.
"""
updated = []
for f in flows:
if f.modified():
f.revert()
updated.append(f)
ctx.log.alert("Reverted %s flows." % len(updated))
ctx.master.addons.trigger("update", updated)
@command.command("flow.set.options")
def flow_set_options(self) -> typing.Sequence[str]:
return [
"host",
"status_code",
"method",
"path",
"url",
"reason",
]
@command.command("flow.set")
def flow_set(
self,
flows: typing.Sequence[flow.Flow], spec: str, sval: str
) -> None:
"""
Quickly set a number of common values on flows.
"""
opts = self.flow_set_options()
if spec not in opts:
raise exceptions.CommandError(
"Set spec must be one of: %s." % ", ".join(opts)
)
val = sval # type: typing.Union[int, str]
if spec == "status_code":
try:
val = int(val)
except ValueError as v:
raise exceptions.CommandError(
"Status code is not an integer: %s" % val
) from v
updated = []
for f in flows:
req = getattr(f, "request", None)
rupdate = True
if req:
if spec == "method":
req.method = val
elif spec == "host":
req.host = val
elif spec == "path":
req.path = val
elif spec == "url":
try:
req.url = val
except ValueError as e:
raise exceptions.CommandError(
"URL %s is invalid: %s" % (repr(val), e)
) from e
else:
self.rupdate = False
resp = getattr(f, "response", None)
supdate = True
if resp:
if spec == "status_code":
resp.status_code = val
if val in status_codes.RESPONSES:
resp.reason = status_codes.RESPONSES[int(val)]
elif spec == "reason":
resp.reason = val
else:
supdate = False
if rupdate or supdate:
updated.append(f)
ctx.master.addons.trigger("update", updated)
ctx.log.alert("Set %s on %s flows." % (spec, len(updated)))
@command.command("flow.decode")
def decode(self, flows: typing.Sequence[flow.Flow], part: str) -> None:
"""
Decode flows.
"""
updated = []
for f in flows:
p = getattr(f, part, None)
if p:
p.decode()
updated.append(f)
ctx.master.addons.trigger("update", updated)
ctx.log.alert("Decoded %s flows." % len(updated))
@command.command("flow.encode.toggle")
def encode_toggle(self, flows: typing.Sequence[flow.Flow], part: str) -> None:
"""
Toggle flow encoding on and off, using deflate for encoding.
"""
updated = []
for f in flows:
p = getattr(f, part, None)
if p:
current_enc = p.headers.get("content-encoding", "identity")
if current_enc == "identity":
p.encode("deflate")
else:
p.decode()
updated.append(f)
ctx.master.addons.trigger("update", updated)
ctx.log.alert("Toggled encoding on %s flows." % len(updated))
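The toggle above compresses unencoded bodies with deflate and decodes everything else. As a minimal standalone sketch (using `zlib` directly, rather than mitmproxy's `net.http.encoding` codecs), the round trip looks like this:

```python
import zlib


def toggle_deflate(body: bytes, content_encoding: str) -> tuple:
    """Toggle between identity and deflate, mirroring flow.encode.toggle:
    unencoded bodies are compressed, anything else is decoded.
    HTTP "deflate" is the zlib format, so zlib.compress/decompress apply."""
    if content_encoding == "identity":
        return zlib.compress(body), "deflate"
    return zlib.decompress(body), "identity"


# Toggling twice returns the original body.
body, enc = toggle_deflate(b"hello world", "identity")
body, enc = toggle_deflate(body, enc)
assert (body, enc) == (b"hello world", "identity")
```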
@command.command("flow.encode")
def encode(self, flows: typing.Sequence[flow.Flow], part: str, enc: str) -> None:
"""
Encode flows with a specified encoding.
"""
if enc not in self.encode_options():
raise exceptions.CommandError("Invalid encoding format: %s" % enc)
updated = []
for f in flows:
p = getattr(f, part, None)
if p:
current_enc = p.headers.get("content-encoding", "identity")
if current_enc == "identity":
p.encode(enc)
updated.append(f)
ctx.master.addons.trigger("update", updated)
ctx.log.alert("Encoded %s flows." % len(updated))
@command.command("flow.encode.options")
def encode_options(self) -> typing.Sequence[str]:
"""
The possible values for an encoding specification.
"""
return ["gzip", "deflate", "br"]
@command.command("options.load")
def options_load(self, path: str) -> None:
"""
Load options from a file.
"""
try:
optmanager.load_paths(ctx.options, path)
except (OSError, exceptions.OptionsError) as e:
raise exceptions.CommandError(
"Could not load options - %s" % e
) from e
@command.command("options.save")
def options_save(self, path: str) -> None:
"""
Save options to a file.
"""
try:
optmanager.save(ctx.options, path)
except OSError as e:
raise exceptions.CommandError(
"Could not save options - %s" % e
) from e
@command.command("options.reset")
def options_reset(self) -> None:
"""
Reset all options to defaults.
"""
ctx.options.reset()
@command.command("options.reset.one")
def options_reset_one(self, name: str) -> None:
"""
Reset one option to its default value.
"""
if name not in ctx.options:
raise exceptions.CommandError("No such option: %s" % name)
setattr(
ctx.options,
name,
ctx.options.default(name),
)

View File

@@ -1,47 +0,0 @@
"""
The core addon is responsible for verifying core settings that are not
checked by other addons.
"""
from mitmproxy import exceptions
from mitmproxy import platform
from mitmproxy import ctx
from mitmproxy.net import server_spec
from mitmproxy.utils import human
class CoreOptionValidation:
def configure(self, updated):
opts = ctx.options
if opts.add_upstream_certs_to_client_chain and not opts.upstream_cert:
raise exceptions.OptionsError(
"The no-upstream-cert and add-upstream-certs-to-client-chain "
"options are mutually exclusive. If no-upstream-cert is enabled "
"then the upstream certificate is not retrieved before generating "
"the client certificate chain."
)
if "body_size_limit" in updated and opts.body_size_limit:
try:
opts._processed["body_size_limit"] = human.parse_size(
opts.body_size_limit
)
except ValueError as e:
raise exceptions.OptionsError(
"Invalid body size limit specification: %s" %
opts.body_size_limit
)
if "mode" in updated:
mode = opts.mode
if mode.startswith("reverse:") or mode.startswith("upstream:"):
try:
server_spec.parse_with_mode(mode)
except ValueError as e:
raise exceptions.OptionsError(str(e)) from e
elif mode == "transparent":
if not platform.original_addr:
raise exceptions.OptionsError(
"Transparent mode not supported on this platform."
)
elif mode not in ["regular", "socks5"]:
raise exceptions.OptionsError(
"Invalid mode specification: %s" % mode
)
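The `body_size_limit` option above is parsed with `human.parse_size`, which accepts human-readable size strings. This is a rough standalone sketch of such a helper — the accepted suffixes and semantics are an assumption, not mitmproxy's exact implementation:

```python
def parse_size(s: str) -> int:
    """Approximate sketch of a human.parse_size-style helper: accepts a
    plain byte count or a k/m/g suffix (assumed semantics, for
    illustration only)."""
    multipliers = {"k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}
    s = s.strip().lower()
    if s and s[-1] in multipliers:
        return int(s[:-1]) * multipliers[s[-1]]
    return int(s)


assert parse_size("50k") == 50 * 1024
assert parse_size("3m") == 3 * 1024 ** 2
```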

View File

@@ -1,151 +0,0 @@
import io
import csv
import typing
from mitmproxy import command
from mitmproxy import exceptions
from mitmproxy import flow
from mitmproxy import ctx
from mitmproxy import certs
from mitmproxy.utils import strutils
import pyperclip
def headername(spec: str):
if not (spec.startswith("header[") and spec.endswith("]")):
raise exceptions.CommandError("Invalid header spec: %s" % spec)
return spec[len("header["):-1].strip()
flow_shortcuts = {
"q": "request",
"s": "response",
"cc": "client_conn",
"sc": "server_conn",
}
def is_addr(v):
return isinstance(v, tuple) and len(v) > 1
def extract(cut: str, f: flow.Flow) -> typing.Union[str, bytes]:
path = cut.split(".")
current = f # type: typing.Any
for i, spec in enumerate(path):
if spec.startswith("_"):
raise exceptions.CommandError("Can't access internal attribute %s" % spec)
if isinstance(current, flow.Flow):
spec = flow_shortcuts.get(spec, spec)
part = getattr(current, spec, None)
if i == len(path) - 1:
if spec == "port" and is_addr(current):
return str(current[1])
if spec == "host" and is_addr(current):
return str(current[0])
elif spec.startswith("header["):
return current.headers.get(headername(spec), "")
elif isinstance(part, bytes):
return part
elif isinstance(part, bool):
return "true" if part else "false"
elif isinstance(part, certs.SSLCert):
return part.to_pem().decode("ascii")
current = part
return str(current or "")
def parse_cutspec(s: str) -> typing.Tuple[str, typing.Sequence[str]]:
"""
Returns (flowspec, [cuts]).
Raises exceptions.CommandError if input is invalid.
"""
parts = s.split("|", maxsplit=1)
flowspec = "@all"
if len(parts) == 2:
flowspec = parts[1].strip()
cuts = parts[0]
cutparts = [i.strip() for i in cuts.split(",") if i.strip()]
if len(cutparts) == 0:
raise exceptions.CommandError("Invalid cut specification.")
return flowspec, cutparts
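The parsing above splits a `"cuts|flowspec"` string, defaulting the flowspec to `@all`. A standalone copy of that logic (with a local exception class replacing `exceptions.CommandError`) behaves like this:

```python
import typing


class CommandError(Exception):
    """Local stand-in for mitmproxy.exceptions.CommandError."""


def parse_cutspec(s: str) -> typing.Tuple[str, typing.Sequence[str]]:
    """Standalone copy of the parsing logic above: "cuts|flowspec",
    with the flowspec defaulting to @all."""
    parts = s.split("|", maxsplit=1)
    flowspec = "@all"
    if len(parts) == 2:
        flowspec = parts[1].strip()
    cutparts = [i.strip() for i in parts[0].split(",") if i.strip()]
    if not cutparts:
        raise CommandError("Invalid cut specification.")
    return flowspec, cutparts


assert parse_cutspec("q.method, s.status_code | ~d example.com") == \
    ("~d example.com", ["q.method", "s.status_code"])
assert parse_cutspec("q.url") == ("@all", ["q.url"])
```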
class Cut:
@command.command("cut")
def cut(self, cutspec: str) -> command.Cuts:
"""
Resolve a cut specification of the form "cuts|flowspec". The cuts
are a comma-separated list of cut snippets. Cut snippets are
attribute paths from the base of the flow object, with a few
conveniences - "q", "s", "cc" and "sc" are shortcuts for request,
response, client_conn and server_conn, "port" and "host" retrieve
parts of an address tuple, ".header[key]" retrieves a header value.
        Return values are converted sensibly: SSL certificates are converted to PEM
format, bools are "true" or "false", "bytes" are preserved, and all
other values are converted to strings. The flowspec is optional, and
if it is not specified, it is assumed to be @all.
"""
flowspec, cuts = parse_cutspec(cutspec)
flows = ctx.master.commands.call_args("view.resolve", [flowspec])
ret = []
for f in flows:
ret.append([extract(c, f) for c in cuts])
return ret
@command.command("cut.save")
def save(self, cuts: command.Cuts, path: str) -> None:
"""
Save cuts to file. If there are multiple rows or columns, the format
is UTF-8 encoded CSV. If there is exactly one row and one column,
the data is written to file as-is, with raw bytes preserved. If the
path is prefixed with a "+", values are appended if there is an
existing file.
"""
append = False
if path.startswith("+"):
append = True
path = path[1:]
if len(cuts) == 1 and len(cuts[0]) == 1:
with open(path, "ab" if append else "wb") as fp:
if fp.tell() > 0:
# We're appending to a file that already exists and has content
fp.write(b"\n")
v = cuts[0][0]
if isinstance(v, bytes):
fp.write(v)
else:
fp.write(v.encode("utf8"))
ctx.log.alert("Saved single cut.")
else:
with open(path, "a" if append else "w", newline='', encoding="utf8") as fp:
writer = csv.writer(fp)
for r in cuts:
writer.writerow(
[strutils.always_str(c) or "" for c in r] # type: ignore
)
ctx.log.alert("Saved %s cuts as CSV." % len(cuts))
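The multi-row branch above writes each cut row as one CSV record, converting bytes to str first. A minimal sketch of that formatting step, with a simplified stand-in for `strutils.always_str`:

```python
import csv
import io


def cuts_to_csv(cuts) -> str:
    """Sketch of the multi-row branch of cut.save/cut.clip: each row of
    cuts becomes one CSV record; bytes are decoded to str first (a
    simplified stand-in for strutils.always_str)."""
    fp = io.StringIO(newline="")
    writer = csv.writer(fp)
    for row in cuts:
        writer.writerow(
            [c.decode("utf8", "replace") if isinstance(c, bytes) else (c or "")
             for c in row]
        )
    return fp.getvalue()


out = cuts_to_csv([["GET", "200"], [b"POST", "404"]])
assert out.splitlines() == ["GET,200", "POST,404"]
```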
@command.command("cut.clip")
def clip(self, cuts: command.Cuts) -> None:
"""
Send cuts to the system clipboard.
"""
fp = io.StringIO(newline="")
if len(cuts) == 1 and len(cuts[0]) == 1:
v = cuts[0][0]
if isinstance(v, bytes):
fp.write(strutils.always_str(v))
else:
                fp.write(v)
ctx.log.alert("Clipped single cut.")
else:
writer = csv.writer(fp)
for r in cuts:
writer.writerow(
[strutils.always_str(c) or "" for c in r] # type: ignore
)
ctx.log.alert("Clipped %s cuts as CSV." % len(cuts))
pyperclip.copy(fp.getvalue())

View File

@@ -1,38 +0,0 @@
import mitmproxy
class DisableH2C:
"""
We currently only support HTTP/2 over a TLS connection.
Some clients try to upgrade a connection from HTTP/1.1 to h2c. We need to
    remove those headers to avoid protocol errors if one endpoint suddenly
starts sending HTTP/2 frames.
Some clients might use HTTP/2 Prior Knowledge to directly initiate a session
by sending the connection preface. We just kill those flows.
"""
def process_flow(self, f):
if f.request.headers.get('upgrade', '') == 'h2c':
mitmproxy.ctx.log.warn("HTTP/2 cleartext connections (h2c upgrade requests) are currently not supported.")
del f.request.headers['upgrade']
if 'connection' in f.request.headers:
del f.request.headers['connection']
if 'http2-settings' in f.request.headers:
del f.request.headers['http2-settings']
is_connection_preface = (
f.request.method == 'PRI' and
f.request.path == '*' and
f.request.http_version == 'HTTP/2.0'
)
if is_connection_preface:
f.kill()
            mitmproxy.ctx.log.warn("Initiating HTTP/2 connections with prior knowledge is currently not supported.")
# Handlers
def request(self, f):
self.process_flow(f)
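The header scrubbing in `process_flow` above amounts to dropping the h2c upgrade negotiation headers when an upgrade is requested. A standalone sketch on a plain dict (mitmproxy itself operates on `f.request.headers`):

```python
def strip_h2c_upgrade(headers: dict) -> dict:
    """Standalone sketch of DisableH2C.process_flow's header scrubbing:
    drop the h2c upgrade negotiation headers from a plain dict."""
    if headers.get("upgrade", "") == "h2c":
        for name in ("upgrade", "connection", "http2-settings"):
            headers.pop(name, None)
    return headers


h = {"upgrade": "h2c", "connection": "Upgrade, HTTP2-Settings",
     "http2-settings": "AAMAAABk", "host": "example.com"}
assert strip_h2c_upgrade(h) == {"host": "example.com"}
```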

View File

@@ -0,0 +1,21 @@
class DisableH2CleartextUpgrade:
"""
We currently only support HTTP/2 over a TLS connection. Some clients try
to upgrade a connection from HTTP/1.1 to h2c, so we need to remove those
    headers to avoid protocol errors if one endpoint suddenly starts sending
HTTP/2 frames.
"""
def process_flow(self, f):
if f.request.headers.get('upgrade', '') == 'h2c':
del f.request.headers['upgrade']
if 'connection' in f.request.headers:
del f.request.headers['connection']
if 'http2-settings' in f.request.headers:
del f.request.headers['http2-settings']
# Handlers
def request(self, f):
self.process_flow(f)

View File

@@ -29,18 +29,24 @@ def colorful(line, styles):
class Dumper:
def __init__(self, outfile=sys.stdout):
self.filter = None # type: flowfilter.TFilter
self.flow_detail = None # type: int
self.outfp = outfile # type: typing.io.TextIO
self.showhost = None # type: bool
self.default_contentview = "auto" # type: str
def configure(self, updated):
if "view_filter" in updated:
if ctx.options.view_filter:
self.filter = flowfilter.parse(ctx.options.view_filter)
def configure(self, options, updated):
if "filtstr" in updated:
if options.filtstr:
self.filter = flowfilter.parse(options.filtstr)
if not self.filter:
raise exceptions.OptionsError(
"Invalid filter expression: %s" % ctx.options.view_filter
"Invalid filter expression: %s" % options.filtstr
)
else:
self.filter = None
self.flow_detail = options.flow_detail
self.showhost = options.showhost
self.default_contentview = options.default_contentview
def echo(self, text, ident=None, **style):
if ident:
@@ -61,13 +67,13 @@ class Dumper:
def _echo_message(self, message):
_, lines, error = contentviews.get_message_content_view(
ctx.options.default_contentview,
self.default_contentview,
message
)
if error:
ctx.log.debug(error)
if ctx.options.flow_detail == 3:
if self.flow_detail == 3:
lines_to_echo = itertools.islice(lines, 70)
else:
lines_to_echo = lines
@@ -89,14 +95,14 @@ class Dumper:
if next(lines, None):
self.echo("(cut off)", ident=4, dim=True)
if ctx.options.flow_detail >= 2:
if self.flow_detail >= 2:
self.echo("")
def _echo_request_line(self, flow):
if flow.client_conn:
client = click.style(
strutils.escape_control_characters(
human.format_address(flow.client_conn.address)
repr(flow.client_conn.address)
)
)
elif flow.request.is_replay:
@@ -115,12 +121,12 @@ class Dumper:
fg=method_color,
bold=True
)
if ctx.options.showhost:
if self.showhost:
url = flow.request.pretty_url
else:
url = flow.request.url
terminalWidthLimit = max(shutil.get_terminal_size()[0] - 25, 50)
if ctx.options.flow_detail < 1 and len(url) > terminalWidthLimit:
if self.flow_detail < 1 and len(url) > terminalWidthLimit:
            url = url[:terminalWidthLimit] + "…"
url = click.style(strutils.escape_control_characters(url), bold=True)
@@ -170,7 +176,7 @@ class Dumper:
size = click.style(size, bold=True)
arrows = click.style(" <<", bold=True)
if ctx.options.flow_detail == 1:
if self.flow_detail == 1:
# This aligns the HTTP response code with the HTTP request method:
# 127.0.0.1:59519: GET http://example.com/
# << 304 Not Modified 0b
@@ -188,16 +194,16 @@ class Dumper:
def echo_flow(self, f):
if f.request:
self._echo_request_line(f)
if ctx.options.flow_detail >= 2:
if self.flow_detail >= 2:
self._echo_headers(f.request.headers)
if ctx.options.flow_detail >= 3:
if self.flow_detail >= 3:
self._echo_message(f.request)
if f.response:
self._echo_response_line(f)
if ctx.options.flow_detail >= 2:
if self.flow_detail >= 2:
self._echo_headers(f.response.headers)
if ctx.options.flow_detail >= 3:
if self.flow_detail >= 3:
self._echo_message(f.response)
if f.error:
@@ -205,7 +211,7 @@ class Dumper:
self.echo(" << {}".format(msg), bold=True, fg="red")
def match(self, f):
if ctx.options.flow_detail == 0:
if self.flow_detail == 0:
return False
if not self.filter:
return True
@@ -233,7 +239,7 @@ class Dumper:
if self.match(f):
message = f.messages[-1]
self.echo(f.message_info(message))
if ctx.options.flow_detail >= 3:
if self.flow_detail >= 3:
self._echo_message(message)
def websocket_end(self, f):
@@ -261,5 +267,5 @@ class Dumper:
server=repr(f.server_conn.address),
direction=direction,
))
if ctx.options.flow_detail >= 3:
if self.flow_detail >= 3:
self._echo_message(message)

View File

@@ -1,75 +0,0 @@
import typing
from mitmproxy import command
from mitmproxy import flow
from mitmproxy import exceptions
from mitmproxy.utils import strutils
from mitmproxy.net.http.http1 import assemble
import pyperclip
def curl_command(f: flow.Flow) -> str:
if not hasattr(f, "request"):
raise exceptions.CommandError("Can't export flow with no request.")
data = "curl "
request = f.request.copy() # type: ignore
request.decode(strict=False)
for k, v in request.headers.items(multi=True):
data += "-H '%s:%s' " % (k, v)
if request.method != "GET":
data += "-X %s " % request.method
data += "'%s'" % request.url
if request.content:
data += " --data-binary '%s'" % strutils.bytes_to_escaped_str(
request.content,
escape_single_quotes=True
)
return data
def raw(f: flow.Flow) -> bytes:
if not hasattr(f, "request"):
raise exceptions.CommandError("Can't export flow with no request.")
return assemble.assemble_request(f.request) # type: ignore
formats = dict(
curl = curl_command,
raw = raw,
)
class Export():
@command.command("export.formats")
def formats(self) -> typing.Sequence[str]:
"""
Return a list of the supported export formats.
"""
return list(sorted(formats.keys()))
@command.command("export.file")
def file(self, fmt: str, f: flow.Flow, path: str) -> None:
"""
Export a flow to path.
"""
if fmt not in formats:
raise exceptions.CommandError("No such export format: %s" % fmt)
func = formats[fmt] # type: typing.Any
v = func(f)
with open(path, "wb") as fp:
if isinstance(v, bytes):
fp.write(v)
else:
fp.write(v.encode("utf-8"))
@command.command("export.clip")
def clip(self, fmt: str, f: flow.Flow) -> None:
"""
Export a flow to the system clipboard.
"""
if fmt not in formats:
raise exceptions.CommandError("No such export format: %s" % fmt)
func = formats[fmt] # type: typing.Any
v = strutils.always_str(func(f))
pyperclip.copy(v)

View File

@@ -1,21 +1,20 @@
from mitmproxy import flowfilter
from mitmproxy import exceptions
from mitmproxy import ctx
class Intercept:
def __init__(self):
self.filt = None
def configure(self, updated):
def configure(self, opts, updated):
if "intercept" in updated:
if not ctx.options.intercept:
if not opts.intercept:
self.filt = None
return
self.filt = flowfilter.parse(ctx.options.intercept)
self.filt = flowfilter.parse(opts.intercept)
if not self.filt:
raise exceptions.OptionsError(
"Invalid interception filter: %s" % ctx.options.intercept
"Invalid interception filter: %s" % opts.intercept
)
def process_flow(self, f):

View File

@@ -1,7 +0,0 @@
from mitmproxy import ctx
class KeepServing:
def event_processing_complete(self):
if not ctx.master.options.keepserving:
ctx.master.shutdown()

View File

@@ -1,18 +1,17 @@
from mitmproxy.addons import wsgiapp
from mitmproxy.addons.onboardingapp import app
from mitmproxy import ctx
class Onboarding(wsgiapp.WSGIApp):
name = "onboarding"
def __init__(self):
super().__init__(app.Adapter(app.application), None, None)
self.enabled = False
def configure(self, updated):
self.host = ctx.options.onboarding_host
self.port = ctx.options.onboarding_port
def configure(self, options, updated):
self.host = options.onboarding_host
self.port = options.onboarding_port
self.enabled = options.onboarding
def request(self, f):
if ctx.options.onboarding:
if self.enabled:
super().request(f)

View File

@@ -1,8 +1,7 @@
import binascii
import weakref
import ldap3
from typing import Optional
from typing import MutableMapping # noqa
from typing import Set # noqa
from typing import Tuple
import passlib.apache
@@ -11,7 +10,6 @@ import mitmproxy.net.http
from mitmproxy import connections # noqa
from mitmproxy import exceptions
from mitmproxy import http
from mitmproxy import ctx
from mitmproxy.net.http import status_codes
REALM = "mitmproxy"
@@ -47,12 +45,12 @@ class ProxyAuth:
self.nonanonymous = False
self.htpasswd = None
self.singleuser = None
self.ldapserver = None
self.authenticated = weakref.WeakKeyDictionary() # type: MutableMapping[connections.ClientConnection, Tuple[str, str]]
self.mode = None
self.authenticated = weakref.WeakSet() # type: Set[connections.ClientConnection]
"""Contains all connections that are permanently authenticated after an HTTP CONNECT"""
def enabled(self) -> bool:
return any([self.nonanonymous, self.htpasswd, self.singleuser, self.ldapserver])
return any([self.nonanonymous, self.htpasswd, self.singleuser])
def is_proxy_auth(self) -> bool:
"""
@@ -60,7 +58,7 @@ class ProxyAuth:
- True, if authentication is done as if mitmproxy is a proxy
- False, if authentication is done as if mitmproxy is a HTTP server
"""
return ctx.options.mode in ("regular", "upstream")
return self.mode in ("regular", "upstream")
def which_auth_header(self) -> str:
if self.is_proxy_auth():
@@ -101,20 +99,7 @@ class ProxyAuth:
elif self.htpasswd:
if self.htpasswd.check_password(username, password):
return username, password
elif self.ldapserver:
if not username or not password:
return None
dn = ctx.options.proxyauth.split(":")[2]
parts = dn.split("?")
conn = ldap3.Connection(
self.ldapserver,
parts[0] + username + parts[1],
password,
auto_bind=True)
if conn:
conn.search(parts[1][1:], '(' + parts[0] + username + ')', attributes=['objectclass'])
if ctx.options.proxyauth.split(":")[3] in conn.entries[0]['objectclass']:
return username, password
return None
def authenticate(self, f: http.HTTPFlow) -> bool:
@@ -128,51 +113,39 @@ class ProxyAuth:
return False
# Handlers
def configure(self, updated):
if "proxyauth" in updated:
self.nonanonymous = False
self.singleuser = None
self.htpasswd = None
self.ldapserver = None
if ctx.options.proxyauth:
if ctx.options.proxyauth == "any":
self.nonanonymous = True
elif ctx.options.proxyauth.startswith("@"):
p = ctx.options.proxyauth[1:]
try:
self.htpasswd = passlib.apache.HtpasswdFile(p)
except (ValueError, OSError) as v:
raise exceptions.OptionsError(
"Could not open htpasswd file: %s" % p
)
elif ctx.options.proxyauth.startswith("ldap"):
parts = ctx.options.proxyauth.split(":")
if len(parts) != 4:
raise exceptions.OptionsError(
"Invalid ldap specification"
)
if parts[0] == "ldaps":
server = ldap3.Server(parts[1], use_ssl=True)
elif parts[0] == "ldap":
server = ldap3.Server(parts[1])
else:
raise exceptions.OptionsError(
                            "Invalid ldap specification on the first part"
)
self.ldapserver = server
else:
parts = ctx.options.proxyauth.split(':')
if len(parts) != 2:
raise exceptions.OptionsError(
"Invalid single-user auth specification."
)
self.singleuser = parts
def configure(self, options, updated):
if "auth_nonanonymous" in updated:
self.nonanonymous = options.auth_nonanonymous
if "auth_singleuser" in updated:
if options.auth_singleuser:
parts = options.auth_singleuser.split(':')
if len(parts) != 2:
raise exceptions.OptionsError(
"Invalid single-user auth specification."
)
self.singleuser = parts
else:
self.singleuser = None
if "auth_htpasswd" in updated:
if options.auth_htpasswd:
try:
self.htpasswd = passlib.apache.HtpasswdFile(
options.auth_htpasswd
)
except (ValueError, OSError) as v:
raise exceptions.OptionsError(
"Could not open htpasswd file: %s" % v
)
else:
self.htpasswd = None
if "mode" in updated:
self.mode = options.mode
if self.enabled():
if ctx.options.mode == "transparent":
if options.mode == "transparent":
raise exceptions.OptionsError(
"Proxy Authentication not supported in transparent mode."
)
if ctx.options.mode == "socks5":
if options.mode == "socks5":
raise exceptions.OptionsError(
"Proxy Authentication not supported in SOCKS mode. "
"https://github.com/mitmproxy/mitmproxy/issues/738"
@@ -182,12 +155,11 @@ class ProxyAuth:
def http_connect(self, f: http.HTTPFlow) -> None:
if self.enabled():
if self.authenticate(f):
self.authenticated[f.client_conn] = f.metadata["proxyauth"]
self.authenticated.add(f.client_conn)
def requestheaders(self, f: http.HTTPFlow) -> None:
if self.enabled():
# Is this connection authenticated by a previous HTTP CONNECT?
if f.client_conn in self.authenticated:
f.metadata["proxyauth"] = self.authenticated[f.client_conn]
return
self.authenticate(f)

View File

@@ -1,56 +0,0 @@
import os.path
import sys
import typing
from mitmproxy import ctx
from mitmproxy import exceptions
from mitmproxy import io
class ReadFile:
"""
An addon that handles reading from file on startup.
"""
def load_flows(self, fo: typing.IO[bytes]) -> int:
cnt = 0
freader = io.FlowReader(fo)
try:
for flow in freader.stream():
ctx.master.load_flow(flow)
cnt += 1
except (IOError, exceptions.FlowReadException) as e:
if cnt:
ctx.log.warn("Flow file corrupted - loaded %i flows." % cnt)
else:
ctx.log.error("Flow file corrupted.")
raise exceptions.FlowReadException(str(e)) from e
else:
return cnt
def load_flows_from_path(self, path: str) -> int:
path = os.path.expanduser(path)
try:
with open(path, "rb") as f:
return self.load_flows(f)
except IOError as e:
ctx.log.error("Cannot load flows: {}".format(e))
raise exceptions.FlowReadException(str(e)) from e
def running(self):
if ctx.options.rfile:
try:
self.load_flows_from_path(ctx.options.rfile)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(e) from e
finally:
ctx.master.addons.trigger("processing_complete")
class ReadFileStdin(ReadFile):
"""Support the special case of "-" for reading from stdin"""
def load_flows_from_path(self, path: str) -> int:
if path == "-":
return self.load_flows(sys.stdin.buffer)
else:
return super().load_flows_from_path(path)

View File

@@ -1,4 +1,3 @@
import os
import re
from mitmproxy import exceptions
@@ -43,22 +42,25 @@ def parse_hook(s):
return patt, a, b
class Replace:
class _ReplaceBase:
def __init__(self):
self.lst = []
def configure(self, updated):
def configure(self, options, updated):
"""
.replacements is a list of tuples (fpat, rex, s):
fpatt: a string specifying a filter pattern.
rex: a regular expression, as string.
s: the replacement string
rex: a regular expression, as bytes.
s: the replacement string, as bytes
"""
if "replacements" in updated:
if self.optionName in updated:
lst = []
for rep in ctx.options.replacements:
fpatt, rex, s = parse_hook(rep)
for rep in getattr(options, self.optionName):
if isinstance(rep, str):
fpatt, rex, s = parse_hook(rep)
else:
fpatt, rex, s = rep
flt = flowfilter.parse(fpatt)
if not flt:
@@ -66,16 +68,11 @@ class Replace:
"Invalid filter pattern: %s" % fpatt
)
try:
# We should ideally escape here before trying to compile
re.compile(rex)
except re.error as e:
raise exceptions.OptionsError(
"Invalid regular expression: %s - %s" % (rex, str(e))
)
if s.startswith("@") and not os.path.isfile(s[1:]):
raise exceptions.OptionsError(
"Invalid file path: {}".format(s[1:])
)
lst.append((rex, s, flt))
self.lst = lst
@@ -95,13 +92,21 @@ class Replace:
if not flow.reply.has_message:
self.execute(flow)
class Replace(_ReplaceBase):
optionName = "replacements"
def replace(self, obj, rex, s):
if s.startswith("@"):
s = os.path.expanduser(s[1:])
try:
with open(s, "rb") as f:
s = f.read()
except IOError:
ctx.log.warn("Could not read replacement file: %s" % s)
return
obj.replace(rex, s, flags=re.DOTALL)
class ReplaceFile(_ReplaceBase):
optionName = "replacement_files"
def replace(self, obj, rex, s):
        try:
            with open(s, "rb") as f:
                v = f.read()
        except IOError:
ctx.log.warn("Could not read replacement file: %s" % s)
return
obj.replace(rex, v, flags=re.DOTALL)
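Both replacers above call `obj.replace(rex, s, flags=re.DOTALL)`. For a raw body this boils down to `re.sub` with DOTALL, which lets `.` span newlines so a pattern can match across the whole message body — a minimal sketch:

```python
import re


def replace_body(body: bytes, rex: bytes, s: bytes) -> bytes:
    """Sketch of what obj.replace(rex, s, flags=re.DOTALL) amounts to on a
    raw body: re.DOTALL makes '.' match newlines as well."""
    return re.sub(rex, s, body, flags=re.DOTALL)


# Without DOTALL, b"foo.bar" would not match across the newline.
assert replace_body(b"foo\nbar", b"foo.bar", b"baz") == b"baz"
```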

View File

@@ -1,93 +0,0 @@
import os.path
import typing
from mitmproxy import exceptions
from mitmproxy import flowfilter
from mitmproxy import io
from mitmproxy import ctx
from mitmproxy import flow
class Save:
def __init__(self):
self.stream = None
self.filt = None
self.active_flows = set() # type: Set[flow.Flow]
def open_file(self, path):
if path.startswith("+"):
path = path[1:]
mode = "ab"
else:
mode = "wb"
path = os.path.expanduser(path)
return open(path, mode)
def start_stream_to_path(self, path, flt):
try:
f = self.open_file(path)
except IOError as v:
raise exceptions.OptionsError(str(v))
self.stream = io.FilteredFlowWriter(f, flt)
self.active_flows = set()
def configure(self, updated):
# We're already streaming - stop the previous stream and restart
if "save_stream_filter" in updated:
if ctx.options.save_stream_filter:
self.filt = flowfilter.parse(ctx.options.save_stream_filter)
if not self.filt:
raise exceptions.OptionsError(
"Invalid filter specification: %s" % ctx.options.save_stream_filter
)
else:
self.filt = None
if "save_stream_file" in updated:
if self.stream:
self.done()
if ctx.options.save_stream_file:
self.start_stream_to_path(ctx.options.save_stream_file, self.filt)
def save(self, flows: typing.Sequence[flow.Flow], path: str) -> None:
"""
Save flows to a file. If the path starts with a +, flows are
appended to the file, otherwise it is over-written.
"""
try:
f = self.open_file(path)
except IOError as v:
raise exceptions.CommandError(v) from v
stream = io.FlowWriter(f)
for i in flows:
stream.add(i)
f.close()
ctx.log.alert("Saved %s flows." % len(flows))
def load(self, l):
l.add_command("save.file", self.save)
def tcp_start(self, flow):
if self.stream:
self.active_flows.add(flow)
def tcp_end(self, flow):
if self.stream:
self.stream.add(flow)
self.active_flows.discard(flow)
def response(self, flow):
if self.stream:
self.stream.add(flow)
self.active_flows.discard(flow)
def request(self, flow):
if self.stream:
self.active_flows.add(flow)
def done(self):
if self.stream:
for f in self.active_flows:
self.stream.add(f)
self.active_flows = set([])
self.stream.fo.close()
self.stream = None

View File

@@ -1,145 +1,269 @@
import contextlib
import os
import importlib
import time
import shlex
import sys
import typing
import threading
import traceback
import types
from mitmproxy import addonmanager
from mitmproxy import exceptions
from mitmproxy import flow
from mitmproxy import command
from mitmproxy import eventsequence
from mitmproxy import ctx
from mitmproxy import eventsequence
def load_script(actx, path):
if not os.path.exists(path):
ctx.log.info("No such file: %s" % path)
return
loader = importlib.machinery.SourceFileLoader(os.path.basename(path), path)
import watchdog.events
from watchdog.observers import polling
def parse_command(command):
"""
Returns a (path, args) tuple.
"""
if not command or not command.strip():
raise ValueError("Empty script command.")
# Windows: escape all backslashes in the path.
if os.name == "nt": # pragma: no cover
backslashes = shlex.split(command, posix=False)[0].count("\\")
command = command.replace("\\", "\\\\", backslashes)
args = shlex.split(command) # pragma: no cover
args[0] = os.path.expanduser(args[0])
if not os.path.exists(args[0]):
raise ValueError(
("Script file not found: %s.\r\n"
"If your script path contains spaces, "
"make sure to wrap it in additional quotes, e.g. -s \"'./foo bar/baz.py' --args\".") %
args[0])
elif os.path.isdir(args[0]):
raise ValueError("Not a file: %s" % args[0])
return args[0], args[1:]
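The Windows backslash handling in `parse_command` above exists because POSIX-mode `shlex` treats `\` as an escape character, which mangles Windows paths unless the backslashes are doubled first:

```python
import shlex

# Why parse_command escapes backslashes on Windows: POSIX-mode shlex
# consumes "\" as an escape character, mangling Windows paths.
assert shlex.split(r"C:\tools\script.py --arg", posix=True) == \
    ["C:toolsscript.py", "--arg"]

# Doubling the backslashes first (as parse_command does) preserves them:
assert shlex.split(r"C:\\tools\\script.py --arg", posix=True) == \
    [r"C:\tools\script.py", "--arg"]
```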
def cut_traceback(tb, func_name):
"""
Cut off a traceback at the function with the given name.
The func_name's frame is excluded.
Args:
tb: traceback object, as returned by sys.exc_info()[2]
func_name: function name
Returns:
Reduced traceback.
"""
tb_orig = tb
for _, _, fname, _ in traceback.extract_tb(tb):
tb = tb.tb_next
if fname == func_name:
break
if tb is None:
# We could not find the method, take the full stack trace.
# This may happen on some Python interpreters/flavors (e.g. PyInstaller).
return tb_orig
else:
return tb
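To make the trimming behavior concrete, here is a standalone copy of `cut_traceback` exercised on a small call chain — frames up to and including the named function's frame are dropped:

```python
import sys
import traceback


def cut_traceback(tb, func_name):
    """Standalone copy of the helper above: drop frames up to and
    including func_name's frame; fall back to the full traceback if the
    frame is not found."""
    tb_orig = tb
    for _, _, fname, _ in traceback.extract_tb(tb):
        tb = tb.tb_next
        if fname == func_name:
            break
    return tb_orig if tb is None else tb


def inner():
    raise ValueError("boom")


def outer():
    inner()


def demo():
    try:
        outer()
    except ValueError:
        tb = sys.exc_info()[2]
    # Full frame list is [demo, outer, inner]; cutting at "outer" leaves
    # only the frames below it.
    trimmed = cut_traceback(tb, "outer")
    return [f.name for f in traceback.extract_tb(trimmed)]


assert demo() == ["inner"]
```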
@contextlib.contextmanager
def scriptenv(path, args):
oldargs = sys.argv
sys.argv = [path] + args
script_dir = os.path.dirname(os.path.abspath(path))
sys.path.append(script_dir)
try:
oldpath = sys.path
sys.path.insert(0, os.path.dirname(path))
with addonmanager.safecall():
m = loader.load_module()
if not getattr(m, "name", None):
m.name = path
return m
yield
except SystemExit as v:
ctx.log.error("Script exited with code %s" % v.code)
except Exception:
etype, value, tb = sys.exc_info()
tb = cut_traceback(tb, "scriptenv").tb_next
ctx.log.error(
"Script error: %s" % "".join(
traceback.format_exception(etype, value, tb)
)
)
finally:
sys.path[:] = oldpath
sys.argv = oldargs
sys.path.pop()
def load_script(path, args):
with open(path, "rb") as f:
try:
code = compile(f.read(), path, 'exec')
except SyntaxError as e:
ctx.log.error(
"Script error: %s line %s: %s" % (
e.filename, e.lineno, e.msg
)
)
return
ns = {'__file__': os.path.abspath(path)}
with scriptenv(path, args):
exec(code, ns)
return types.SimpleNamespace(**ns)
class ReloadHandler(watchdog.events.FileSystemEventHandler):
def __init__(self, callback):
self.callback = callback
def filter(self, event):
"""
Returns True only when .py file is changed
"""
if event.is_directory:
return False
if os.path.basename(event.src_path).startswith("."):
return False
if event.src_path.endswith(".py"):
return True
return False
def on_modified(self, event):
if self.filter(event):
self.callback()
def on_created(self, event):
if self.filter(event):
self.callback()
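The filter logic in `ReloadHandler` above only fires for visible `.py` files. A standalone copy of that predicate, using a `SimpleNamespace` as a stand-in for watchdog's `FileSystemEvent`:

```python
import os
from types import SimpleNamespace


def should_reload(event) -> bool:
    """Standalone copy of ReloadHandler.filter: only events for
    non-hidden .py files trigger a reload."""
    if event.is_directory:
        return False
    if os.path.basename(event.src_path).startswith("."):
        return False
    return event.src_path.endswith(".py")


assert should_reload(SimpleNamespace(is_directory=False, src_path="addon.py"))
assert not should_reload(SimpleNamespace(is_directory=False, src_path=".#addon.py"))
assert not should_reload(SimpleNamespace(is_directory=True, src_path="pkg"))
```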
class Script:
"""
An addon that manages a single script.
"""
ReloadInterval = 2
def __init__(self, command):
self.name = command
def __init__(self, path):
self.name = "scriptmanager:" + path
self.path = path
self.fullpath = os.path.expanduser(path)
self.command = command
self.path, self.args = parse_command(command)
self.ns = None
self.observer = None
self.dead = False
self.last_load = 0
self.last_mtime = 0
if not os.path.isfile(self.fullpath):
raise exceptions.OptionsError("No such script: %s" % path)
self.last_options = None
self.should_reload = threading.Event()
@property
def addons(self):
return [self.ns] if self.ns else []
for i in eventsequence.Events:
if not hasattr(self, i):
def mkprox():
evt = i
def prox(*args, **kwargs):
self.run(evt, *args, **kwargs)
return prox
setattr(self, i, mkprox())
def run(self, name, *args, **kwargs):
# It's possible for ns to be un-initialised if we failed during
# configure
if self.ns is not None and not self.dead:
func = getattr(self.ns, name, None)
if func:
with scriptenv(self.path, self.args):
return func(*args, **kwargs)
def reload(self):
self.should_reload.set()
def load_script(self):
self.ns = load_script(self.path, self.args)
ret = self.run("start")
if ret:
self.ns = ret
self.run("start")
def tick(self):
if time.time() - self.last_load > self.ReloadInterval:
mtime = os.stat(self.fullpath).st_mtime
if mtime > self.last_mtime:
ctx.log.info("Loading script: %s" % self.path)
if self.ns:
ctx.master.addons.remove(self.ns)
self.ns = load_script(ctx, self.fullpath)
if self.ns:
# We're already running, so we have to explicitly register and
# configure the addon
ctx.master.addons.register(self.ns)
ctx.master.addons.invoke_addon(self.ns, "running")
ctx.master.addons.invoke_addon(
self.ns,
"configure",
ctx.options.keys()
)
self.last_load = time.time()
self.last_mtime = mtime
if self.should_reload.is_set():
self.should_reload.clear()
ctx.log.info("Reloading script: %s" % self.name)
self.ns = load_script(self.path, self.args)
self.start()
self.configure(self.last_options, self.last_options.keys())
else:
self.run("tick")
def start(self):
self.load_script()
def configure(self, options, updated):
self.last_options = options
if not self.observer:
self.observer = polling.PollingObserver()
# Bind the handler to the real underlying master object
self.observer.schedule(
ReloadHandler(self.reload),
os.path.dirname(self.path) or "."
)
self.observer.start()
self.run("configure", options, updated)
def done(self):
self.run("done")
self.dead = True
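The mtime-based reload check in `Script.tick` above can be sketched as a small standalone helper; the class and method names here are illustrative, not mitmproxy API:

```python
import os
import time

class ReloadWatcher:
    """Polls a file's mtime and reports when it changes, as Script.tick does."""
    def __init__(self, path, interval=1.0):
        self.path = path
        self.interval = interval
        self.last_check = 0.0
        self.last_mtime = 0.0

    def should_reload(self, now=None):
        now = time.time() if now is None else now
        if now - self.last_check < self.interval:
            return False  # throttle stat() calls, like ReloadInterval
        self.last_check = now
        mtime = os.stat(self.path).st_mtime
        if mtime > self.last_mtime:
            # The very first observation just records the baseline mtime.
            changed = self.last_mtime != 0.0
            self.last_mtime = mtime
            return changed
        return False
```

The real addon additionally uses a watchdog `PollingObserver` to set a reload flag from a background thread; this sketch shows only the polling half.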
class ScriptLoader:
"""
An addon that manages loading scripts from options.
"""
def __init__(self):
self.is_running = False
self.addons = []
def running(self):
self.is_running = True
@command.command("script.run")
def script_run(self, flows: typing.Sequence[flow.Flow], path: str) -> None:
"""
Run a script on the specified flows. The script is loaded with
default options, and all lifecycle events for each flow are
simulated.
"""
def run_once(self, command, flows):
try:
s = Script(path)
l = addonmanager.Loader(ctx.master)
ctx.master.addons.invoke_addon(s, "load", l)
ctx.master.addons.invoke_addon(s, "configure", ctx.options.keys())
# Script is loaded on the first tick
ctx.master.addons.invoke_addon(s, "tick")
for f in flows:
for evt, arg in eventsequence.iterate(f):
ctx.master.addons.invoke_addon(s, evt, arg)
except exceptions.OptionsError as e:
raise exceptions.CommandError("Error running script: %s" % e) from e
sc = Script(command)
except ValueError as e:
raise ValueError(str(e))
sc.load_script()
for f in flows:
for evt, o in eventsequence.iterate(f):
sc.run(evt, o)
sc.done()
return sc
def configure(self, updated):
def configure(self, options, updated):
if "scripts" in updated:
for s in ctx.options.scripts:
if ctx.options.scripts.count(s) > 1:
for s in options.scripts:
if options.scripts.count(s) > 1:
raise exceptions.OptionsError("Duplicate script: %s" % s)
for a in self.addons[:]:
if a.path not in ctx.options.scripts:
for a in ctx.master.addons.chain[:]:
if isinstance(a, Script) and a.name not in options.scripts:
ctx.log.info("Un-loading script: %s" % a.name)
ctx.master.addons.remove(a)
self.addons.remove(a)
# The machinations below are to ensure that:
# - Scripts remain in the same order
# - Scripts are listed directly after the script addon. This is
# needed to ensure that interactions with, for instance, flow
# serialization remains correct.
# - Scripts are not initialized unnecessarily. If only a
# script's order in the script list has changed, it is just
# moved.
# script's order in the script list has changed, it should simply
# be moved.
current = {}
for a in self.addons:
current[a.path] = a
for a in ctx.master.addons.chain[:]:
if isinstance(a, Script):
current[a.name] = a
ctx.master.addons.chain.remove(a)
ordered = []
newscripts = []
for s in ctx.options.scripts:
for s in options.scripts:
if s in current:
ordered.append(current[s])
else:
sc = Script(s)
ctx.log.info("Loading script: %s" % s)
try:
sc = Script(s)
except ValueError as e:
raise exceptions.OptionsError(str(e))
ordered.append(sc)
newscripts.append(sc)
self.addons = ordered
ochain = ctx.master.addons.chain
pos = ochain.index(self)
ctx.master.addons.chain = ochain[:pos + 1] + ordered + ochain[pos + 1:]
for s in newscripts:
ctx.master.addons.register(s)
if self.is_running:
# If we're already running, we configure and tell the addon
# we're up and running.
ctx.master.addons.invoke_addon(s, "running")
ctx.master.addons.startup(s)
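The re-ordering machinations in `ScriptLoader.configure` above (keep scripts in option order, directly after the loader, and move rather than re-initialize already-loaded scripts) can be expressed as a pure function over the addon chain; all names below are illustrative:

```python
def reorder_scripts(chain, loader, wanted, current, make):
    """Rebuild `chain` so scripts appear right after `loader`, in `wanted` order.

    chain:   list of addons; existing scripts are the values of `current`
    current: dict mapping script path -> already-loaded script object
    make:    factory for paths not loaded yet
    Returns the new chain and the list of newly created scripts.
    """
    # Pull existing scripts out; everything else keeps its position.
    rest = [a for a in chain if a not in current.values()]
    ordered, new = [], []
    for path in wanted:
        if path in current:
            ordered.append(current[path])  # moved, not re-initialized
        else:
            s = make(path)
            ordered.append(s)
            new.append(s)
    pos = rest.index(loader)
    return rest[:pos + 1] + ordered + rest[pos + 1:], new
```

Placing scripts directly after the loader preserves the invariant the comment mentions: interactions such as flow serialization see scripts in a deterministic position.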


@@ -1,50 +1,29 @@
import hashlib
import urllib
import typing
from typing import Any # noqa
from typing import List # noqa
from mitmproxy import ctx
from mitmproxy import flow
from mitmproxy import exceptions
from mitmproxy import io
from mitmproxy import command
class ServerPlayback:
def __init__(self):
self.options = None
self.flowmap = {}
self.stop = False
self.final_flow = None
self.configured = False
@command.command("replay.server")
def load_flows(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Replay server responses from flows.
"""
self.flowmap = {}
def load(self, flows):
for i in flows:
if i.response: # type: ignore
if i.response:
l = self.flowmap.setdefault(self._hash(i), [])
l.append(i)
ctx.master.addons.trigger("update", [])
@command.command("replay.server.file")
def load_file(self, path: str) -> None:
try:
flows = io.read_flows_from_paths([path])
except exceptions.FlowReadException as e:
raise exceptions.CommandError(str(e))
self.load_flows(flows)
@command.command("replay.server.stop")
def clear(self) -> None:
"""
Stop server replay.
"""
def clear(self):
self.flowmap = {}
ctx.master.addons.trigger("update", [])
def count(self):
return sum([len(i) for i in self.flowmap.values()])
@@ -59,27 +38,27 @@ class ServerPlayback:
queriesArray = urllib.parse.parse_qsl(query, keep_blank_values=True)
key = [str(r.port), str(r.scheme), str(r.method), str(path)] # type: List[Any]
if not ctx.options.server_replay_ignore_content:
if ctx.options.server_replay_ignore_payload_params and r.multipart_form:
if not self.options.server_replay_ignore_content:
if self.options.server_replay_ignore_payload_params and r.multipart_form:
key.extend(
(k, v)
for k, v in r.multipart_form.items(multi=True)
if k.decode(errors="replace") not in ctx.options.server_replay_ignore_payload_params
if k.decode(errors="replace") not in self.options.server_replay_ignore_payload_params
)
elif ctx.options.server_replay_ignore_payload_params and r.urlencoded_form:
elif self.options.server_replay_ignore_payload_params and r.urlencoded_form:
key.extend(
(k, v)
for k, v in r.urlencoded_form.items(multi=True)
if k not in ctx.options.server_replay_ignore_payload_params
if k not in self.options.server_replay_ignore_payload_params
)
else:
key.append(str(r.raw_content))
if not ctx.options.server_replay_ignore_host:
if not self.options.server_replay_ignore_host:
key.append(r.host)
filtered = []
ignore_params = ctx.options.server_replay_ignore_params or []
ignore_params = self.options.server_replay_ignore_params or []
for p in queriesArray:
if p[0] not in ignore_params:
filtered.append(p)
@@ -87,9 +66,9 @@ class ServerPlayback:
key.append(p[0])
key.append(p[1])
if ctx.options.server_replay_use_headers:
if self.options.server_replay_use_headers:
headers = []
for i in ctx.options.server_replay_use_headers:
for i in self.options.server_replay_use_headers:
v = r.headers.get(i)
headers.append((i, v))
key.append(headers)
@@ -104,7 +83,7 @@ class ServerPlayback:
"""
hsh = self._hash(request)
if hsh in self.flowmap:
if ctx.options.server_replay_nopop:
if self.options.server_replay_nopop:
return self.flowmap[hsh][0]
else:
ret = self.flowmap[hsh].pop(0)
@@ -112,18 +91,20 @@ class ServerPlayback:
del self.flowmap[hsh]
return ret
def configure(self, updated):
if not self.configured and ctx.options.server_replay:
self.configured = True
try:
flows = io.read_flows_from_paths(ctx.options.server_replay)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(str(e))
self.load_flows(flows)
def configure(self, options, updated):
self.options = options
if "server_replay" in updated:
self.clear()
if options.server_replay:
try:
flows = io.read_flows_from_paths(options.server_replay)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(str(e))
self.load(flows)
def tick(self):
if self.stop and not self.final_flow.live:
ctx.master.addons.trigger("processing_complete")
ctx.master.shutdown()
def request(self, f):
if self.flowmap:
@@ -131,13 +112,13 @@ class ServerPlayback:
if rflow:
response = rflow.response.copy()
response.is_replay = True
if ctx.options.refresh_server_playback:
if self.options.refresh_server_playback:
response.refresh()
f.response = response
if not self.flowmap:
if not self.flowmap and not self.options.keepserving:
self.final_flow = f
self.stop = True
elif ctx.options.replay_kill_extra:
elif self.options.replay_kill_extra:
ctx.log.warn(
"server_playback: killed non-replay request {}".format(
f.request.url
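The replay machinery above keys recorded flows by a hash of selected request properties and pops matches unless `server_replay_nopop` is set. A simplified standalone sketch — the exact fields hashed here are an approximation of `_hash`, not the full option-driven logic:

```python
import hashlib
import urllib.parse

def replay_key(method, url, ignore_params=()):
    """Reduce a request to a hashable replay key, dropping ignored query params."""
    parts = urllib.parse.urlsplit(url)
    query = urllib.parse.parse_qsl(parts.query, keep_blank_values=True)
    kept = [p for p in query if p[0] not in ignore_params]
    key = repr([method, parts.scheme, parts.netloc, parts.path, sorted(kept)])
    return hashlib.sha256(key.encode()).digest()

def next_response(flowmap, key, nopop=False):
    """Return the next recorded response for `key`; pop it unless nopop is set."""
    if key not in flowmap:
        return None
    if nopop:
        return flowmap[key][0]
    ret = flowmap[key].pop(0)
    if not flowmap[key]:
        del flowmap[key]  # exhausted entries disappear, as in ServerPlayback
    return ret
```

With `nopop`, the same recorded response can be served repeatedly; without it, replay is one-shot per recording.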


@@ -1,6 +1,5 @@
from mitmproxy import exceptions
from mitmproxy import flowfilter
from mitmproxy import ctx
def parse_setheader(s):
@@ -44,11 +43,21 @@ class SetHeaders:
def __init__(self):
self.lst = []
def configure(self, updated):
def configure(self, options, updated):
"""
options.setheaders is a tuple of (fpatt, header, value)
fpatt: String specifying a filter pattern.
header: Header name.
value: Header value string
"""
if "setheaders" in updated:
self.lst = []
for shead in ctx.options.setheaders:
fpatt, header, value = parse_setheader(shead)
for shead in options.setheaders:
if isinstance(shead, str):
fpatt, header, value = parse_setheader(shead)
else:
fpatt, header, value = shead
flt = flowfilter.parse(fpatt)
if not flt:
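The `setheaders` specs parsed above are filter/header/value triples. A standalone parser sketch, assuming mitmproxy's convention that the first character of the spec picks the separator (a simplification of the real `parse_setheader`):

```python
def parse_setheader(s):
    """Parse "{sep}filter{sep}header{sep}value", where the first character
    of the spec is the separator."""
    if not s:
        raise ValueError("empty spec")
    sep, rest = s[0], s[1:]
    parts = rest.split(sep, 2)
    if len(parts) != 3:
        raise ValueError("invalid spec: %r" % s)
    return tuple(parts)
```

Letting the user choose the separator avoids escaping when the filter or value itself contains slashes.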


@@ -1,6 +1,5 @@
from mitmproxy import exceptions
from mitmproxy import flowfilter
from mitmproxy import ctx
class StickyAuth:
@@ -8,13 +7,13 @@ class StickyAuth:
self.flt = None
self.hosts = {}
def configure(self, updated):
def configure(self, options, updated):
if "stickyauth" in updated:
if ctx.options.stickyauth:
flt = flowfilter.parse(ctx.options.stickyauth)
if options.stickyauth:
flt = flowfilter.parse(options.stickyauth)
if not flt:
raise exceptions.OptionsError(
"stickyauth: invalid filter expression: %s" % ctx.options.stickyauth
"stickyauth: invalid filter expression: %s" % options.stickyauth
)
self.flt = flt
else:


@@ -1,14 +1,13 @@
import collections
from http import cookiejar
from typing import List, Tuple, Dict, Optional # noqa
from mitmproxy import http, flowfilter, ctx, exceptions
from mitmproxy.net.http import cookies
TOrigin = Tuple[str, int, str]
from mitmproxy import exceptions
from mitmproxy import flowfilter
def ckey(attrs: Dict[str, str], f: http.HTTPFlow) -> TOrigin:
def ckey(attrs, f):
"""
Returns a (domain, port, path) tuple.
"""
@@ -21,32 +20,32 @@ def ckey(attrs: Dict[str, str], f: http.HTTPFlow) -> TOrigin:
return (domain, f.request.port, path)
def domain_match(a: str, b: str) -> bool:
if cookiejar.domain_match(a, b): # type: ignore
def domain_match(a, b):
if cookiejar.domain_match(a, b):
return True
elif cookiejar.domain_match(a, b.strip(".")): # type: ignore
elif cookiejar.domain_match(a, b.strip(".")):
return True
return False
class StickyCookie:
def __init__(self):
self.jar = collections.defaultdict(dict) # type: Dict[TOrigin, Dict[str, str]]
self.flt = None # type: Optional[flowfilter.TFilter]
self.jar = collections.defaultdict(dict)
self.flt = None
def configure(self, updated):
def configure(self, options, updated):
if "stickycookie" in updated:
if ctx.options.stickycookie:
flt = flowfilter.parse(ctx.options.stickycookie)
if options.stickycookie:
flt = flowfilter.parse(options.stickycookie)
if not flt:
raise exceptions.OptionsError(
"stickycookie: invalid filter expression: %s" % ctx.options.stickycookie
"stickycookie: invalid filter expression: %s" % options.stickycookie
)
self.flt = flt
else:
self.flt = None
def response(self, flow: http.HTTPFlow):
def response(self, flow):
if self.flt:
for name, (value, attrs) in flow.response.cookies.items(multi=True):
# FIXME: We now know that Cookie.py screws up some cookies with
@@ -63,21 +62,24 @@ class StickyCookie:
if not self.jar[dom_port_path]:
self.jar.pop(dom_port_path, None)
else:
self.jar[dom_port_path][name] = value
b = attrs.copy()
b.insert(0, name, value)
self.jar[dom_port_path][name] = b
def request(self, flow: http.HTTPFlow):
def request(self, flow):
if self.flt:
cookie_list = [] # type: List[Tuple[str,str]]
l = []
if flowfilter.match(self.flt, flow):
for (domain, port, path), c in self.jar.items():
for domain, port, path in self.jar.keys():
match = [
domain_match(flow.request.host, domain),
flow.request.port == port,
flow.request.path.startswith(path)
]
if all(match):
cookie_list.extend(c.items())
if cookie_list:
c = self.jar[(domain, port, path)]
l.extend([cookies.format_cookie_header(c[name].items(multi=True)) for name in c.keys()])
if l:
# FIXME: we need to formalise this...
flow.metadata["stickycookie"] = True
flow.request.headers["cookie"] = cookies.format_cookie_header(cookie_list)
flow.request.stickycookie = True
flow.request.headers["cookie"] = "; ".join(l)


@@ -1,19 +1,14 @@
from mitmproxy.net.http import http1
from mitmproxy import exceptions
from mitmproxy import ctx
from mitmproxy.utils import human
class StreamBodies:
def __init__(self):
self.max_size = None
def configure(self, updated):
if "stream_large_bodies" in updated and ctx.options.stream_large_bodies:
try:
self.max_size = human.parse_size(ctx.options.stream_large_bodies)
except ValueError as e:
raise exceptions.OptionsError(e)
def configure(self, options, updated):
self.max_size = options.stream_large_bodies
def run(self, f, is_request):
if self.max_size:


@@ -0,0 +1,68 @@
import os.path
from mitmproxy import exceptions
from mitmproxy import flowfilter
from mitmproxy import io
class StreamFile:
def __init__(self):
self.stream = None
self.filt = None
self.active_flows = set() # type: Set[flow.Flow]
def start_stream_to_path(self, path, mode, flt):
path = os.path.expanduser(path)
try:
f = open(path, mode)
except IOError as v:
raise exceptions.OptionsError(str(v))
self.stream = io.FilteredFlowWriter(f, flt)
self.active_flows = set()
def configure(self, options, updated):
# We're already streaming - stop the previous stream and restart
if "filtstr" in updated:
if options.filtstr:
self.filt = flowfilter.parse(options.filtstr)
if not self.filt:
raise exceptions.OptionsError(
"Invalid filter specification: %s" % options.filtstr
)
else:
self.filt = None
if "streamfile" in updated:
if self.stream:
self.done()
if options.streamfile:
if options.streamfile_append:
mode = "ab"
else:
mode = "wb"
self.start_stream_to_path(options.streamfile, mode, self.filt)
def tcp_start(self, flow):
if self.stream:
self.active_flows.add(flow)
def tcp_end(self, flow):
if self.stream:
self.stream.add(flow)
self.active_flows.discard(flow)
def response(self, flow):
if self.stream:
self.stream.add(flow)
self.active_flows.discard(flow)
def request(self, flow):
if self.stream:
self.active_flows.add(flow)
def done(self):
if self.stream:
for flow in self.active_flows:
self.stream.add(flow)
self.active_flows = set([])
self.stream.fo.close()
self.stream = None
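StreamFile's shape above — track in-flight items, write finished ones through a filter, and flush whatever is still active on shutdown — can be sketched generically (names are illustrative):

```python
class FilteredWriter:
    """Write only objects matching a predicate to a sink, tracking in-flight
    items so they can be flushed on shutdown."""
    def __init__(self, sink, flt=None):
        self.sink = sink
        self.flt = flt
        self.active = set()

    def start(self, item):
        self.active.add(item)       # request/tcp_start: item is in flight

    def finish(self, item):
        if self.flt is None or self.flt(item):
            self.sink.append(item)  # response/tcp_end: persist if it matches
        self.active.discard(item)

    def done(self):
        # Flush items that never saw a response, as StreamFile.done does.
        for item in list(self.active):
            self.finish(item)
```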


@@ -2,25 +2,23 @@ import sys
import click
from mitmproxy import log
from mitmproxy import ctx
# These get over-ridden by the save execution context. Keep them around so we
# can log directly.
realstdout = sys.stdout
realstderr = sys.stderr
class TermLog:
def __init__(self, outfile=None):
self.options = None
self.outfile = outfile
def configure(self, options, updated):
self.options = options
def log(self, e):
if log.log_tier(e.level) == log.log_tier("error"):
outfile = self.outfile or realstderr
outfile = self.outfile or sys.stderr
else:
outfile = self.outfile or realstdout
outfile = self.outfile or sys.stdout
if ctx.options.verbosity >= log.log_tier(e.level):
if self.options.verbosity >= log.log_tier(e.level):
click.secho(
e.msg,
file=outfile,
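The tier comparison above gates output on `verbosity >= log_tier(level)`. A sketch with a hypothetical tier table — the ordering shown is an assumption modeled on mitmproxy's log levels, not the library's actual table:

```python
# Hypothetical tier order: lower index = more important.
LOG_TIERS = ("error", "warn", "info", "alert", "debug")

def log_tier(level):
    """Map a level name to its numeric tier, or None if unknown."""
    return LOG_TIERS.index(level) if level in LOG_TIERS else None

def should_emit(level, verbosity):
    """Emit a message only if the configured verbosity reaches its tier."""
    return verbosity >= log_tier(level)
```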


@@ -1,17 +0,0 @@
from mitmproxy import ctx
from mitmproxy.utils import human
"""
A tiny addon to print the proxy status to terminal. Eventually this could
also print some stats on exit.
"""
class TermStatus:
def running(self):
if ctx.options.server:
ctx.log.info(
"Proxy server listening at http://{}".format(
human.format_address(ctx.master.server.address)
)
)


@@ -2,7 +2,6 @@ import re
import base64
from mitmproxy import exceptions
from mitmproxy import ctx
from mitmproxy.utils import strutils
@@ -27,17 +26,20 @@ class UpstreamAuth():
"""
def __init__(self):
self.auth = None
self.root_mode = None
def configure(self, updated):
def configure(self, options, updated):
# FIXME: We're doing this because our proxy core is terminally confused
# at the moment. Ideally, we should be able to check if we're in
# reverse proxy mode at the HTTP layer, so that scripts can put the
# proxy in reverse proxy mode for specific requests.
if "mode" in updated:
self.root_mode = options.mode
if "upstream_auth" in updated:
if ctx.options.upstream_auth is None:
if options.upstream_auth is None:
self.auth = None
else:
self.auth = parse_upstream_auth(ctx.options.upstream_auth)
self.auth = parse_upstream_auth(options.upstream_auth)
def http_connect(self, f):
if self.auth and f.mode == "upstream":
@@ -47,5 +49,5 @@ class UpstreamAuth():
if self.auth:
if f.mode == "upstream" and not f.server_conn.via:
f.request.headers["Proxy-Authorization"] = self.auth
elif ctx.options.mode == "reverse":
elif self.root_mode == "reverse":
f.request.headers["Proxy-Authorization"] = self.auth


@@ -18,11 +18,6 @@ import sortedcontainers
import mitmproxy.flow
from mitmproxy import flowfilter
from mitmproxy import exceptions
from mitmproxy import command
from mitmproxy import connections
from mitmproxy import ctx
from mitmproxy import io
from mitmproxy import http # noqa
# The underlying sorted list implementation expects the sort key to be stable
# for the lifetime of the object. However, if we sort by size, for instance,
@@ -39,7 +34,7 @@ class _OrderKey:
def __init__(self, view):
self.view = view
def generate(self, f: http.HTTPFlow) -> typing.Any: # pragma: no cover
def generate(self, f: mitmproxy.flow.Flow) -> typing.Any: # pragma: no cover
pass
def refresh(self, f):
@@ -69,22 +64,22 @@ class _OrderKey:
class OrderRequestStart(_OrderKey):
def generate(self, f: http.HTTPFlow) -> datetime.datetime:
def generate(self, f: mitmproxy.flow.Flow) -> datetime.datetime:
return f.request.timestamp_start or 0
class OrderRequestMethod(_OrderKey):
def generate(self, f: http.HTTPFlow) -> str:
def generate(self, f: mitmproxy.flow.Flow) -> str:
return f.request.method
class OrderRequestURL(_OrderKey):
def generate(self, f: http.HTTPFlow) -> str:
def generate(self, f: mitmproxy.flow.Flow) -> str:
return f.request.url
class OrderKeySize(_OrderKey):
def generate(self, f: http.HTTPFlow) -> int:
def generate(self, f: mitmproxy.flow.Flow) -> int:
s = 0
if f.request.raw_content:
s += len(f.request.raw_content)
@@ -114,16 +109,16 @@ class View(collections.Sequence):
self.default_order = OrderRequestStart(self)
self.orders = dict(
time = OrderRequestStart(self), method = OrderRequestMethod(self),
url = OrderRequestURL(self), size = OrderKeySize(self),
time = self.default_order,
method = OrderRequestMethod(self),
url = OrderRequestURL(self),
size = OrderKeySize(self),
)
self.order_key = self.default_order
self.order_reversed = False
self.focus_follow = False
self._view = sortedcontainers.SortedListWithKey(
key = self.order_key
)
self._view = sortedcontainers.SortedListWithKey(key = self.order_key)
# The sig_view* signals broadcast events that affect the view. That is,
# an update to a flow in the store but not in the view does not trigger
@@ -170,7 +165,7 @@ class View(collections.Sequence):
def __len__(self):
return len(self._view)
def __getitem__(self, offset) -> typing.Any:
def __getitem__(self, offset) -> mitmproxy.flow.Flow:
return self._view[self._rev(offset)]
# Reflect some methods to the efficient underlying implementation
@@ -182,7 +177,7 @@ class View(collections.Sequence):
def index(self, f: mitmproxy.flow.Flow, start: int = 0, stop: typing.Optional[int] = None) -> int:
return self._rev(self._view.index(f, start, stop))
def __contains__(self, f: typing.Any) -> bool:
def __contains__(self, f: mitmproxy.flow.Flow) -> bool:
return self._view.__contains__(f)
def _order_key_name(self):
@@ -202,36 +197,7 @@ class View(collections.Sequence):
self.sig_view_refresh.send(self)
# API
@command.command("view.focus.next")
def focus_next(self) -> None:
"""
Set focus to the next flow.
"""
idx = self.focus.index + 1
if self.inbounds(idx):
self.focus.flow = self[idx]
@command.command("view.focus.prev")
def focus_prev(self) -> None:
"""
Set focus to the previous flow.
"""
idx = self.focus.index - 1
if self.inbounds(idx):
self.focus.flow = self[idx]
@command.command("view.order.options")
def order_options(self) -> typing.Sequence[str]:
"""
A list of all the orders we support.
"""
return list(sorted(self.orders.keys()))
@command.command("view.marked.toggle")
def toggle_marked(self) -> None:
"""
Toggle whether to show marked views only.
"""
def toggle_marked(self):
self.show_marked = not self.show_marked
self._refilter()
@@ -255,7 +221,7 @@ class View(collections.Sequence):
self.filter = flt or matchall
self._refilter()
def clear(self) -> None:
def clear(self):
"""
Clears both the store and view.
"""
@@ -275,19 +241,55 @@ class View(collections.Sequence):
self._refilter()
self.sig_store_refresh.send(self)
def add(self, flows: typing.Sequence[mitmproxy.flow.Flow]) -> None:
def add(self, f: mitmproxy.flow.Flow) -> bool:
"""
Adds a flow to the state. If the flow already exists, it is
ignored.
"""
for f in flows:
if f.id not in self._store:
self._store[f.id] = f
if self.filter(f):
if f.id not in self._store:
self._store[f.id] = f
if self.filter(f):
self._base_add(f)
if self.focus_follow:
self.focus.flow = f
self.sig_view_add.send(self, flow=f)
def remove(self, f: mitmproxy.flow.Flow):
"""
Removes the flow from the underlying store and the view.
"""
if f.id in self._store:
if f in self._view:
self._view.remove(f)
self.sig_view_remove.send(self, flow=f)
del self._store[f.id]
self.sig_store_remove.send(self, flow=f)
def update(self, f: mitmproxy.flow.Flow):
"""
Updates a flow. If the flow is not in the state, it's ignored.
"""
if f.id in self._store:
if self.filter(f):
if f not in self._view:
self._base_add(f)
if self.focus_follow:
self.focus.flow = f
self.sig_view_add.send(self, flow=f)
else:
# This is a tad complicated. The sortedcontainers
# implementation assumes that the order key is stable. If
# it changes mid-way Very Bad Things happen. We detect when
# this happens, and refresh the item.
self.order_key.refresh(f)
self.sig_view_update.send(self, flow=f)
else:
try:
self._view.remove(f)
self.sig_view_remove.send(self, flow=f)
except ValueError:
# The value was not in the view
pass
def get_by_id(self, flow_id: str) -> typing.Optional[mitmproxy.flow.Flow]:
"""
@@ -296,199 +298,48 @@ class View(collections.Sequence):
"""
return self._store.get(flow_id)
@command.command("view.getval")
def getvalue(self, f: mitmproxy.flow.Flow, key: str, default: str) -> str:
"""
Get a value from the settings store for the specified flow.
"""
return self.settings[f].get(key, default)
@command.command("view.setval.toggle")
def setvalue_toggle(
self,
flows: typing.Sequence[mitmproxy.flow.Flow],
key: str
) -> None:
"""
Toggle a boolean value in the settings store, setting the value to
the string "true" or "false".
"""
updated = []
for f in flows:
current = self.settings[f].get(key, "false")
self.settings[f][key] = "false" if current == "true" else "true"
updated.append(f)
ctx.master.addons.trigger("update", updated)
@command.command("view.setval")
def setvalue(
self,
flows: typing.Sequence[mitmproxy.flow.Flow],
key: str, value: str
) -> None:
"""
Set a value in the settings store for the specified flows.
"""
updated = []
for f in flows:
self.settings[f][key] = value
updated.append(f)
ctx.master.addons.trigger("update", updated)
@command.command("view.load")
def load_file(self, path: str) -> None:
"""
Load flows into the view, without processing them with addons.
"""
for i in io.FlowReader(open(path, "rb")).stream():
# Do this to get a new ID, so we can load the same file N times and
# get new flows each time. It would be more efficient to just have a
# .newid() method or something.
self.add([i.copy()])
@command.command("view.go")
def go(self, dst: int) -> None:
"""
Go to a specified offset. Positive offsets are from the beginning of
the view, negative from the end of the view, so that 0 is the first
flow, -1 is the last flow.
"""
if len(self) == 0:
return
if dst < 0:
dst = len(self) + dst
if dst < 0:
dst = 0
if dst > len(self) - 1:
dst = len(self) - 1
self.focus.flow = self[dst]
@command.command("view.duplicate")
def duplicate(self, flows: typing.Sequence[mitmproxy.flow.Flow]) -> None:
"""
Duplicates the specified flows, and sets the focus to the first
duplicate.
"""
dups = [f.copy() for f in flows]
if dups:
self.add(dups)
self.focus.flow = dups[0]
ctx.log.alert("Duplicated %s flows" % len(dups))
@command.command("view.remove")
def remove(self, flows: typing.Sequence[mitmproxy.flow.Flow]) -> None:
"""
Removes the flow from the underlying store and the view.
"""
for f in flows:
if f.id in self._store:
if f.killable:
f.kill()
if f in self._view:
self._view.remove(f)
self.sig_view_remove.send(self, flow=f)
del self._store[f.id]
self.sig_store_remove.send(self, flow=f)
@command.command("view.resolve")
def resolve(self, spec: str) -> typing.Sequence[mitmproxy.flow.Flow]:
"""
Resolve a flow list specification to an actual list of flows.
"""
if spec == "@all":
return [i for i in self._store.values()]
if spec == "@focus":
return [self.focus.flow] if self.focus.flow else []
elif spec == "@shown":
return [i for i in self]
elif spec == "@hidden":
return [i for i in self._store.values() if i not in self._view]
elif spec == "@marked":
return [i for i in self._store.values() if i.marked]
elif spec == "@unmarked":
return [i for i in self._store.values() if not i.marked]
else:
filt = flowfilter.parse(spec)
if not filt:
raise exceptions.CommandError("Invalid flow filter: %s" % spec)
return [i for i in self._store.values() if filt(i)]
@command.command("view.create")
def create(self, method: str, url: str) -> None:
req = http.HTTPRequest.make(method.upper(), url)
c = connections.ClientConnection.make_dummy(("", 0))
s = connections.ServerConnection.make_dummy((req.host, req.port))
f = http.HTTPFlow(c, s)
f.request = req
f.request.headers["Host"] = req.host
self.add([f])
# Event handlers
def configure(self, updated):
if "view_filter" in updated:
def configure(self, opts, updated):
if "filter" in updated:
filt = None
if ctx.options.view_filter:
filt = flowfilter.parse(ctx.options.view_filter)
if opts.filter:
filt = flowfilter.parse(opts.filter)
if not filt:
raise exceptions.OptionsError(
"Invalid interception filter: %s" % ctx.options.view_filter
"Invalid interception filter: %s" % opts.filter
)
self.set_filter(filt)
if "console_order" in updated:
if ctx.options.console_order not in self.orders:
raise exceptions.OptionsError(
"Unknown flow order: %s" % ctx.options.console_order
)
self.set_order(self.orders[ctx.options.console_order])
if opts.console_order is None:
self.set_order(self.default_order)
else:
if opts.console_order not in self.orders:
raise exceptions.OptionsError(
"Unknown flow order: %s" % opts.console_order
)
self.set_order(self.orders[opts.console_order])
if "console_order_reversed" in updated:
self.set_reversed(ctx.options.console_order_reversed)
self.set_reversed(opts.console_order_reversed)
if "console_focus_follow" in updated:
self.focus_follow = ctx.options.console_focus_follow
self.focus_follow = opts.console_focus_follow
def request(self, f):
self.add([f])
self.add(f)
def error(self, f):
self.update([f])
self.update(f)
def response(self, f):
self.update([f])
self.update(f)
def intercept(self, f):
self.update([f])
self.update(f)
def resume(self, f):
self.update([f])
self.update(f)
def kill(self, f):
self.update([f])
def update(self, flows: typing.Sequence[mitmproxy.flow.Flow]) -> None:
"""
Updates a list of flows. If flow is not in the state, it's ignored.
"""
for f in flows:
if f.id in self._store:
if self.filter(f):
if f not in self._view:
self._base_add(f)
if self.focus_follow:
self.focus.flow = f
self.sig_view_add.send(self, flow=f)
else:
# This is a tad complicated. The sortedcontainers
# implementation assumes that the order key is stable. If
# it changes mid-way Very Bad Things happen. We detect when
# this happens, and refresh the item.
self.order_key.refresh(f)
self.sig_view_update.send(self, flow=f)
else:
try:
self._view.remove(f)
self.sig_view_remove.send(self, flow=f)
except ValueError:
# The value was not in the view
pass
self.update(f)
class Focus:
@@ -520,7 +371,6 @@ class Focus:
def index(self) -> typing.Optional[int]:
if self.flow:
return self.view.index(self.flow)
return None
@index.setter
def index(self, idx):
@@ -554,7 +404,7 @@ class Focus:
class Settings(collections.Mapping):
def __init__(self, view: View) -> None:
self.view = view
self._values = {} # type: typing.MutableMapping[str, typing.Dict]
self._values = {} # type: typing.MutableMapping[str, mitmproxy.flow.Flow]
view.sig_store_remove.connect(self._sig_store_remove)
view.sig_store_refresh.connect(self._sig_store_refresh)
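The comments in `View.update` above note that the sorted-container implementation assumes a stable sort key, so a changed key must be handled by removing and re-inserting the item. A minimal sketch of that refresh trick using `bisect` in place of `sortedcontainers`:

```python
import bisect

class SortedView:
    """Keep items sorted by a key that may change; refresh() removes and
    re-inserts an item around a key change, the same trick _OrderKey.refresh
    relies on."""
    def __init__(self, key):
        self.key = key
        self._items = []

    def add(self, item):
        keys = [self.key(i) for i in self._items]
        self._items.insert(bisect.bisect_right(keys, self.key(item)), item)

    def refresh(self, item, mutate):
        self._items.remove(item)  # remove while the old key still locates it
        mutate(item)              # now it is safe to change the key
        self.add(item)            # re-insert under the new key
```

Mutating the key while the item is still inside the sorted structure would leave it unfindable, which is exactly the "Very Bad Things" the original comment warns about.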


@@ -13,10 +13,6 @@ class WSGIApp:
def __init__(self, app, host, port):
self.app, self.host, self.port = app, host, port
@property
def name(self):
return "wsgiapp:%s:%s" % (self.host, self.port)
def serve(self, app, flow):
"""
Serves app on flow, and prevents further handling of the flow.


@@ -93,9 +93,9 @@ def dummy_cert(privkey, cacert, commonname, sans):
try:
ipaddress.ip_address(i.decode("ascii"))
except ValueError:
ss.append(b"DNS:%s" % i)
ss.append(b"DNS: %s" % i)
else:
ss.append(b"IP:%s" % i)
ss.append(b"IP: %s" % i)
ss = b", ".join(ss)
cert = OpenSSL.crypto.X509()
@@ -356,14 +356,14 @@ class CertStore:
class _GeneralName(univ.Choice):
# We only care about dNSName and iPAddress
# We are only interested in dNSNames. We use a default handler to ignore
# other types.
# TODO: We should also handle iPAddresses.
componentType = namedtype.NamedTypes(
namedtype.NamedType('dNSName', char.IA5String().subtype(
implicitTag=tag.Tag(tag.tagClassContext, tag.tagFormatSimple, 2)
)),
namedtype.NamedType('iPAddress', univ.OctetString().subtype(
implicitTag=tag.Tag(tag.tagClassContext, tag.tagFormatSimple, 7)
)),
)
),
)
@@ -384,6 +384,9 @@ class SSLCert(serializable.Serializable):
def __eq__(self, other):
return self.digest("sha256") == other.digest("sha256")
def __ne__(self, other):
return not self.__eq__(other)
def get_state(self):
return self.to_pem()
@@ -474,10 +477,5 @@ class SSLCert(serializable.Serializable):
except PyAsn1Error:
continue
for i in dec[0]:
if i[0] is None and isinstance(i[1], univ.OctetString) and not isinstance(i[1], char.IA5String):
# This would give back the IP address: b'.'.join([str(e).encode() for e in i[1].asNumbers()])
continue
else:
e = i[0].asOctets()
altnames.append(e)
altnames.append(i[0].asOctets())
return altnames
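`dummy_cert` above classifies each subjectAltName entry as `IP:` or `DNS:` by attempting to parse it as an address. The same check in isolation, on str rather than bytes:

```python
import ipaddress

def format_sans(names):
    """Build an OpenSSL-style subjectAltName string, tagging each entry
    as IP: or DNS: depending on whether it parses as an address."""
    parts = []
    for name in names:
        try:
            ipaddress.ip_address(name)
        except ValueError:
            parts.append("DNS:%s" % name)  # not an address: a hostname SAN
        else:
            parts.append("IP:%s" % name)
    return ", ".join(parts)
```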


@@ -1,195 +0,0 @@
"""
This module manages and invokes typed commands.
"""
import inspect
import typing
import shlex
import textwrap
import functools
import sys
from mitmproxy.utils import typecheck
from mitmproxy import exceptions
from mitmproxy import flow
Cuts = typing.Sequence[
typing.Sequence[typing.Union[str, bytes]]
]
def typename(t: type, ret: bool) -> str:
"""
Translates a type to an explanatory string. If ret is True, we're
looking at a return type, else we're looking at a parameter type.
"""
if issubclass(t, (str, int, bool)):
return t.__name__
elif t == typing.Sequence[flow.Flow]:
return "[flow]" if ret else "flowspec"
elif t == typing.Sequence[str]:
return "[str]"
elif t == Cuts:
return "[cuts]" if ret else "cutspec"
elif t == flow.Flow:
return "flow"
else: # pragma: no cover
raise NotImplementedError(t)
class Command:
def __init__(self, manager, path, func) -> None:
self.path = path
self.manager = manager
self.func = func
sig = inspect.signature(self.func)
self.help = None
if func.__doc__:
txt = func.__doc__.strip()
self.help = "\n".join(textwrap.wrap(txt))
self.has_positional = False
for i in sig.parameters.values():
# This is the kind for *args parameters
if i.kind == i.VAR_POSITIONAL:
self.has_positional = True
self.paramtypes = [v.annotation for v in sig.parameters.values()]
self.returntype = sig.return_annotation
def paramnames(self) -> typing.Sequence[str]:
v = [typename(i, False) for i in self.paramtypes]
if self.has_positional:
v[-1] = "*" + v[-1][1:-1]
return v
def retname(self) -> str:
return typename(self.returntype, True) if self.returntype else ""
def signature_help(self) -> str:
params = " ".join(self.paramnames())
ret = self.retname()
if ret:
ret = " -> " + ret
return "%s %s%s" % (self.path, params, ret)
def call(self, args: typing.Sequence[str]):
"""
Call the command with a set of arguments. At this point, all arguments are strings.
"""
if not self.has_positional and (len(self.paramtypes) != len(args)):
raise exceptions.CommandError("Usage: %s" % self.signature_help())
remainder = [] # type: typing.Sequence[str]
if self.has_positional:
remainder = args[len(self.paramtypes) - 1:]
args = args[:len(self.paramtypes) - 1]
pargs = []
for i in range(len(args)):
if typecheck.check_command_type(args[i], self.paramtypes[i]):
pargs.append(args[i])
else:
pargs.append(parsearg(self.manager, args[i], self.paramtypes[i]))
if remainder:
if typecheck.check_command_type(remainder, self.paramtypes[-1]):
pargs.extend(remainder)
else:
raise exceptions.CommandError("Invalid value type.")
with self.manager.master.handlecontext():
ret = self.func(*pargs)
if not typecheck.check_command_type(ret, self.returntype):
raise exceptions.CommandError("Command returned unexpected data")
return ret
class CommandManager:
def __init__(self, master):
self.master = master
self.commands = {}
def collect_commands(self, addon):
for i in dir(addon):
if not i.startswith("__"):
o = getattr(addon, i)
if hasattr(o, "command_path"):
self.add(o.command_path, o)
def add(self, path: str, func: typing.Callable):
self.commands[path] = Command(self, path, func)
def call_args(self, path, args):
"""
Call a command using a list of string arguments. May raise CommandError.
"""
if path not in self.commands:
raise exceptions.CommandError("Unknown command: %s" % path)
return self.commands[path].call(args)
def call(self, cmdstr: str):
"""
Call a command using a string. May raise CommandError.
"""
parts = shlex.split(cmdstr)
if not len(parts) >= 1:
raise exceptions.CommandError("Invalid command: %s" % cmdstr)
return self.call_args(parts[0], parts[1:])
def dump(self, out=sys.stdout) -> None:
cmds = list(self.commands.values())
cmds.sort(key=lambda x: x.signature_help())
for c in cmds:
for hl in (c.help or "").splitlines():
print("# " + hl, file=out)
print(c.signature_help(), file=out)
print(file=out)
def parsearg(manager: CommandManager, spec: str, argtype: type) -> typing.Any:
"""
Convert a string argument to the appropriate type.
"""
if issubclass(argtype, str):
return spec
elif argtype == bool:
if spec == "true":
return True
elif spec == "false":
return False
else:
raise exceptions.CommandError(
"Booleans are 'true' or 'false', got %s" % spec
)
elif issubclass(argtype, int):
try:
return int(spec)
except ValueError as e:
raise exceptions.CommandError("Expected an integer, got %s." % spec)
elif argtype == typing.Sequence[flow.Flow]:
return manager.call_args("view.resolve", [spec])
elif argtype == Cuts:
return manager.call_args("cut", [spec])
elif argtype == flow.Flow:
flows = manager.call_args("view.resolve", [spec])
if len(flows) != 1:
raise exceptions.CommandError(
"Command requires one flow, specification matched %s." % len(flows)
)
return flows[0]
elif argtype == typing.Sequence[str]:
return [i.strip() for i in spec.split(",")]
else:
raise exceptions.CommandError("Unsupported argument type: %s" % argtype)
def command(path):
def decorator(function):
@functools.wraps(function)
def wrapper(*args, **kwargs):
return function(*args, **kwargs)
wrapper.__dict__["command_path"] = path
return wrapper
return decorator
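The command() decorator above only tags the wrapped function with a command_path attribute, which collect_commands() later discovers via dir() and getattr(). A minimal self-contained sketch of the pattern (the Addon class and the "myaddon.greet" path are hypothetical, for illustration only):

```python
import functools

def command(path):
    # Same pattern as the decorator above: wrap the function and
    # record its command path so a manager can discover it later.
    def decorator(function):
        @functools.wraps(function)
        def wrapper(*args, **kwargs):
            return function(*args, **kwargs)
        wrapper.__dict__["command_path"] = path
        return wrapper
    return decorator

class Addon:
    # Hypothetical addon exposing one command.
    @command("myaddon.greet")
    def greet(self, name: str) -> str:
        return "hello " + name

addon = Addon()
assert addon.greet("world") == "hello world"
assert addon.greet.command_path == "myaddon.greet"
```

Because the marker lives in the function's __dict__, it survives method binding, which is what lets collect_commands() find it on addon instances.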

View File

@@ -1,7 +1,6 @@
import time
import os
import uuid
from mitmproxy import stateobject
from mitmproxy import certs
@@ -18,7 +17,6 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
address: Remote address
ssl_established: True if TLS is established, False otherwise
clientcert: The TLS client certificate
mitmcert: The MITM'ed TLS server certificate presented to the client
timestamp_start: Connection start timestamp
timestamp_ssl_setup: TLS established timestamp
timestamp_end: Connection end timestamp
@@ -42,8 +40,6 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
self.clientcert = None
self.ssl_established = None
self.id = str(uuid.uuid4())
self.mitmcert = None
self.timestamp_start = time.time()
self.timestamp_end = None
self.timestamp_ssl_setup = None
@@ -56,47 +52,26 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
return bool(self.connection) and not self.finished
def __repr__(self):
if self.ssl_established:
tls = "[{}] ".format(self.tls_version)
else:
tls = ""
if self.alpn_proto_negotiated:
alpn = "[ALPN: {}] ".format(
strutils.bytes_to_escaped_str(self.alpn_proto_negotiated)
)
else:
alpn = ""
return "<ClientConnection: {tls}{alpn}{host}:{port}>".format(
tls=tls,
return "<ClientConnection: {ssl}{alpn}{address}>".format(
ssl="[ssl] " if self.ssl_established else "",
alpn=alpn,
host=self.address[0],
port=self.address[1],
address=repr(self.address)
)
def __eq__(self, other):
if isinstance(other, ClientConnection):
return self.id == other.id
return False
def __hash__(self):
return hash(self.id)
@property
def tls_established(self):
return self.ssl_established
@tls_established.setter
def tls_established(self, value):
self.ssl_established = value
_stateobject_attributes = dict(
id=str,
address=tuple,
address=tcp.Address,
ssl_established=bool,
clientcert=certs.SSLCert,
mitmcert=certs.SSLCert,
timestamp_start=float,
timestamp_ssl_setup=float,
timestamp_end=float,
@@ -121,10 +96,8 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
@classmethod
def make_dummy(cls, address):
return cls.from_state(dict(
id=str(uuid.uuid4()),
address=address,
address=dict(address=address, use_ipv6=False),
clientcert=None,
mitmcert=None,
ssl_established=False,
timestamp_start=None,
timestamp_end=None,
@@ -135,10 +108,9 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
tls_version=None,
))
def convert_to_ssl(self, cert, *args, **kwargs):
super().convert_to_ssl(cert, *args, **kwargs)
def convert_to_ssl(self, *args, **kwargs):
super().convert_to_ssl(*args, **kwargs)
self.timestamp_ssl_setup = time.time()
self.mitmcert = cert
sni = self.connection.get_servername()
if sni:
self.sni = sni.decode("idna")
@@ -166,7 +138,6 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
cert: The certificate presented by the remote during the TLS handshake
sni: Server Name Indication sent by the proxy during the TLS handshake
alpn_proto_negotiated: The negotiated application protocol
tls_version: TLS version
via: The underlying server connection (e.g. the connection to the upstream proxy in upstream proxy mode)
timestamp_start: Connection start timestamp
timestamp_tcp_setup: TCP ACK received timestamp
@@ -177,9 +148,7 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
def __init__(self, address, source_address=None, spoof_source_address=None):
tcp.TCPClient.__init__(self, address, source_address, spoof_source_address)
self.id = str(uuid.uuid4())
self.alpn_proto_negotiated = None
self.tls_version = None
self.via = None
self.timestamp_start = None
self.timestamp_end = None
@@ -191,50 +160,35 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
def __repr__(self):
if self.ssl_established and self.sni:
tls = "[{}: {}] ".format(self.tls_version or "TLS", self.sni)
ssl = "[ssl: {0}] ".format(self.sni)
elif self.ssl_established:
tls = "[{}] ".format(self.tls_version or "TLS")
ssl = "[ssl] "
else:
tls = ""
ssl = ""
if self.alpn_proto_negotiated:
alpn = "[ALPN: {}] ".format(
strutils.bytes_to_escaped_str(self.alpn_proto_negotiated)
)
else:
alpn = ""
return "<ServerConnection: {tls}{alpn}{host}:{port}>".format(
tls=tls,
return "<ServerConnection: {ssl}{alpn}{address}>".format(
ssl=ssl,
alpn=alpn,
host=self.address[0],
port=self.address[1],
address=repr(self.address)
)
def __eq__(self, other):
if isinstance(other, ServerConnection):
return self.id == other.id
return False
def __hash__(self):
return hash(self.id)
@property
def tls_established(self):
return self.ssl_established
@tls_established.setter
def tls_established(self, value):
self.ssl_established = value
_stateobject_attributes = dict(
id=str,
address=tuple,
ip_address=tuple,
source_address=tuple,
address=tcp.Address,
ip_address=tcp.Address,
source_address=tcp.Address,
ssl_established=bool,
cert=certs.SSLCert,
sni=str,
alpn_proto_negotiated=bytes,
tls_version=str,
timestamp_start=float,
timestamp_tcp_setup=float,
timestamp_ssl_setup=float,
@@ -250,14 +204,12 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
@classmethod
def make_dummy(cls, address):
return cls.from_state(dict(
id=str(uuid.uuid4()),
address=address,
ip_address=address,
address=dict(address=address, use_ipv6=False),
ip_address=dict(address=address, use_ipv6=False),
cert=None,
sni=None,
alpn_proto_negotiated=None,
tls_version=None,
source_address=('', 0),
source_address=dict(address=('', 0), use_ipv6=False),
ssl_established=False,
timestamp_start=None,
timestamp_tcp_setup=None,
@@ -287,14 +239,13 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
else:
path = os.path.join(
clientcerts,
self.address[0].encode("idna").decode()) + ".pem"
self.address.host.encode("idna").decode()) + ".pem"
if os.path.exists(path):
clientcert = path
self.convert_to_ssl(cert=clientcert, sni=sni, **kwargs)
self.sni = sni
self.alpn_proto_negotiated = self.get_alpn_proto_negotiated()
self.tls_version = self.connection.get_protocol_version_name()
self.timestamp_ssl_setup = time.time()
def finish(self):

View File

@@ -36,14 +36,12 @@ def get(name: str) -> Optional[View]:
for i in views:
if i.name.lower() == name.lower():
return i
return None
def get_by_shortcut(c: str) -> Optional[View]:
for i in views:
if i.prompt[1] == c:
return i
return None
def add(view: View) -> None:

View File

@@ -37,7 +37,7 @@ def parse_gif(data: bytes) -> Metadata:
descriptor = img.logical_screen_descriptor
parts = [
('Format', 'Compuserve GIF'),
('Version', "GIF{}".format(img.hdr.version)),
('Version', "GIF{}".format(img.header.version.decode('ASCII'))),
('Size', "{} x {} px".format(descriptor.screen_width, descriptor.screen_height)),
('background', str(descriptor.bg_color_index))
]
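The only behavioral difference in this hunk is where the GIF version bytes get decoded: one side of the generated parser exposes version as raw bytes and the caller decodes it, the other decodes inside the parser. The formatting itself amounts to:

```python
# The three bytes following the b"GIF" magic hold the version ("87a" or "89a").
version = b"89a"  # as returned by read_bytes(3)
assert "GIF{}".format(version.decode("ASCII")) == "GIF89a"
```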

View File

@@ -128,7 +128,6 @@ def is_inline_text(a: Token, b: Token, c: Token) -> bool:
if isinstance(a, Tag) and isinstance(b, Text) and isinstance(c, Tag):
if a.is_opening and "\n" not in b.data and c.is_closing and a.tag == c.tag:
return True
return False
def is_inline(prev2: Token, prev1: Token, t: Token, next1: Token, next2: Token) -> bool:
@@ -141,7 +140,6 @@ def is_inline(prev2: Token, prev1: Token, t: Token, next1: Token, next2: Token)
return True # <div></div> (start tag)
if isinstance(prev1, Tag) and prev1.is_opening and t.is_closing and prev1.tag == t.tag:
return True # <div></div> (end tag)
return False
class ElementStack:

View File

@@ -1,20 +1,16 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was exif.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/24e2d00048b8084ceec30a187a79cb87a79a48ba/image/exif.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
from .exif_le import ExifLe
from .exif_be import ExifBe
class Exif(KaitaiStruct):
def __init__(self, _io, _parent=None, _root=None):
self._io = _io

View File

@@ -1,17 +1,14 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was exif_be.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/24e2d00048b8084ceec30a187a79cb87a79a48ba/image/exif_be.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
class ExifBe(KaitaiStruct):
def __init__(self, _io, _parent=None, _root=None):
self._io = _io
@@ -572,5 +569,3 @@ class ExifBe(KaitaiStruct):
self._m_ifd0 = self._root.Ifd(self._io, self, self._root)
self._io.seek(_pos)
return self._m_ifd0 if hasattr(self, '_m_ifd0') else None

View File

@@ -1,17 +1,14 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was exif_le.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/24e2d00048b8084ceec30a187a79cb87a79a48ba/image/exif_le.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
class ExifLe(KaitaiStruct):
def __init__(self, _io, _parent=None, _root=None):
self._io = _io
@@ -572,5 +569,3 @@ class ExifLe(KaitaiStruct):
self._m_ifd0 = self._root.Ifd(self._io, self, self._root)
self._io.seek(_pos)
return self._m_ifd0 if hasattr(self, '_m_ifd0') else None

View File

@@ -1,17 +1,14 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was png.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/562154250bea0081fed4e232751b934bc270a0c7/image/gif.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
class Gif(KaitaiStruct):
class BlockType(Enum):
@@ -27,8 +24,8 @@ class Gif(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.hdr = self._root.Header(self._io, self, self._root)
self.logical_screen_descriptor = self._root.LogicalScreenDescriptorStruct(self._io, self, self._root)
self.header = self._root.Header(self._io, self, self._root)
self.logical_screen_descriptor = self._root.LogicalScreenDescriptor(self._io, self, self._root)
if self.logical_screen_descriptor.has_color_table:
self._raw_global_color_table = self._io.read_bytes((self.logical_screen_descriptor.color_table_size * 3))
io = KaitaiStream(BytesIO(self._raw_global_color_table))
@@ -58,7 +55,7 @@ class Gif(KaitaiStruct):
self.blue = self._io.read_u1()
class LogicalScreenDescriptorStruct(KaitaiStruct):
class LogicalScreenDescriptor(KaitaiStruct):
def __init__(self, _io, _parent=None, _root=None):
self._io = _io
self._parent = _parent
@@ -166,7 +163,7 @@ class Gif(KaitaiStruct):
self._parent = _parent
self._root = _root if _root else self
self.magic = self._io.ensure_fixed_contents(struct.pack('3b', 71, 73, 70))
self.version = (self._io.read_bytes(3)).decode(u"ASCII")
self.version = self._io.read_bytes(3)
class ExtGraphicControl(KaitaiStruct):
@@ -248,6 +245,3 @@ class Gif(KaitaiStruct):
self.body = self._root.ExtGraphicControl(self._io, self, self._root)
else:
self.body = self._root.Subblocks(self._io, self, self._root)

View File

@@ -1,19 +1,15 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was jpeg.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/24e2d00048b8084ceec30a187a79cb87a79a48ba/image/jpeg.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
from .exif import Exif
class Jpeg(KaitaiStruct):
class ComponentId(Enum):
@@ -131,7 +127,7 @@ class Jpeg(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.magic = (self._io.read_bytes_term(0, False, True, True)).decode(u"ASCII")
self.magic = self._io.read_strz("ASCII", 0, False, True, True)
_on = self.magic
if _on == u"Exif":
self.body = self._root.ExifInJpeg(self._io, self, self._root)
@@ -199,7 +195,7 @@ class Jpeg(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.magic = (self._io.read_bytes(5)).decode(u"ASCII")
self.magic = self._io.read_str_byte_limit(5, "ASCII")
self.version_major = self._io.read_u1()
self.version_minor = self._io.read_u1()
self.density_units = self._root.SegmentApp0.DensityUnit(self._io.read_u1())

View File

@@ -1,10 +0,0 @@
#!/usr/bin/env bash
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/exif_be.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/exif_le.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/exif.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/gif.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/jpeg.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/png.ksy
kaitai-struct-compiler --target python --opaque-types=true *.ksy

View File

@@ -1,17 +1,14 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was png.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/9370c720b7d2ad329102d89bdc880ba6a706ef26/image/png.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
class Png(KaitaiStruct):
class ColorType(Enum):
@@ -54,7 +51,7 @@ class Png(KaitaiStruct):
self._parent = _parent
self._root = _root if _root else self
self.len = self._io.read_u4be()
self.type = (self._io.read_bytes(4)).decode(u"UTF-8")
self.type = self._io.read_str_byte_limit(4, "UTF-8")
_on = self.type
if _on == u"iTXt":
self._raw_body = self._io.read_bytes(self.len)
@@ -197,7 +194,7 @@ class Png(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.keyword = (self._io.read_bytes_term(0, False, True, True)).decode(u"UTF-8")
self.keyword = self._io.read_strz("UTF-8", 0, False, True, True)
self.compression_method = self._io.read_u1()
self._raw_text_datastream = self._io.read_bytes_full()
self.text_datastream = zlib.decompress(self._raw_text_datastream)
@@ -262,12 +259,12 @@ class Png(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.keyword = (self._io.read_bytes_term(0, False, True, True)).decode(u"UTF-8")
self.keyword = self._io.read_strz("UTF-8", 0, False, True, True)
self.compression_flag = self._io.read_u1()
self.compression_method = self._io.read_u1()
self.language_tag = (self._io.read_bytes_term(0, False, True, True)).decode(u"ASCII")
self.translated_keyword = (self._io.read_bytes_term(0, False, True, True)).decode(u"UTF-8")
self.text = (self._io.read_bytes_full()).decode(u"UTF-8")
self.language_tag = self._io.read_strz("ASCII", 0, False, True, True)
self.translated_keyword = self._io.read_strz("UTF-8", 0, False, True, True)
self.text = self._io.read_str_eos("UTF-8")
class TextChunk(KaitaiStruct):
@@ -275,8 +272,8 @@ class Png(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.keyword = (self._io.read_bytes_term(0, False, True, True)).decode(u"iso8859-1")
self.text = (self._io.read_bytes_full()).decode(u"iso8859-1")
self.keyword = self._io.read_strz("iso8859-1", 0, False, True, True)
self.text = self._io.read_str_eos("iso8859-1")
class TimeChunk(KaitaiStruct):
@@ -290,6 +287,3 @@ class Png(KaitaiStruct):
self.hour = self._io.read_u1()
self.minute = self._io.read_u1()
self.second = self._io.read_u1()

View File

@@ -41,9 +41,9 @@ all other strings are returned as plain bytes.
"""
import collections
import typing
from typing import io, Union, Tuple
TSerializable = typing.Union[None, str, bool, int, float, bytes, list, tuple, dict]
TSerializable = Union[None, bool, int, float, bytes, list, tuple, dict]
def dumps(value: TSerializable) -> bytes:
@@ -53,12 +53,12 @@ def dumps(value: TSerializable) -> bytes:
# This uses a deque to collect output fragments in reverse order,
# then joins them together at the end. It's measurably faster
# than creating all the intermediate strings.
q = collections.deque() # type: collections.deque
q = collections.deque()
_rdumpq(q, 0, value)
return b''.join(q)
def dump(value: TSerializable, file_handle: typing.BinaryIO) -> None:
def dump(value: TSerializable, file_handle: io.BinaryIO) -> None:
"""
This function dumps a python object as a tnetstring and
writes it to the given file.
@@ -156,7 +156,7 @@ def loads(string: bytes) -> TSerializable:
return pop(string)[0]
def load(file_handle: typing.BinaryIO) -> TSerializable:
def load(file_handle: io.BinaryIO) -> TSerializable:
"""load(file) -> object
This function reads a tnetstring from a file and parses it into a
@@ -213,19 +213,19 @@ def parse(data_type: int, data: bytes) -> TSerializable:
l = []
while data:
item, data = pop(data)
l.append(item) # type: ignore
l.append(item)
return l
if data_type == ord(b'}'):
d = {}
while data:
key, data = pop(data)
val, data = pop(data)
d[key] = val # type: ignore
d[key] = val
return d
raise ValueError("unknown type tag: {}".format(data_type))
def pop(data: bytes) -> typing.Tuple[TSerializable, bytes]:
def pop(data: bytes) -> Tuple[TSerializable, bytes]:
"""
This function parses a tnetstring into a python object.
It returns a tuple giving the parsed object and a string
@@ -233,8 +233,8 @@ def pop(data: bytes) -> typing.Tuple[TSerializable, bytes]:
"""
# Parse out data length, type and remaining string.
try:
blength, data = data.split(b':', 1)
length = int(blength)
length, data = data.split(b':', 1)
length = int(length)
except ValueError:
raise ValueError("not a tnetstring: missing or invalid length prefix: {}".format(data))
try:

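The wire format handled above is simple: every value is serialized as its payload length, a colon, the payload, and a one-byte type tag. A minimal sketch of the serializer for a few of the types (# for ints, , for bytes, ] for lists, ! for bools); this is a re-statement of the format, not the module's actual code:

```python
def tnet_dumps(value) -> bytes:
    # Minimal sketch of the tnetstring encoding: b"<len>:<payload><tag>".
    if isinstance(value, bool):  # bool must be checked before int
        payload, tag = (b"true" if value else b"false"), b"!"
    elif isinstance(value, int):
        payload, tag = str(value).encode(), b"#"
    elif isinstance(value, bytes):
        payload, tag = value, b","
    elif isinstance(value, (list, tuple)):
        payload, tag = b"".join(tnet_dumps(i) for i in value), b"]"
    else:
        raise TypeError("unsupported type: %r" % type(value))
    return str(len(payload)).encode() + b":" + payload + tag

assert tnet_dumps(12345) == b"5:12345#"
assert tnet_dumps(b"hello") == b"5:hello,"
assert tnet_dumps([b"a", 1]) == b"8:1:a,1:1#]"
```

The length prefix is what pop() above splits on first, which is why a missing or non-numeric prefix raises "not a tnetstring".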
View File

@@ -41,7 +41,7 @@ class ASCommandResponse:
raise ValueError("Empty WBXML body passed")
except Exception as e:
self.xmlString = None
raise ValueError("Error: {0}".format(e))
raise ValueError("Error: {0}".format(e.message))
def getWBXMLBytes(self):
return self.wbxmlBytes

View File

@@ -40,7 +40,7 @@ class ASWBXMLByteQueue(Queue):
Queue.__init__(self)
for byte in wbxmlBytes:
self.put(byte)
self.put(ord(byte))
self.bytesEnqueued += 1

View File

@@ -39,12 +39,12 @@ class ASWBXMLCodePage:
self.tagLookup[tag] = token
def getToken(self, tag):
if tag in self.tagLookup:
if self.tagLookup.has_key(tag):
return self.tagLookup[tag]
return 0xFF
def getTag(self, token):
if token in self.tokenLookup:
if self.tokenLookup.has_key(token):
return self.tokenLookup[token]
return None
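One side of this hunk uses dict.has_key(), which exists only in Python 2; in Python 3 the same membership test is spelled with the in operator:

```python
token_lookup = {0x05: "Sync"}  # illustrative WBXML code page entry
# Python 3 membership test; dict.has_key() was removed in Python 3.
assert 0x05 in token_lookup
assert 0xFF not in token_lookup
```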

View File

@@ -1,3 +1,4 @@
import functools
import queue
from mitmproxy import exceptions
@@ -13,8 +14,8 @@ class Channel:
def ask(self, mtype, m):
"""
Decorate a message with a reply attribute, and send it to the master.
Then wait for a response.
Decorate a message with a reply attribute, and send it to the
master. Then wait for a response.
Raises:
exceptions.Kill: All connections should be closed immediately.
@@ -35,42 +36,84 @@ class Channel:
def tell(self, mtype, m):
"""
Decorate a message with a dummy reply attribute, send it to the master,
then return immediately.
Decorate a message with a dummy reply attribute, send it to the
master, then return immediately.
"""
m.reply = DummyReply()
self.q.put((mtype, m))
def handler(f):
@functools.wraps(f)
def wrapper(master, message):
if not hasattr(message, "reply"):
raise exceptions.ControlException("Message %s has no reply attribute" % message)
# DummyReplys may be reused multiple times.
# We only clear them up on the next handler so that we can access value and
# state in the meantime.
if isinstance(message.reply, DummyReply):
message.reply.reset()
# The following ensures that inheritance with wrapped handlers in the
# base class works. If we're the first handler, then responsibility for
# acking is ours. If not, it's someone else's and we ignore it.
handling = False
# We're the first handler - ack responsibility is ours
if message.reply.state == "unhandled":
handling = True
message.reply.handle()
with master.handlecontext():
ret = f(master, message)
if handling:
master.addons(f.__name__, message)
# Reset the handled flag - it's common for us to feed the same object
# through handlers repeatedly, so we don't want this to persist across
# calls.
if handling and message.reply.state == "handled":
message.reply.take()
if not message.reply.has_message:
message.reply.ack()
message.reply.commit()
# DummyReplys may be reused multiple times.
if isinstance(message.reply, DummyReply):
message.reply.mark_reset()
return ret
# Mark this function as a handler wrapper
wrapper.__dict__["__handler"] = True
return wrapper
NO_REPLY = object() # special object we can distinguish from a valid "None" reply.
class Reply:
"""
Messages sent through a channel are decorated with a "reply" attribute. This
object is used to respond to the message through the return channel.
Messages sent through a channel are decorated with a "reply" attribute.
This object is used to respond to the message through the return
channel.
"""
def __init__(self, obj):
self.obj = obj
self.q = queue.Queue() # type: queue.Queue
self._state = "start" # "start" -> "taken" -> "committed"
# Holds the reply value. May change before things are actually committed.
self.value = NO_REPLY
self._state = "unhandled" # "unhandled" -> "handled" -> "taken" -> "committed"
self.value = NO_REPLY  # holds the reply value. May change before things are actually committed.
@property
def state(self):
"""
The state the reply is currently in. A normal reply object goes
sequentially through the following lifecycle:
The state the reply is currently in. A normal reply object goes sequentially through the following lifecycle:
1. start: Initial State.
2. taken: The reply object has been taken to be commited.
3. committed: The reply has been sent back to the requesting party.
1. unhandled: Initial State.
2. handled: The reply object has been handled by the topmost handler function.
3. taken: The reply object has been taken to be commited.
4. committed: The reply has been sent back to the requesting party.
This attribute is read-only and can only be modified by calling one of
state transition functions.
This attribute is read-only and can only be modified by calling one of state transition functions.
"""
return self._state
@@ -78,43 +121,47 @@ class Reply:
def has_message(self):
return self.value != NO_REPLY
def handle(self):
"""
Replies are handled by controller.handlers, which may be nested. The first handler takes
responsibility and handles the reply.
"""
if self.state != "unhandled":
raise exceptions.ControlException("Reply is {}, but expected it to be unhandled.".format(self.state))
self._state = "handled"
def take(self):
"""
Scripts or other parties may "take" a reply out of the normal flow.
For example, intercepted flows are taken out so that the connection thread does not proceed.
"""
if self.state != "start":
raise exceptions.ControlException(
"Reply is {}, but expected it to be start.".format(self.state)
)
if self.state != "handled":
raise exceptions.ControlException("Reply is {}, but expected it to be handled.".format(self.state))
self._state = "taken"
def commit(self):
"""
Ultimately, messages are committed. This is done either automatically by
if the message is not taken or manually by the entity which called
.take().
Ultimately, messages are committed. This is done either automatically by the handler
if the message is not taken or manually by the entity which called .take().
"""
if self.state != "taken":
raise exceptions.ControlException(
"Reply is {}, but expected it to be taken.".format(self.state)
)
raise exceptions.ControlException("Reply is {}, but expected it to be taken.".format(self.state))
if not self.has_message:
raise exceptions.ControlException("There is no reply message.")
self._state = "committed"
self.q.put(self.value)
def ack(self, force=False):
if self.state not in {"start", "taken"}:
raise exceptions.ControlException(
"Reply is {}, but expected it to be start or taken.".format(self.state)
)
self.send(self.obj, force)
def kill(self, force=False):
self.send(exceptions.Kill, force)
def send(self, msg, force=False):
if self.state not in ("handled", "taken"):
raise exceptions.ControlException(
"Reply is {}, did not expect a call to .send().".format(self.state)
)
if self.has_message and not force:
raise exceptions.ControlException("There is already a reply message.")
self.value = msg
@@ -128,7 +175,7 @@ class Reply:
class DummyReply(Reply):
"""
A reply object that is not connected to anything. In contrast to regular
Reply objects, DummyReply objects are reset to "start" at the end of a
Reply objects, DummyReply objects are reset to "unhandled" at the end of a
handler so that they can be used multiple times. Useful when we need an
object to seem like it has a channel, and during testing.
"""
@@ -143,7 +190,7 @@ class DummyReply(Reply):
def reset(self):
if self._should_reset:
self._state = "start"
self._state = "unhandled"
self.value = NO_REPLY
def __del__(self):

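The lifecycle documented in the Reply docstring can be restated as a tiny state machine (a hedged sketch, not mitmproxy's actual class): "unhandled" -> "handled" -> "taken" -> "committed", with each transition legal only from its predecessor.

```python
class MiniReply:
    # Sketch of the documented lifecycle:
    # "unhandled" -> "handled" -> "taken" -> "committed".
    TRANSITIONS = {
        "unhandled": "handled",
        "handled": "taken",
        "taken": "committed",
    }

    def __init__(self):
        self.state = "unhandled"

    def advance(self):
        # Refuse any transition not in the documented order.
        if self.state not in self.TRANSITIONS:
            raise RuntimeError("reply is already %s" % self.state)
        self.state = self.TRANSITIONS[self.state]

r = MiniReply()
for expected in ("handled", "taken", "committed"):
    r.advance()
    assert r.state == expected
```

This mirrors why handle(), take(), and commit() above each check the current state and raise ControlException on an out-of-order call.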
View File

@@ -1,7 +1,2 @@
import mitmproxy.master # noqa
import mitmproxy.log # noqa
import mitmproxy.options # noqa
master = None # type: "mitmproxy.master.Master"
log = None # type: "mitmproxy.log.Log"
options = None # type: "mitmproxy.options.Options"

View File

@@ -32,10 +32,8 @@ Events = frozenset([
"configure",
"done",
"log",
"load",
"running",
"start",
"tick",
"update",
])

View File

@@ -93,22 +93,15 @@ class SetServerNotAllowedException(MitmproxyException):
pass
class CommandError(Exception):
pass
class OptionsError(MitmproxyException):
pass
class AddonManagerError(MitmproxyException):
class AddonError(MitmproxyException):
pass
class AddonHalt(MitmproxyException):
"""
Raised by addons to signal that no further handlers should handle this event.
"""
pass

188
mitmproxy/export.py Normal file
View File

@@ -0,0 +1,188 @@
import io
import json
import pprint
import re
import textwrap
from typing import Any
from mitmproxy import http
def _native(s):
if isinstance(s, bytes):
return s.decode()
return s
def dictstr(items, indent: str) -> str:
lines = []
for k, v in items:
lines.append(indent + "%s: %s,\n" % (repr(_native(k)), repr(_native(v))))
return "{\n%s}\n" % "".join(lines)
def curl_command(flow: http.HTTPFlow) -> str:
data = "curl "
request = flow.request.copy()
request.decode(strict=False)
for k, v in request.headers.items(multi=True):
data += "-H '%s:%s' " % (k, v)
if request.method != "GET":
data += "-X %s " % request.method
data += "'%s'" % request.url
if request.content:
data += " --data-binary '%s'" % _native(request.content)
return data
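The curl export above can be exercised without mitmproxy; here is a hedged sketch with a plain stand-in for the request object (the Req class is hypothetical and simplifies headers to a plain dict, whereas the real code uses headers.items(multi=True)):

```python
class Req:
    # Hypothetical stand-in mirroring only the attributes curl_command() reads.
    method = "POST"
    url = "http://example.com/api"
    content = b'{"a": 1}'
    headers = {"content-type": "application/json"}

def curl_command(request):
    # Same construction order as above: headers, method, URL, then body.
    data = "curl "
    for k, v in request.headers.items():
        data += "-H '%s:%s' " % (k, v)
    if request.method != "GET":
        data += "-X %s " % request.method
    data += "'%s'" % request.url
    if request.content:
        data += " --data-binary '%s'" % request.content.decode()
    return data

cmd = curl_command(Req())
assert cmd.startswith("curl -H 'content-type:application/json' -X POST ")
assert cmd.endswith("'http://example.com/api' --data-binary '{\"a\": 1}'")
```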
def python_arg(arg: str, val: Any) -> str:
if not val:
return ""
if arg:
arg += "="
arg_str = "{}{},\n".format(
arg,
pprint.pformat(val, 79 - len(arg))
)
return textwrap.indent(arg_str, " " * 4)
def python_code(flow: http.HTTPFlow):
code = io.StringIO()
def writearg(arg, val):
code.write(python_arg(arg, val))
code.write("import requests\n")
code.write("\n")
if flow.request.method.lower() in ("get", "post", "put", "head", "delete", "patch"):
code.write("response = requests.{}(\n".format(flow.request.method.lower()))
else:
code.write("response = requests.request(\n")
writearg("", flow.request.method)
url_without_query = flow.request.url.split("?", 1)[0]
writearg("", url_without_query)
writearg("params", list(flow.request.query.fields))
headers = flow.request.headers.copy()
# requests adds those by default.
for x in (":authority", "host", "content-length"):
headers.pop(x, None)
writearg("headers", dict(headers))
try:
if "json" not in flow.request.headers.get("content-type", ""):
raise ValueError()
writearg("json", json.loads(flow.request.text))
except ValueError:
writearg("data", flow.request.content)
code.seek(code.tell() - 2) # remove last comma
code.write("\n)\n")
code.write("\n")
code.write("print(response.text)")
return code.getvalue()
def locust_code(flow):
code = textwrap.dedent("""
from locust import HttpLocust, TaskSet, task
class UserBehavior(TaskSet):
def on_start(self):
''' on_start is called when a Locust start before any task is scheduled '''
self.{name}()
@task()
def {name}(self):
url = self.locust.host + '{path}'
{headers}{params}{data}
self.response = self.client.request(
method='{method}',
url=url,{args}
)
### Additional tasks can go here ###
class WebsiteUser(HttpLocust):
task_set = UserBehavior
min_wait = 1000
max_wait = 3000
""").strip()
name = re.sub('\W|^(?=\d)', '_', flow.request.path.strip("/").split("?", 1)[0])
if not name:
new_name = "_".join([str(flow.request.host), str(flow.request.timestamp_start)])
name = re.sub('\W|^(?=\d)', '_', new_name)
path_without_query = flow.request.path.split("?")[0]
args = ""
headers = ""
if flow.request.headers:
lines = [
(_native(k), _native(v)) for k, v in flow.request.headers.fields
if _native(k).lower() not in [":authority", "host", "cookie"]
]
lines = [" '%s': '%s',\n" % (k, v) for k, v in lines]
headers += "\n headers = {\n%s }\n" % "".join(lines)
args += "\n headers=headers,"
params = ""
if flow.request.query:
lines = [
" %s: %s,\n" % (repr(k), repr(v))
for k, v in
flow.request.query.collect()
]
params = "\n params = {\n%s }\n" % "".join(lines)
args += "\n params=params,"
data = ""
if flow.request.content:
data = "\n data = '''%s'''\n" % _native(flow.request.content)
args += "\n data=data,"
code = code.format(
name=name,
path=path_without_query,
headers=headers,
params=params,
data=data,
method=flow.request.method,
args=args,
)
return code
def locust_task(flow):
code = locust_code(flow)
start_task = len(code.split('@task')[0]) - 4
end_task = -19 - len(code.split('### Additional')[1])
task_code = code[start_task:end_task]
return task_code
def url(flow):
return flow.request.url
EXPORTERS = [
("content", "c", None),
("headers+content", "h", None),
("url", "u", url),
("as curl command", "r", curl_command),
("as python code", "p", python_code),
("as locust code", "l", locust_code),
("as locust task", "t", locust_task),
]
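To illustrate what the new `curl_command` exporter produces, here is a hypothetical, self-contained sketch of its core string-building logic. `SimpleRequest` is a stand-in for mitmproxy's request object and is not part of this diff; the real exporter also decodes the body and handles multi-valued headers.

```python
# Hypothetical, simplified re-implementation of curl_command for illustration;
# SimpleRequest is a stand-in for mitmproxy's request object.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class SimpleRequest:
    method: str
    url: str
    headers: List[Tuple[str, str]] = field(default_factory=list)
    content: bytes = b""


def curl_command(request: SimpleRequest) -> str:
    data = "curl "
    for k, v in request.headers:
        data += "-H '%s:%s' " % (k, v)
    if request.method != "GET":          # curl defaults to GET
        data += "-X %s " % request.method
    data += "'%s'" % request.url
    if request.content:                  # attach the body, if any
        data += " --data-binary '%s'" % request.content.decode()
    return data
```

For example, a plain GET with one header renders as `curl -H 'accept:text/html' 'http://example.com/'`, while a POST with a body picks up `-X POST` and `--data-binary`.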

View File

@@ -78,7 +78,7 @@ class Flow(stateobject.StateObject):
        self._backup = None  # type: typing.Optional[Flow]
        self.reply = None  # type: typing.Optional[controller.Reply]
        self.marked = False  # type: bool
        self.metadata = dict()  # type: typing.Dict[str, typing.Any]
        self.metadata = dict()  # type: typing.Dict[str, str]

    _stateobject_attributes = dict(
        id=str,

@@ -93,7 +93,7 @@ class Flow(stateobject.StateObject):
    def get_state(self):
        d = super().get_state()
        d.update(version=version.FLOW_FORMAT_VERSION)
        d.update(version=version.IVERSION)
        if self._backup and self._backup != d:
            d.update(backup=self._backup)
        return d

@@ -112,9 +112,8 @@ class Flow(stateobject.StateObject):
    def copy(self):
        f = super().copy()
        f.id = str(uuid.uuid4())
        f.live = False
        if self.reply is not None:
            f.reply = controller.DummyReply()
        return f

    def modified(self):

@@ -144,7 +143,7 @@ class Flow(stateobject.StateObject):
    @property
    def killable(self):
        return self.reply and self.reply.state == "taken"
        return self.reply and self.reply.state in {"handled", "taken"}

    def kill(self):
        """

@@ -152,9 +151,8 @@ class Flow(stateobject.StateObject):
        """
        self.error = Error("Connection killed")
        self.intercepted = False
        # reply.state should be "taken" here, or .take() will raise an
        # exception.
        # reply.state should only be "handled" or "taken" here.
        # if none of this is the case, .take() will raise an exception.
        if self.reply.state != "taken":
            self.reply.take()
        self.reply.kill(force=True)
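The `killable` hunk above widens the check from a single state to the set `{"handled", "taken"}`, and `kill()` takes the reply first if it is not already taken. A minimal, hypothetical sketch of that state logic (the `Reply` class here is a simplified stand-in, not mitmproxy's `controller.Reply`):

```python
# Simplified stand-in for mitmproxy's controller.Reply, for illustration only.
class Reply:
    def __init__(self, state: str = "start"):
        self.state = state

    def take(self) -> None:
        self.state = "taken"


def killable(reply: Reply) -> bool:
    # Mirrors the widened check: a flow can be killed while its reply is
    # "handled" (event handlers still running) or already "taken".
    return reply.state in {"handled", "taken"}


def kill(reply: Reply) -> None:
    # Mirrors the kill() hunk: take() first unless the reply is already taken.
    if reply.state != "taken":
        reply.take()
    # mitmproxy itself would now call reply.kill(force=True)
```

A reply in state `"handled"` is now killable where previously only `"taken"` was, which is what lets intercepted flows be killed mid-event.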

View File

@@ -44,7 +44,7 @@ from mitmproxy import flow
from mitmproxy.utils import strutils

import pyparsing as pp

from typing import Callable, Sequence, Type  # noqa
from typing import Callable


def only(*types):

@@ -69,8 +69,6 @@ class _Token:
class _Action(_Token):
    code = None  # type: str
    help = None  # type: str

    @classmethod
    def make(klass, s, loc, toks):

@@ -164,14 +162,15 @@ def _check_content_type(rex, message):
class FAsset(_Action):
    code = "a"
    help = "Match asset in response: CSS, Javascript, Flash, images."
    ASSET_TYPES = [re.compile(x) for x in [
    ASSET_TYPES = [
        b"text/javascript",
        b"application/x-javascript",
        b"application/javascript",
        b"text/css",
        b"image/.*",
        b"application/x-shockwave-flash"
    ]]
    ]
    ASSET_TYPES = [re.compile(x) for x in ASSET_TYPES]

    @only(http.HTTPFlow)
    def __call__(self, f):

@@ -320,14 +319,10 @@ class FDomain(_Rex):
    code = "d"
    help = "Domain"
    flags = re.IGNORECASE
    is_binary = False

    @only(http.HTTPFlow)
    def __call__(self, f):
        return bool(
            self.re.search(f.request.host) or
            self.re.search(f.request.pretty_host)
        )
        return bool(self.re.search(f.request.data.host))


class FUrl(_Rex):

@@ -344,7 +339,7 @@ class FUrl(_Rex):
    @only(http.HTTPFlow)
    def __call__(self, f):
        return self.re.search(f.request.pretty_url)
        return self.re.search(f.request.url)


class FSrc(_Rex):

@@ -353,10 +348,7 @@ class FSrc(_Rex):
    is_binary = False

    def __call__(self, f):
        if not f.client_conn or not f.client_conn.address:
            return False
        r = "{}:{}".format(f.client_conn.address[0], f.client_conn.address[1])
        return f.client_conn.address and self.re.search(r)
        return f.client_conn.address and self.re.search(repr(f.client_conn.address))


class FDst(_Rex):

@@ -365,10 +357,7 @@ class FDst(_Rex):
    is_binary = False

    def __call__(self, f):
        if not f.server_conn or not f.server_conn.address:
            return False
        r = "{}:{}".format(f.server_conn.address[0], f.server_conn.address[1])
        return f.server_conn.address and self.re.search(r)
        return f.server_conn.address and self.re.search(repr(f.server_conn.address))


class _Int(_Action):

@@ -436,8 +425,7 @@ filter_unary = [
    FReq,
    FResp,
    FTCP,
    FWebSocket,
]  # type: Sequence[Type[_Action]]
]
filter_rex = [
    FBod,
    FBodRequest,

@@ -453,7 +441,7 @@ filter_rex = [
    FMethod,
    FSrc,
    FUrl,
]  # type: Sequence[Type[_Rex]]
]
filter_int = [
    FCode
]

@@ -539,17 +527,17 @@ def match(flt, flow):
help = []
for a in filter_unary:
for i in filter_unary:
    help.append(
        ("~%s" % a.code, a.help)
        ("~%s" % i.code, i.help)
    )
for b in filter_rex:
for i in filter_rex:
    help.append(
        ("~%s regex" % b.code, b.help)
        ("~%s regex" % i.code, i.help)
    )
for c in filter_int:
for i in filter_int:
    help.append(
        ("~%s int" % c.code, c.help)
        ("~%s int" % i.code, i.help)
    )
help.sort()
help.extend(

View File

@@ -5,6 +5,7 @@ from mitmproxy import flow
from mitmproxy.net import http
from mitmproxy import version
from mitmproxy.net import tcp
from mitmproxy import connections  # noqa

@@ -52,7 +53,9 @@ class HTTPRequest(http.Request):
    def get_state(self):
        state = super().get_state()
        state["is_replay"] = self.is_replay
        state.update(
            is_replay=self.is_replay
        )
        return state

    def set_state(self, state):

@@ -165,12 +168,11 @@ class HTTPFlow(flow.Flow):
    """ What mode was the proxy layer in when receiving this request? """

    _stateobject_attributes = flow.Flow._stateobject_attributes.copy()
    # mypy doesn't support update with kwargs
    _stateobject_attributes.update(dict(
    _stateobject_attributes.update(
        request=HTTPRequest,
        response=HTTPResponse,
        mode=str
    ))
    )

    def __repr__(self):
        s = "<HTTPFlow"

@@ -222,7 +224,8 @@ def make_error_response(
        status_code=status_code,
        reason=reason,
        message=html.escape(message),
    ).encode("utf8", "replace")
    )
    body = body.encode("utf8", "replace")

    if not headers:
        headers = http.Headers(

@@ -242,8 +245,9 @@ def make_error_response(
def make_connect_request(address):
    address = tcp.Address.wrap(address)
    return HTTPRequest(
        "authority", b"CONNECT", None, address[0], address[1], None, b"HTTP/1.1",
        "authority", b"CONNECT", None, address.host, address.port, None, b"HTTP/1.1",
        http.Headers(), b""
    )

View File

@@ -1,5 +1,5 @@
import os
from typing import Type, Iterable, Dict, Union, Any, cast  # noqa
from typing import Iterable

from mitmproxy import exceptions
from mitmproxy import flow

@@ -7,15 +7,15 @@ from mitmproxy import flowfilter
from mitmproxy import http
from mitmproxy import tcp
from mitmproxy import websocket
from mitmproxy.contrib import tnetstring
from mitmproxy import io_compat
from mitmproxy.io import compat
from mitmproxy.io import tnetstring

FLOW_TYPES = dict(
    http=http.HTTPFlow,
    websocket=websocket.WebSocketFlow,
    tcp=tcp.TCPFlow,
)  # type: Dict[str, Type[flow.Flow]]
)


class FlowWriter:

@@ -37,18 +37,14 @@ class FlowReader:
        """
        try:
            while True:
                # FIXME: This cast hides a lack of dynamic type checking
                loaded = cast(
                    Dict[Union[bytes, str], Any],
                    tnetstring.load(self.fo),
                )
                data = tnetstring.load(self.fo)
                try:
                    mdata = compat.migrate_flow(loaded)
                    data = io_compat.migrate_flow(data)
                except ValueError as e:
                    raise exceptions.FlowReadException(str(e))
                if mdata["type"] not in FLOW_TYPES:
                    raise exceptions.FlowReadException("Unknown flow type: {}".format(mdata["type"]))
                yield FLOW_TYPES[mdata["type"]].from_state(mdata)
                if data["type"] not in FLOW_TYPES:
                    raise exceptions.FlowReadException("Unknown flow type: {}".format(data["type"]))
                yield FLOW_TYPES[data["type"]].from_state(data)
        except ValueError as e:
            if str(e) == "not a tnetstring: empty file":
                return  # Error is due to EOF
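The reader above relies on `tnetstring.load` raising `ValueError("not a tnetstring: empty file")` at EOF. As a rough illustration of that contract, here is a minimal, hypothetical tnetstring reader covering only strings and integers; the real `mitmproxy.io.tnetstring` module also handles dicts, lists, booleans, and null.

```python
# A minimal, hypothetical tnetstring reader for illustration only.
# Format: "SIZE:PAYLOADT" where T is a one-byte type tag ("," string, "#" int).
import io
from typing import Union


def load(fp: io.BufferedIOBase) -> Union[bytes, int]:
    header = b""
    while True:
        c = fp.read(1)
        if not c:
            # same error the FlowReader checks for at end of file
            raise ValueError("not a tnetstring: empty file")
        if c == b":":
            break
        header += c
    size = int(header)
    payload = fp.read(size)
    tag = fp.read(1)
    if tag == b",":   # string
        return payload
    if tag == b"#":   # integer
        return int(payload)
    raise ValueError("unknown tnetstring type: %r" % tag)
```

For example, `load(io.BytesIO(b"5:hello,"))` yields `b"hello"`, and an empty stream raises the same "empty file" error that the reader loop treats as a clean EOF.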

Some files were not shown because too many files have changed in this diff.