Compare commits

...

422 Commits

Author SHA1 Message Date
Thomas Kriechbaumer
5654b6b82d Update certinstall.rst 2017-05-05 14:26:19 +02:00
Zhigang Fang
6c90d1361b Update style 2017-05-05 17:12:17 +08:00
Zhigang Fang
5962da4bbe Update spacing 2017-05-05 17:11:02 +08:00
Zhigang Fang
f946ed109b Add note for iOS version 10.3 or up 2017-05-05 17:10:14 +08:00
Maximilian Hils
d734f6bbd6 Merge pull request #2286 from charlesdhdt/master
Added LDAP Auth
2017-05-04 16:24:57 +02:00
Charles d'Hondt
154e8ac0fc fixed lint 2017-05-04 13:39:48 +02:00
Charles d'Hondt
f67d9adc31 Added ldap mock test 2017-05-04 13:25:15 +02:00
Maximilian Hils
40f387eb48 Merge pull request #2311 from mhils/issue-2310
fix #2310
2017-05-03 18:41:07 +02:00
Maximilian Hils
315daa042a fix #2310 2017-05-03 17:38:17 +02:00
Aldo Cortesi
822797c7e0 Merge pull request #2307 from cortesi/layouts
console: add a two-pane layout
2017-05-03 15:46:56 +12:00
Aldo Cortesi
2659b52209 console: add a two-pane layout
- Replace options.console_eventlog with options.console_layout
- This can be "single", "vertical" and "horizontal"
- At the base of the primary pane is the flowlist. At the base of the secondary
pane is the event log.
- Any of the other primary windows can be opened in each of the panes.

For now, I've bound "-" to the flow layout switch, "shift tab" to the layout
pane switch, and "P" to open the currently focused flow in whichever pane
you're in. These are just temporary - we'll reassess the default bindings
carefully once the keybindings work is complete.
2017-05-03 14:55:02 +12:00
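The layout switch described above is driven by the new console_layout option. As a minimal sketch — assuming that setting the option from an addon behaves the same as the key binding, which is an assumption — an addon running under the console tool could switch layouts via the options manager:

    from mitmproxy import ctx

    class LayoutSwitcher:
        def running(self):
            # console_layout accepts "single", "vertical" or "horizontal"
            ctx.options.update(console_layout="vertical")

    addons = [LayoutSwitcher()]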
Maximilian Hils
e24ff261e7 Merge pull request #2306 from mhils/web-cov
Honest JS Coverage
2017-05-02 12:44:02 +02:00
Aldo Cortesi
0f4d94b31c console: Make EventLog its own control, bind to "E" 2017-05-02 22:42:04 +12:00
Maximilian Hils
63ffd9a3a9 honest js coverage 2017-05-02 11:34:42 +02:00
Charles d'Hondt
a68808294d lint checks 2017-05-02 10:35:56 +02:00
Charles d'Hondt
3f094a54ce added ldap3 to setup.py 2017-05-02 10:18:49 +02:00
Aldo Cortesi
9d2350b670 Merge pull request #2305 from cortesi/consolebugs
console: various bugfixes
2017-05-02 13:09:27 +12:00
Aldo Cortesi
dcae79e017 console: various bugs
- Action bar now correctly gets the palette background
- Viewing connection details for a killed flow no longer crashes
- Editing options with selections no longer crashes
2017-05-02 12:56:27 +12:00
Aldo Cortesi
4e39d387d3 commands: options control
Use the new commands to bind the Options view in console.
2017-05-02 12:18:24 +12:00
Aldo Cortesi
9e58c36639 console: globally respected nav commands
Use this to bind nav keys with commands throughout.
2017-05-02 12:18:24 +12:00
Aldo Cortesi
8d29492960 Merge pull request #2299 from iharsh234/mypy-pathod
Mypy Checking to pathod
2017-05-02 12:17:57 +12:00
harsh vijay
e24b4cc1b6 Extend Mypy checking to pathod
* mypy checking pathod

* initial commit, fixed errors

* tox: mypy checking to pathod

* Fixed mypy test failed

* issue was with args in custom_contentview.py

* tox: mypy checking to #2221

* follow-import=skip since we can't provide args to custom_contentview.py during mypy testing

* Lint, Typo Fixed

* code style: module import
2017-05-02 05:19:25 +05:30
Aldo Cortesi
53ad658e9f Merge pull request #2302 from cortesi/flowview
commands: view.setval, view.getval, view.setval.toggle
2017-05-01 20:17:24 +12:00
Aldo Cortesi
2f52d8b9ad commands: view.setval, view.getval, view.setval.toggle
Use these to bind the last of the key for flowview. Flow views are now 100%
keybound.
2017-05-01 19:52:24 +12:00
Aldo Cortesi
143872b574 Merge pull request #2301 from cortesi/encoding
commands: flow.encode, flow.decode, flow.encode.toggle
2017-05-01 19:52:00 +12:00
Aldo Cortesi
7d56ec9d49 commands: flow.encode, flow.decode, flow.encode.toggle
Use this to bind "z" in flowview.
2017-05-01 18:31:22 +12:00
Aldo Cortesi
06c99bffc3 Merge pull request #2300 from cortesi/consolerevamp
Console revamp
2017-05-01 18:29:44 +12:00
Aldo Cortesi
542a998174 commands: console.bodyview, console.choose
Use this to replace the "v" for view shortcut key in flowview.
2017-05-01 17:56:21 +12:00
Aldo Cortesi
1ea4a5a48e keymap: keys can now bind to multiple contexts
Use this to map the majority of the keys in flowview.
2017-05-01 16:52:58 +12:00
Aldo Cortesi
670d1e408b command: flow.set
Use this to replace the flow edit components of flowview entirely.
2017-05-01 14:56:57 +12:00
Aldo Cortesi
46373977e2 Refactor console
- All top-level window objects are now persistent. It turns out that urwid
keeps a reference to these even after they're no longer used, so they were
leaking before.
- Revamp editors to work with console commands, and start reworking bindings
for flowview.
2017-05-01 14:12:22 +12:00
Maximilian Hils
288448c575 Merge pull request #2285 from MatthewShao/jest-dev
[web] Add coverage for js/urlState.js
2017-04-30 18:26:41 +02:00
Matthew Shao
365677006c [web] Export updateUrlFromStore and updateStoreFromUrl 2017-04-30 22:40:58 +08:00
Maximilian Hils
fccc9bc45e Merge pull request #2271 from mhils/mitmweb-connection-indicator
[web] add connection indicator [WIP]
2017-04-30 15:23:20 +02:00
Aldo Cortesi
29ce5a83d2 Merge pull request #2297 from cortesi/flowcmd
commands: view.order.options, view.marked.toggle, view.create
2017-05-01 00:02:15 +12:00
Matthew Shao
a7feced5de [web] Introduce redux-mock-store and minor fix. 2017-04-30 20:00:36 +08:00
Aldo Cortesi
685487d33c commands: view.order.options, view.marked.toggle, view.create
And use these commands to remove the last hard-coded keybindings from flow
list. This means the flow list is now 100% command-driven, which is very
exciting.
2017-04-30 22:45:31 +12:00
Aldo Cortesi
b10a3992d8 Merge pull request #2296 from cortesi/viewload
commands: view.load
2017-04-30 22:30:29 +12:00
Aldo Cortesi
a570caccbd commands: view.load
Plus replace the flow list keybinding.
2017-04-30 22:02:29 +12:00
Aldo Cortesi
3cd93567f5 commands: support *args for commands
Use this to simplify meta-commands in console, and to create a console_choose
command that prompts the user for a choice, and then executes a command with
variable substitution.
2017-04-30 21:24:00 +12:00
Thomas Kriechbaumer
82f87919e2 fix markdown 2017-04-30 11:21:28 +02:00
Thomas Kriechbaumer
8a4a46e7d3 update docker-release tags during release 2017-04-30 11:19:55 +02:00
Aldo Cortesi
bcbe87bb09 Move export to addon, kill Python and Locust export
Also add a "raw" export format.

The Python and Locust exports are hard to maintain, their tests are extremely
brittle, they didn't have full test coverage, and are by my guess very rarely
used. I feel the Locust export should certainly be an externally maintained
addon. The Python/requests export can come back if someone cares enough, and it
can be structured in a way we can maintain.
2017-04-30 17:53:30 +12:00
Aldo Cortesi
50cfededce Merge pull request #2293 from cortesi/clip
commands: cut.clip copies cuts to system clipboard
2017-04-30 17:37:45 +12:00
Aldo Cortesi
befbe88d96 commands: cut.clip copies cuts to system clipboard 2017-04-30 14:42:37 +12:00
Aldo Cortesi
51b50737e7 Merge pull request #2291 from cortesi/cuts
Introduce cuts: a flow dissector
2017-04-30 14:35:06 +12:00
Aldo Cortesi
ed62bbad1d Add help text to --commands output 2017-04-30 14:20:21 +12:00
Aldo Cortesi
075d452a6d cut: more flexible cut specification based on attribute paths
Also support certificate types, which are converted to ASCII-encoded PEM format.
2017-04-30 14:05:45 +12:00
Aldo Cortesi
7ffb2c7981 cut: use csv module to encode multi-values for saving
Also add q.text, q.raw_content, s.text, s.raw_content selectors
2017-04-30 11:48:32 +12:00
Maximilian Hils
97a00728a8 [web] add connection tests 2017-04-29 19:43:59 +02:00
Maximilian Hils
8f1b763082 [web] add connection indicator 2017-04-29 19:34:51 +02:00
Maximilian Hils
aa0e134041 Merge pull request #2276 from mitmproxy/add-examples-to-script-docs
Prominently link to examples in script docs
2017-04-29 19:23:21 +02:00
Aldo Cortesi
4b568f99d6 Introduce cuts: a flow dissector
This PR introduces the cuts addon, a flow dissector that allows you to select
and operate on specific components of flows. It also adds the first consumer
for cuts - the cuts.save command.

Save the content of the focus to /tmp/foo:

    cuts.save s.content|@focus /tmp/foo

Save the URL and response content-type headers for all flows currently shown to
file, comma-separated, one flow per line:

    cuts.save s.url,q.header[content-type]|@focus /tmp/foo

We also use this to replace the body save shortcut in the console flowlist.
2017-04-29 23:32:31 +12:00
Aldo Cortesi
83f8760703 Merge pull request #2221 from iharsh234/mypy-master-example/simple
Extend mypy example/simple (#2194)
2017-04-29 23:19:11 +12:00
Aldo Cortesi
25fb0cc019 Merge pull request #2290 from cortesi/scriptrun
command: script.run
2017-04-29 13:48:17 +12:00
Aldo Cortesi
d439b34511 command: script.run
Plus the flowlist binding adjustments.
2017-04-29 13:24:19 +12:00
Aldo Cortesi
915bc7647d Merge pull request #2289 from cortesi/flowrevert
command: flow.revert
2017-04-29 12:30:23 +12:00
Aldo Cortesi
a32f87fef9 Merge pull request #2288 from cortesi/replayfiles
commands: "replay.client.file" and "replay.server.file"
2017-04-29 11:52:38 +12:00
Aldo Cortesi
1d5eedcc9c command: flow.revert
Plus matching binding in flowlist.
2017-04-29 11:48:25 +12:00
Aldo Cortesi
f908ea220f commands: "replay.client.file" and "replay.server.file" 2017-04-29 11:34:50 +12:00
Aldo Cortesi
c7247e026d Merge pull request #2287 from cortesi/cmdmark
commands: marking
2017-04-29 11:23:19 +12:00
Aldo Cortesi
a92017a6c1 Rework client and server replay
- Add client.replay [flows], client.replay.stop
- Add server.replay [flows], server.replay.stop
- The corresponding options for file loading are only read on startup, further
changes are ignored. In interactive contexts, replay is started with the
commands, not through option changes.
- Deprecate flow.replay, use replay.client instead
2017-04-29 11:02:36 +12:00
harsh vijay
789fbd00d2 Fixed lint error 2017-04-29 03:54:00 +05:30
harsh vijay
6e03231d25 lint error fixed 2017-04-29 03:49:08 +05:30
Aldo Cortesi
7317ea134e command: flow.kill, flow.replay
Plus the matching bindings in the flow list.
2017-04-29 09:58:32 +12:00
harsh vijay
36118973d9 extend mypy example/simple 2017-04-29 03:26:14 +05:30
Aldo Cortesi
f21a970f29 commands: marking
Add "view.mark [flows] bool" and "view.mark.toggle [flows]". Use this to rebind
marking keys in flowlist.
2017-04-29 09:14:44 +12:00
Aldo Cortesi
139c4e6db3 Merge pull request #2283 from cortesi/cmdall
Commands, core update event
2017-04-29 09:14:22 +12:00
Aldo Cortesi
217addbf31 commands: view.go
bind G to "view.go -1"
bind g to "view.go 0"
2017-04-29 08:40:38 +12:00
Aldo Cortesi
0b090f7ae1 Commands, core update event
This patch:

- Introduces a core update() event that should be invoked whenever flows are
changed outside of the normal lifecycle.
- Extend view.resolve to know about @all, which matches all flows in the view.
- Add a core flow.resume comand, which resumes flows and broadcasts an update event.
- Define flow list bindings for:

A -> flow.resume @all
a -> flow.resume @focus
d -> view.remove @focus
z -> view.remove @all
2017-04-29 08:01:00 +12:00
Charles d'Hondt
65202f5f1c Added ldapsss vef 2017-04-28 17:00:21 +02:00
Charles d'Hondt
6d3837fd54 fix 2017-04-28 16:44:50 +02:00
Charles d'Hondt
f91ed91bf1 fix 2017-04-28 16:40:34 +02:00
Charles d'Hondt
29c1f303d6 Fixed typo 2017-04-28 16:23:32 +02:00
Charles d'Hondt
5fb18ad275 Added LDAP Auth 2017-04-28 16:19:27 +02:00
Matthew Shao
f8b76a62ff [web] Add coverage for js/urlState.js 2017-04-28 22:06:17 +08:00
Maximilian Hils
b537997f4f Update README.md 2017-04-28 14:43:49 +02:00
Maximilian Hils
05e1f22143 Update README.md 2017-04-28 14:24:57 +02:00
Maximilian Hils
bd88733a0a update CHANGELOG 2017-04-28 14:22:56 +02:00
Maximilian Hils
769123498c fix mitmweb's content security policy 2017-04-28 14:14:56 +02:00
Aldo Cortesi
a781bab7db Merge pull request #2282 from cortesi/cmddec
Add a command.command decorator
2017-04-28 15:33:00 +12:00
Aldo Cortesi
6af1a49464 commands: add a command.command decorator
Use this for our built-ins and the console commands.
2017-04-28 15:07:52 +12:00
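A minimal sketch of what the decorator gives addon authors, assuming the public API this series converges on (the addon and command name here are hypothetical):

    from mitmproxy import command, ctx

    class Greeter:
        @command.command("greeter.hello")
        def hello(self, name: str) -> None:
            # argument types are taken from the annotations and type-checked
            ctx.log.info("hello %s" % name)

    addons = [Greeter()]

The command can then be invoked from the console command prompt, e.g. as "greeter.hello alice".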
Aldo Cortesi
7ff84673fd Support unary flow command arguments
Use this to add a console.view.flow command and bind "enter" in flowlist.
2017-04-28 14:05:35 +12:00
Aldo Cortesi
0ff0253b3e console: save flows to command 2017-04-28 12:15:28 +12:00
Aldo Cortesi
cfae95f5c3 console: start migrating hotkeys to keymap
This shifts a set of flow list keys to keymaps. Much more to come.
2017-04-28 12:10:48 +12:00
Aldo Cortesi
cbb35cee51 Merge pull request #2280 from cortesi/coreset
commands: add the core command addon, and the command "set"
2017-04-28 12:10:02 +12:00
Aldo Cortesi
ce01cb9c09 command: partials
Support partial commands in console.command, use it to keybind "i" for
intercept, which prompts the user with ":set intercept=".
2017-04-28 11:24:28 +12:00
Aldo Cortesi
29b3e787ca Merge pull request #2279 from cortesi/keymap
console: keymap
2017-04-28 10:43:05 +12:00
Aldo Cortesi
8a07059cf4 commands: add the core command addon, and the command "set"
The set command sets an option using the same syntax as commandline --set.
2017-04-28 10:41:44 +12:00
Aldo Cortesi
be1b76b975 console: add a keymap
This sketches out a keymap system for console, and adds the first few top-level
commands and mappings.
2017-04-28 10:21:15 +12:00
Aldo Cortesi
18edc11145 console: interactive command browser
Simple browser that lets users view and select commands interactively.
Key binding for this is still to be sorted out.
2017-04-28 07:43:58 +12:00
Aldo Cortesi
b73b06b364 Merge pull request #2274 from cortesi/alerts
Add an "alert" log level.
2017-04-28 06:53:51 +12:00
Aldo Cortesi
aab6bf747c Merge pull request #2273 from cortesi/commands
Commands
2017-04-28 06:53:41 +12:00
Thomas Kriechbaumer
adce1a8235 Merge pull request #2277 from Kriechi/bump-cryptography
bump cryptography min-version
2017-04-27 20:11:15 +02:00
Thomas Kriechbaumer
3e82f4d6c4 bump cryptography min-version
This removes OpenSSL v0.9.8 support.
2017-04-27 19:39:56 +02:00
Maximilian Hils
5f91b7a45d Merge pull request #2275 from mitmproxy/mitmweb-change-react-prop-types
change import of prop-types
2017-04-27 18:29:57 +02:00
Maximilian Hils
75d952c46c prominently link to examples in script docs 2017-04-27 18:23:36 +02:00
cle1000
115a31c9ba change import of prop-types 2017-04-27 10:54:21 +02:00
Aldo Cortesi
218e127e74 Add an "alert" log level.
This has the same urgency as "info", but also signals to interactive tools that
the user's attention should be drawn to the output, even if they're not looking
at the event log. In the console app, this means the message appears in the
status bar with a timeout.
2017-04-27 19:08:01 +12:00
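As a sketch, an addon or script raises an alert through the usual logging API (the hook and message here are purely illustrative):

    from mitmproxy import ctx

    def done():
        # same urgency as "info", but interactive tools surface it to the
        # user, e.g. in the console status bar
        ctx.log.alert("cleanup finished")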
Aldo Cortesi
fde1159ae3 Test coverage ++ 2017-04-27 18:43:44 +12:00
Aldo Cortesi
97000aa85c command: save.file flowspec path -> None
Our first user-facing command. The following commands do the obvious things:

save.file @marked /tmp/flows
save.file @focus /tmp/flows
save.file @hidden /tmp/flows
save.file "~m get" /tmp/flows
2017-04-27 17:05:00 +12:00
Aldo Cortesi
b7afcb5dc2 addons.streamfile -> addons.save
Options:

    streamfile -> save_stream_file
    streamfile_filter -> save_stream_filter
2017-04-27 15:58:54 +12:00
Aldo Cortesi
8c4810f606 console: flow resolution command
This is our first built-in command, which will be used by very many other
commands.

Also add a --commands option to dump all commands, analogous to --options.
2017-04-27 15:27:51 +12:00
Aldo Cortesi
ee3dd3f3c5 console: very basic interactive command executor 2017-04-27 12:00:41 +12:00
Aldo Cortesi
169baabcab Basic outline of the command subsystem
- Add a command manager
- Sketch out the type system with a few simple supported types
2017-04-27 11:09:40 +12:00
Aldo Cortesi
2a46f3851a Merge pull request #2265 from cortesi/addons
Addons and addon testing
2017-04-27 07:40:14 +12:00
Maximilian Hils
ab07b79138 Merge pull request #2270 from F1ashhimself/master
Update readme for complex examples
2017-04-26 17:15:34 +02:00
Maksim Beloborodko
90a5b90b0d Update readme for complex examples 2017-04-26 17:14:41 +03:00
Maximilian Hils
0f9081b18d Merge pull request #2269 from mhils/issue-2250
fix #2250, add type info to cookie module
2017-04-26 14:42:48 +02:00
Maximilian Hils
87610cc8b2 fix #2250, add type info to cookie module 2017-04-26 14:17:14 +02:00
Maximilian Hils
d5ea08db62 Merge pull request #2258 from mhils/readfile
Integrate readstdin into readfile
2017-04-26 13:40:51 +02:00
Maximilian Hils
0a8e54edea Merge pull request #2268 from mhils/issue-2257
fix #2257
2017-04-26 13:39:19 +02:00
Maximilian Hils
cef01ac164 Merge pull request #2267 from MatthewShao/jest-dev
[web] Add coverage for js/utils.js, js/flow/utils.js
2017-04-26 13:38:54 +02:00
Maximilian Hils
1aa6d9d324 fix #2257 2017-04-26 12:26:49 +02:00
Maximilian Hils
ca2827886a separate reading from stdin into its own addon 2017-04-26 12:11:37 +02:00
Maximilian Hils
7607240c30 Merge pull request #2259 from mhils/2228
fix #2228
2017-04-26 12:10:39 +02:00
Maximilian Hils
df7701bb6d fix #2228 2017-04-26 11:32:24 +02:00
Maximilian Hils
07cb83597b clarify that --cert expects a leaf certificate
refs #2254
2017-04-26 11:18:27 +02:00
Matthew Shao
ec7d90f9be [web] Add coverage for js/utils.js, js/flow/utils.js 2017-04-26 17:06:50 +08:00
Maximilian Hils
b3a1143338 integrate readstdin into readfile addon 2017-04-26 10:56:32 +02:00
Maximilian Hils
2b500f234f typecheck: add support for typing.Any 2017-04-26 10:54:36 +02:00
Aldo Cortesi
5327756377 Addons and addon testing
- Fix some loading sequence bugs affecting command-line script invocation
- Allow addons to override existing options (with a warning). We need this for
reloading.
- Convert har_dump to new-style arguments, fix and re-instate its test suite.
- Convert miscellaneous other examples to new-style args.
2017-04-26 19:56:33 +12:00
Aldo Cortesi
e32efcae49 Merge pull request #2262 from Kriechi/reduced-coverage++
improve tests (extracted from #2011)
2017-04-26 19:52:33 +12:00
Aldo Cortesi
1daf0b3f0a Merge pull request #2264 from cortesi/options
Change the way addons handle options
2017-04-26 11:23:48 +12:00
Aldo Cortesi
b72f139093 configure(options, updated) -> configure(updated)
Options are now available globally on ctx, so the first argument of configure
is redundant.
2017-04-26 11:01:27 +12:00
Aldo Cortesi
f90b4c2ff0 Move options into ctx
Many addons currently save options on configure(), either as individual options
or sometimes by saving the entire options object. The current options should
simply be available on the ctx object, simplifying state management for addons
considerably.
2017-04-26 10:25:56 +12:00
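Taken together, the two commits above mean an addon reads options from ctx rather than from a configure() argument. A minimal sketch, using the existing anticache option purely as an example:

    from mitmproxy import ctx

    class MyAddon:
        def configure(self, updated):
            # "updated" is the set of option names that changed;
            # current values live on ctx.options
            if "anticache" in updated:
                ctx.log.info("anticache is now %s" % ctx.options.anticache)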
Aldo Cortesi
02c82b1b60 Merge pull request #2261 from cortesi/addonrevamp
Revamp how addons work
2017-04-26 09:03:03 +12:00
Aldo Cortesi
7aa2081894 Remove watchdog, solidify script testing
- Remove the watchdog dependency. We now just stat the script file every 2
seconds to check for an updated mtime.
- Further solidify our script testing, and in particular make the example tests
nicer. These should exemplify how we want users to test their own addon
scripts. More work on addon testing to follow.
2017-04-26 09:01:40 +12:00
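The example tests referred to above are built around mitmproxy's test helpers. A rough sketch of the intended pattern, assuming the taddons helper module of this era (names are approximate):

    from mitmproxy.test import taddons

    class MyAddon:
        def __init__(self):
            self.configured = False

        def configure(self, updated):
            self.configured = True

    def test_myaddon():
        a = MyAddon()
        with taddons.context() as tctx:
            # drive the configure() event without starting a real proxy
            tctx.configure(a)
            assert a.configured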
Thomas Kriechbaumer
a35a377cbb improve tests 2017-04-25 20:49:49 +02:00
Maximilian Hils
7ee0abbe0c Merge pull request #2219 from MatthewShao/jest-dev
[web] Coverage++ for ducks/flows.js
2017-04-25 20:26:45 +02:00
Matthew Shao
6962a2c3f2 Fix the tests for flows actions. 2017-04-25 19:29:52 +08:00
Matthew Shao
dcac976a47 Merge branch 'jest-dev' of https://github.com/MatthewShao/mitmproxy into jest-dev 2017-04-25 19:27:53 +08:00
Aldo Cortesi
e6eeab6094 Revamp how addons work
- Addons now nest, which means that addons can manage addons. This has a number
of salutary effects - the scripts addon no longer has to poke into the global
addons list, we no longer have to replace/remove/boot out of parent addons when
we load scripts, and this paves the way for making our top-level tools into
addons themselves.
- All addon calls are now wrapped in a safe execution environment where
exceptions are caught, and output to stdout/stderr is intercepted and turned
into logs.
- We no longer support script arguments in sys.argv - creating an option
properly is the only way to pass arguments. This means that all scripts are
always directly controllable from interactive tooling, and that arguments are
type-checked.

For now, I've disabled testing of the har dump example - it needs to be moved
to the new argument handling, and become a class addon. I'll address that in a
separate patch.
2017-04-25 22:13:44 +12:00
Maximilian Hils
161cdff25e simplify selectRelative, add example for action testing 2017-04-24 15:13:45 +02:00
Matthew Shao
ae71ec1d30 [web] Coverage++ for ducks/flows.js 2017-04-24 14:18:12 +02:00
Thomas Kriechbaumer
90c425bd14 Merge pull request #2256 from ujjwal96/ip-formatting
Fixed formatting in mitmdump
2017-04-22 10:33:07 +02:00
Ujjwal Verma
73692b909b Fixed formatting in mitmdump 2017-04-22 11:24:22 +05:30
Maximilian Hils
6af72160bf [requires.io] dependency update on master branch (#2252) 2017-04-20 15:29:42 +02:00
Maximilian Hils
3d7cde058b [requires.io] dependency update on master branch (#2241) 2017-04-17 10:49:17 +02:00
Thomas Kriechbaumer
9c18f0ba09 Merge pull request #2240 from mitmproxy/requires-io-master
[requires.io] dependency update on master branch
2017-04-16 22:12:04 +02:00
Maximilian Hils
3aa7f24992 Update setup.py 2017-04-16 19:44:26 +02:00
requires.io
3afeb7c7e9 [requires.io] dependency update 2017-04-16 19:42:32 +02:00
Maximilian Hils
be5e276d45 Merge pull request #2239 from Kriechi/update-kaitai
update kaitai and recompile formats
2017-04-16 19:42:19 +02:00
Thomas Kriechbaumer
83a428e0b7 manually fix imports 2017-04-16 15:51:12 +02:00
Thomas Kriechbaumer
dadb50946e update kaitai and recompile formats 2017-04-16 15:51:11 +02:00
Maximilian Hils
ec6be96f2a Merge pull request #2234 from ujjwal96/ip-formatting
Fixes IP address formatting #2109
2017-04-10 20:24:15 +02:00
Ujjwal Verma
742127ef7b Fixes #2109 2017-04-09 18:55:20 +05:30
Maximilian Hils
c7b5012752 Merge pull request #2232 from r1b/master
fixes ipv6 authority form parsing in CONNECT
2017-04-09 01:15:50 +02:00
Robert C Jensen
7365f18542 fixes ipv6 authority form parsing in CONNECT 2017-04-07 15:46:21 -04:00
Maximilian Hils
c76620c19f Merge pull request #2227 from vinaydargar/grammarphrasing
Replace phrase 'the the' in 2 places
2017-04-04 02:26:10 +02:00
Vinay Dargar
55ba6ed14d Replace phrase 'the the' in 2 places
Replaced with the appropriate "in the" and "to the".
There are multiple other occurrences of this but they are in upstream Python modules.
2017-04-02 21:37:38 +05:30
Thomas Kriechbaumer
8567300dd6 Merge pull request #2224 from jonathanrfisher1/patch-1
Replace phrase "an TLS" with "a TLS"
2017-03-29 20:43:14 +02:00
jonathanrfisher1
677b9186ae Replace phrase "an TLS" with "a TLS"
Updating usage of 'a' or 'an' to be more consistent with general English grammatical rules. TLS is pronounced with a consonant sound for both the expanded and abbreviated versions.
2017-03-29 14:12:33 -04:00
Thomas Kriechbaumer
9e3edd16c2 Merge pull request #2223 from krsoninikhil/empty-logbuffer
Add tests for set_focus
2017-03-29 16:57:37 +02:00
Nikhil Soni
442999c75c Add tests for set_focus 2017-03-29 18:34:17 +05:30
Thomas Kriechbaumer
98957673f0 Merge pull request #2222 from BasThomas/patch-1
Add syntax highlighting to Readme
2017-03-29 13:53:06 +02:00
Bas Broek
dc37f7fd00 Add syntax highlighting to Readme 2017-03-29 13:36:51 +02:00
Thomas Kriechbaumer
034ca20d3c Merge pull request #2220 from krsoninikhil/empty-logbuffer
eventlog: Override set_focus to check for index error
2017-03-28 21:35:43 +02:00
Nikhil Soni
70a0f87570 Override set_focus to check for index error 2017-03-29 00:14:37 +05:30
Matthew Shao
15c19d2bc6 [web] Coverage++ for ducks/flows.js 2017-03-28 20:44:49 +08:00
Maximilian Hils
9c686ca14c Merge pull request #2212 from MatthewShao/jest-dev
[web] Update Jest config and coverage ++
2017-03-27 17:04:13 +02:00
Maximilian Hils
1909778e3d Merge pull request #2215 from mitmproxy/proxapp.local-1
proxapp -> proxapp.local
2017-03-26 16:09:26 +02:00
Matthew Shao
9a604b5cfe [web] Minor fixes for the test. 2017-03-26 21:58:47 +08:00
Matthew Shao
17ac1ff6cb [web] Exclude src/js/filt/filt.js from coverage report. 2017-03-26 21:38:00 +08:00
Maximilian Hils
482c1579f2 proxapp -> proxapp.local
Chrome doesn't like it if there's no TLD.
2017-03-26 15:02:47 +02:00
Matthew Shao
de9ecb7c30 [web] Add JS coverage to codecov. 2017-03-26 09:36:18 +08:00
Matthew Shao
31a45ddaaa [web] Reach 100% coverage for ducks/settings.js 2017-03-26 09:34:45 +08:00
Maximilian Hils
6f00c1bc5d Merge pull request #2213 from Kriechi/concurrent-coverage
fix #2201
2017-03-25 17:16:31 +01:00
Thomas Kriechbaumer
20776cf2fa fix #2201 2017-03-25 16:41:53 +01:00
Aldo Cortesi
c6a16e95e8 Merge pull request #2206 from cortesi/addonload
addon refactoring
2017-03-25 11:42:43 +13:00
Aldo Cortesi
65f0885bd6 addon loader: add boot_into, which replaces returning from start()
While we're here, expand test coverage for addonmanager to 100%, and promote to
individual coverage.
2017-03-25 10:48:12 +13:00
Aldo Cortesi
d69a411303 addons: an addonmanager.Loader is now passed to the load() event 2017-03-25 10:48:12 +13:00
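With an addonmanager.Loader passed to load(), option registration moves to load time. A minimal sketch, assuming the loader API this series settles on (the option name is purely illustrative):

    from mitmproxy import ctx

    class MyAddon:
        def load(self, loader):
            # register a custom option: name, type, default, help text
            loader.add_option(
                "myaddon_greeting", str, "hi", "Greeting to log on startup"
            )

        def running(self):
            ctx.log.info(ctx.options.myaddon_greeting)

    addons = [MyAddon()]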
Aldo Cortesi
541c1e8b9f addons: start -> load throughout 2017-03-25 10:48:12 +13:00
Maximilian Hils
b531353ee0 Merge pull request #2208 from Kriechi/bump-h2
bump h2
2017-03-24 21:43:14 +01:00
Thomas Kriechbaumer
ef9f0e22ea bump h2 2017-03-24 21:27:14 +01:00
Aldo Cortesi
335861f490 Merge pull request #2207 from nikofil/scripts-redirect-stdout
scripts: redirect stdout to ctx.log.warn
2017-03-24 14:58:33 +13:00
Nikos Filippakis
abf291b0f9 scripts: redirect stdout to ctx.log.warn
Redirect messages written to stdout in scripts to ctx.log.warn.
(closes #1530)

Signed-off-by: Nikos Filippakis <nikolaos.filippakis@cern.ch>
2017-03-24 02:35:02 +01:00
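In practice this means a plain print() in a script now shows up in the event log as a warning. A trivial sketch:

    def request(flow):
        # stdout is captured and forwarded to ctx.log.warn, so this line
        # lands in the event log instead of corrupting the console UI
        print("saw request for %s" % flow.request.pretty_url)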
Aldo Cortesi
439c113989 Merge pull request #2204 from cortesi/consoleopts
console options: load and save options to and from file
2017-03-24 10:05:37 +13:00
Aldo Cortesi
0d3e2c6adf optmanager: make serialization and loading more robust
Also handle errors in console options manager.
2017-03-24 09:37:17 +13:00
Aldo Cortesi
3a35c69986 Merge pull request #2203 from cortesi/malformedreq
Don't pass malformed request objects to our error handler
2017-03-24 09:30:09 +13:00
Aldo Cortesi
eb66456d16 console: read and write options to and from file 2017-03-23 11:28:08 +13:00
Aldo Cortesi
44c3a24f8e Don't pass malformed request objects to our error handler 2017-03-23 10:48:20 +13:00
Maximilian Hils
1e81747a2a Merge pull request #2200 from MatthewShao/jest-dev
Add coverage for eventLog.js and update Jest config.
2017-03-22 15:12:07 +01:00
Thomas Kriechbaumer
907825714b move examples tests (#2199) 2017-03-22 12:02:18 +01:00
Matthew Shao
82ecc5448d [web] Update configuration of jest to avoid deprecation warning 2017-03-22 18:42:46 +08:00
Matthew Shao
20d3ae98e0 [web] Reach 100% coverage for ducks/eventLog.js 2017-03-22 18:32:28 +08:00
Matthew Shao
ab2fcbef8d Fix #2184, mitmweb -n displays incorrect message. (#2187)
* Fixes #2184, python code.

* Fixes #2184, JavaScript code.

* [web] Update Footer.jsx (#2184)
2017-03-22 01:26:04 +01:00
Aldo Cortesi
00902e6feb Merge pull request #2190 from cortesi/mypy
mypy for all of ./mitmproxy
2017-03-22 11:16:02 +13:00
Aldo Cortesi
95d9ec88ac tox: mypy checking for entire codebase
Also fix a few linting errors.
2017-03-22 10:29:46 +13:00
Maximilian Hils
26d0254203 [requires.io] dependency update on master branch (#2193) 2017-03-21 18:43:56 +01:00
Aldo Cortesi
cacad8373b Make tnetstrings pass mypy
Mypy doesn't support recursive types yet, so we can't properly express
TSerializable nested structures. For now, we just disable type checking in the
appropriate locations.

https://github.com/python/mypy/issues/731
2017-03-20 13:09:24 +13:00
Aldo Cortesi
4ca78604af Factor out an io module
Include tnetstring - we've made enough changes that this no longer belongs in
contrib.
2017-03-20 12:50:09 +13:00
Aldo Cortesi
0c0c0d38cc Merge pull request #2186 from cortesi/tighten
Tighten some options-related functionality
2017-03-20 12:39:49 +13:00
Aldo Cortesi
3a8da31835 mypy all of the codebase bar tnetstring
In some places, this involved removing type declarations where our types were
terminally confused. The grideditor specifically needs a cleanup and
restructure.
2017-03-20 12:38:08 +13:00
Aldo Cortesi
b98ce71770 Fix sliding window type specs 2017-03-20 12:21:50 +13:00
Aldo Cortesi
2153c337e0 addons.view: we no longer have a None order state
This bumps coverage back to 100% for addons
2017-03-20 09:12:09 +13:00
Maximilian Hils
92e3358915 Merge pull request #2182 from syahn/fix-#2134
Fix #2134
2017-03-19 14:47:50 +01:00
Aldo Cortesi
b19fefd173 options: default_contentview uses lowercase 2017-03-19 14:39:16 +13:00
Aldo Cortesi
04f1a4443d options: reraise when there's an error with a partial update
Fixes #2174
2017-03-19 14:36:23 +13:00
Aldo Cortesi
8d68196fd0 console: use overlay chooser to pick flow view mode
More consistent, and fixes status prompt overflow issue.
2017-03-19 14:13:54 +13:00
Aldo Cortesi
1b6bc874b5 console options: "d" to reset this option to default 2017-03-19 14:01:32 +13:00
Aldo Cortesi
43e7b9d68c options: choices for content views and tighten console order spec
Also clean up some leftovers from filter option name change.
2017-03-19 13:39:28 +13:00
Aldo Cortesi
22d7c31ea8 dumper, mitmdump: kill filtstr
Dumper now also uses view_filter. Passing arguments to mitmdump is now
precisely equivalent to setting both streamfile_filter and view_filter.
2017-03-19 13:21:46 +13:00
Aldo Cortesi
475a4e3eb0 streamfile: add streamfile_filter and use it instead of filtstr 2017-03-19 13:08:26 +13:00
Aldo Cortesi
6f745b90a8 console options: fix bug that caused keys to be handled by wrong component 2017-03-19 13:05:27 +13:00
Aldo Cortesi
b1ef4bce30 options: filter -> view_filter 2017-03-19 13:02:51 +13:00
Aldo Cortesi
1b330ba453 Merge pull request #2173 from cortesi/coptions
Console options editor
2017-03-19 12:27:41 +13:00
Aldo Cortesi
cb18c91f13 console options: help for overlays, improved layout for overlay grid editor 2017-03-19 12:14:42 +13:00
Aldo Cortesi
21794c7bbe optmanager: test coverage 2017-03-19 11:29:41 +13:00
Aldo Cortesi
49b0a67eb9 console options: add an overlay grid editor for sequence options 2017-03-19 10:32:22 +13:00
Aldo Cortesi
4e24c95a61 optmanager: cope with bound methods in .subscribe
Fixes #2122
2017-03-19 10:32:22 +13:00
Aldo Cortesi
3f50d5fdbb console options: handle choices settings
This implements a generic chooser overlay, and uses it to handle setting
options that have fixed choices. We'll use this overlay elsewhere too.
2017-03-19 10:32:22 +13:00
Aldo Cortesi
fea08ef919 console options: in-place editor for simple strings and ints 2017-03-19 10:32:22 +13:00
Aldo Cortesi
3a949f35f8 console options: better defaults display, g/G/D keyboard shortcuts 2017-03-19 10:32:22 +13:00
Aldo Cortesi
d759150734 console: options/help pane switching, toggle bools with enter 2017-03-19 10:32:22 +13:00
Aldo Cortesi
8130b9880a options cleanups
- Move default client ciphers - no need to have a 10-line string default.
- Add some style guidance on options help
- Be consistent about reflowing options help
2017-03-19 10:32:22 +13:00
Aldo Cortesi
bc67cee687 console: sketch out look and feel of the new Options editor 2017-03-19 10:32:22 +13:00
Aldo Cortesi
39659c7528 Make mypy succeed with imports on master.py
We get little benefit from our mypy QA checks at the moment, because we skip
imports. This patch is what's needed to make mypy succeed with imports on a
single file: master.py

It also updates mypy to the current version, and enables a QA check.

Mypy bugs I encountered:

dict.update with kwargs not supported:

https://github.com/python/mypy/issues/1031

property setters and getters must be adjacent:

https://github.com/python/mypy/issues/1465
2017-03-19 10:32:22 +13:00
Frank Ahn
125ae2d8f9 Fix #2134 2017-03-18 09:51:05 +09:00
Maximilian Hils
c0882496e3 Merge pull request #2176 from MatthewShao/jest-dev
[web] Reach 100% coverage for ducks/utils/store.js
2017-03-17 15:15:55 +01:00
Matthew Shao
6c338c84a6 [web] Reach 100% coverage for ducks/utils/store.js 2017-03-17 17:17:11 +08:00
Aldo Cortesi
253fa95018 Merge pull request #2169 from cortesi/mypy
mypy without skipping imports
2017-03-17 09:43:29 +13:00
Aldo Cortesi
5192810ff6 Make mypy succeed with imports on master.py
We get little benefit from our mypy QA checks at the moment, because we skip
imports. This patch is what's needed to make mypy succeed with imports on a
single file: master.py

It also updates mypy to the current version, and enables a QA check.

Mypy bugs I encountered:

dict.update with kwargs not supported:

https://github.com/python/mypy/issues/1031

property setters and getters must be adjacent:

https://github.com/python/mypy/issues/1465
2017-03-17 08:13:47 +13:00
Aldo Cortesi
eac210829e Rip out master handler mechanism
All events are now handled by addons, and we no longer support any events on
master.
2017-03-17 07:56:32 +13:00
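With the master handler mechanism gone, all event handling lives in addons: an addon is simply an object whose methods are named after events. A minimal sketch:

    from mitmproxy import ctx

    class Counter:
        def __init__(self):
            self.num = 0

        def request(self, flow):
            # called for every client request passing through the proxy
            self.num += 1
            ctx.log.info("%d requests seen" % self.num)

    addons = [Counter()]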
Maximilian Hils
7a8b362df6 Merge pull request #2156 from syahn/pretty-popup
[web] Adjust pop-up docs position
2017-03-16 18:36:39 +01:00
Maximilian Hils
6d66184eba Merge pull request #2168 from cortesi/handlers
Rip out old handlers mechanism - all events are now handled in addons
2017-03-16 14:57:26 +01:00
Maximilian Hils
50eeac482f Merge pull request #2153 from MatthewShao/jest-dev
[web] Reach 100% coverage for ducks/ui/flow.js
2017-03-16 14:51:33 +01:00
Maximilian Hils
6c0bbcc9ee Merge pull request #2170 from felixonmars/patch-1
Fix test_format_xml with dot in path
2017-03-16 12:34:01 +01:00
Thomas Kriechbaumer
4a354ffb83 Merge pull request #2172 from Kriechi/bump-h2
bump h2 and hyperframe
2017-03-16 11:21:31 +01:00
Thomas Kriechbaumer
2a934ed323 bump h2 and hyperframe 2017-03-16 10:33:36 +01:00
Felix Yan
b352557092 Fix test_format_xml with dot in path
When the path contains a dot ("."), replacing all dots generates a non-existent path and raises a FileNotFoundError. Replacing only the last dot fixes this.
2017-03-16 16:48:21 +08:00
Aldo Cortesi
c5e0dc64b9 Rip out master handler mechanism
All events are now handled by addons, and we no longer support any events on
master.
2017-03-16 18:33:24 +13:00
Aldo Cortesi
3de9829003 Remove last handlers in test suite
While we're here, make checking that there are NO logs on the proxy more
efficient by setting a short timeout. This reduces total test suite running
time by 15% on my system.
2017-03-16 18:05:57 +13:00
Aldo Cortesi
1410cbb4b6 Remove test handlers by using taddons.RecordingMaster
This also means expanding and tweaking the recording master API, which we
reflect through the current test suite
2017-03-16 18:05:57 +13:00
Aldo Cortesi
85ddc5056b Move mitmproxy console handlers to addon 2017-03-16 18:05:57 +13:00
Aldo Cortesi
b6abe96202 Move dump handlers to addon 2017-03-16 18:05:57 +13:00
Aldo Cortesi
b275257553 Merge pull request #2167 from mhils/update-dependencies
update mypy
2017-03-16 18:05:35 +13:00
Aldo Cortesi
fefbff4164 Merge pull request #2166 from Kriechi/bump-sphinx
bump sphinx
2017-03-16 15:56:11 +13:00
Aldo Cortesi
1e6c51c698 Merge pull request #2152 from cortesi/readfile
Extract flow reading into addons
2017-03-16 15:51:21 +13:00
Maximilian Hils
112233891d update mypy 2017-03-15 23:10:21 +01:00
Aldo Cortesi
228a22b3c0 Add a light-weight custom event system, use it for keepserving
This patch implements the lightweight event system I propose in #2144, adds a
custom event "processing_complete" that is triggered after file read, client
replay and server replay, and introduces a KeepServing addon to handle this for
mitmdump.
2017-03-16 11:07:00 +13:00
Thomas Kriechbaumer
bbf4629906 bump sphinx 2017-03-15 23:05:06 +01:00
Thomas Kriechbaumer
dadefaecdb update release instructions 2017-03-15 22:36:58 +01:00
Aldo Cortesi
169068c7ec Clean up addonmanager interface
Clarify the plethora of invocation methods we've sprouted, correct some usages
in the codebase.
2017-03-16 08:50:29 +13:00
Thomas Kriechbaumer
e755399d1e update release instructions 2017-03-15 20:06:39 +01:00
Thomas Kriechbaumer
cd93dabd75 Merge pull request #2161 from mhils/make-request
Add Request.make
2017-03-15 14:40:42 +01:00
Maximilian Hils
65c4a3cf1d add Request.make 2017-03-15 14:06:00 +01:00
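Request.make is a convenience constructor for building a request without filling in every field by hand. A sketch of the intended usage — the exact keyword set is an assumption, and the import path reflects this era of the codebase:

    from mitmproxy.net.http import Request

    req = Request.make(
        "GET",
        "https://example.com/path?x=1",
        content=b"",
        headers={"Accept": "application/json"},
    )
    print(req.host, req.path)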
Thomas Kriechbaumer
ec5d9cbd2b Merge pull request #2159 from Kriechi/bump-pyparsing
bump pyparsing
2017-03-15 11:45:47 +01:00
Thomas Kriechbaumer
56b087278a Merge pull request #2155 from Kriechi/bump-cryptography
bump cryptography
2017-03-15 10:43:48 +01:00
Thomas Kriechbaumer
aa4ab88459 bump pyparsing 2017-03-15 10:14:16 +01:00
Thomas Kriechbaumer
0316a13f7e bump cryptography 2017-03-15 10:03:18 +01:00
Frank Ahn
9f8004ed62 Adjust pop-up docs position
- Shift pop-up docs to the right position according to responsiveness
2017-03-15 17:44:52 +09:00
Aldo Cortesi
ef582333ff Extract flow reading into addons
This patch moves the final pieces of master functionality into addons.

- Add a ReadFile addon to read from file
- Add a separate ReadStdin addon to read from stdin, only used by mitmdump
- Remove all methods that know about io and serialization from master.Master
2017-03-15 13:44:48 +13:00
Matthew Shao
48fe32766c [web] Reach 100% coverage for ducks/ui/flow.js 2017-03-15 08:27:42 +08:00
Aldo Cortesi
eba6d4359c Merge pull request #2098 from iharsh234/bug-2086-2
Fix crash when duplicating flow
2017-03-15 09:20:33 +13:00
Aldo Cortesi
6e7ba84017 Merge pull request #2147 from mhils/connection-ids
Add client/server connection ids
2017-03-15 09:20:16 +13:00
Maximilian Hils
30797755fb stateobject: automatically change id when copying 2017-03-14 17:08:40 +01:00
Maximilian Hils
375680a3be add connection ids 2017-03-14 17:08:40 +01:00
Maximilian Hils
e29cd7f5b7 Merge pull request #2151 from MatthewShao/jest-dev
Update .travis.yml to fix the failure about yarn
2017-03-14 17:07:11 +01:00
Matthew Shao
b37d935d3c Update .travis.yml to fix the failure about yarn 2017-03-14 23:09:25 +08:00
Maximilian Hils
2d016775fe Merge pull request #2149 from mhils/multidictview-copy
Add MultiDictView.copy()
2017-03-14 14:38:38 +01:00
Maximilian Hils
0de8f46c6e Merge pull request #2145 from mhils/ne
py3: __ne__ delegates to __eq__ by default
2017-03-14 14:38:00 +01:00
Maximilian Hils
6bd2eb4ec6 Merge pull request #2150 from dequis/pretty-host-filter
Match ~d and ~u filters against pretty_host too
2017-03-14 14:31:12 +01:00
dequis
f351d0a307 Match ~d and ~u filters against pretty_host too
Changed the ~u filter in the console UI to match the behavior of
mitmweb, which only matches against pretty_url, never against url.
2017-03-14 01:57:22 -03:00
Aldo Cortesi
1f37743549 Merge pull request #2142 from mhils/2140
Fix #2140
2017-03-14 14:24:53 +13:00
Maximilian Hils
961747c4bc Merge pull request #2120 from ujjwal96/replace-from-file
Replace from file through replacement editor #1458
2017-03-14 02:23:31 +01:00
Maximilian Hils
30fd285ec7 multidictviews should not be serializable 2017-03-14 02:20:24 +01:00
Maximilian Hils
de1de3f626 add MultiDictView.copy 2017-03-14 02:18:02 +01:00
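MultiDictView is the live, view-like multidict behind things such as a request's query string; copy() gives a detached, mutable snapshot. A small sketch in a request hook:

    def request(flow):
        # flow.request.query is a MultiDictView bound to the request itself;
        # .copy() returns an independent MultiDict that can be edited freely
        params = flow.request.query.copy()
        params["debug"] = "1"        # editing the copy does not touch the flow
        flow.request.query = params  # write it back explicitly if desired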
Aldo Cortesi
124a6c9e5a Merge pull request #2129 from cortesi/addonopts
Custom options for addons
2017-03-14 13:02:58 +13:00
Maximilian Hils
875ce8c9c1 py3: __ne__ delegates to __eq__ by default 2017-03-14 00:40:15 +01:00
Aldo Cortesi
1b301ad5bb Move running() in to .tick() method to make sure it's called consistently 2017-03-14 11:41:20 +13:00
Aldo Cortesi
b745428b5c Enable custom options in config files
We also now ignore unknown options in config files by default, and print a
warning if verbosity is incremented.
2017-03-14 09:40:33 +13:00
Aldo Cortesi
2832e790fd Move serialization methods out of Options
Attributes on options share a namespace with options themselves. It's getting
too crowded on our Options object, so let's shift some obvious stuff into the
module.
2017-03-14 08:34:25 +13:00
Aldo Cortesi
c24f7d8e12 Optmanager: handle unknown options in value sets 2017-03-14 08:32:19 +13:00
Aldo Cortesi
75f83ba380 Show custom addon options in --options output 2017-03-14 08:32:19 +13:00
Aldo Cortesi
0c6663d0d5 Enable custom options for addons
- Add an options parameter to the start() event. This is to be used by addons
on startup to add custom options.
- Add a running() event that is called once the proxy is up and running.
- With the new paradigm we can't log during master __init__, so add a tiny
termstatus addon to print proxy status to terminal once we're running.
2017-03-14 08:32:19 +13:00
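A rough sketch of what this enabled at the time, with a purely illustrative option name. (Registering options inside start() was later superseded by the Loader passed to load(), further up this log.)

    from mitmproxy import ctx

    class MyAddon:
        def start(self, options):
            # start() now receives the options object, so addons can
            # register their own, type-checked options
            options.add_option("myaddon_enabled", bool, True, "Toggle MyAddon")

        def running(self):
            # running() fires once the proxy is up
            if ctx.options.myaddon_enabled:
                ctx.log.info("MyAddon is active")

    addons = [MyAddon()]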
Maximilian Hils
2c6dcac97f fix #2140 2017-03-13 18:37:11 +01:00
Maximilian Hils
ee65894d40 Merge pull request #2141 from mhils/decouple-file-format-version
Decouple mitmproxy and file format version
2017-03-13 18:32:48 +01:00
Maximilian Hils
c853091a85 Merge pull request #2137 from MatthewShao/jest-dev
[web] Add tests for ducks/utils/store.js
2017-03-13 16:55:57 +01:00
Maximilian Hils
517aef1557 decouple mitmproxy and file format version 2017-03-13 16:46:07 +01:00
Matthew Shao
9ce92b1f43 [web] Improve the code style on tests 2017-03-13 22:47:44 +08:00
Maximilian Hils
647d7601b2 Merge pull request #2138 from MatthewShao/fix#2133
[web] Fixes the flaw in #2133
2017-03-13 14:58:15 +01:00
Matthew Shao
8a734a2613 [web] Fixes the flaw in #2133 2017-03-13 19:34:25 +08:00
Matthew Shao
6d37812928 [web] Add tests for ducks/utils/store.js 2017-03-13 15:56:14 +08:00
Maximilian Hils
05e11547f5 Merge remote-tracking branch 'origin/master' into pr-2120
Conflicts:
	test/mitmproxy/addons/test_replace.py
2017-03-12 23:33:49 +01:00
Maximilian Hils
0f4b523868 replacements: error right away when file does not exist 2017-03-12 23:28:49 +01:00
Thomas Kriechbaumer
7d5ab70ce3 Merge pull request #2136 from Kriechi/nuke-custom-tmpdir
nuke tutils.tmpdir, use pytest tmpdir
2017-03-12 23:27:27 +01:00
Thomas Kriechbaumer
1b045d24bc nuke tutils.tmpdir, use pytest tmpdir 2017-03-12 22:55:22 +01:00
harsh vijay
a8f7c30aa0 Fix crash when duplicating flow 2017-03-13 02:23:52 +05:30
harsh vijay
9a88594684 Fix crash when duplicating flow (#2) 2017-03-13 01:55:50 +05:30
Maximilian Hils
4684617d2c minor fixes 2017-03-12 20:24:50 +01:00
Ujjwal Verma
7f5fc0fdbd Removed ReplaceBase 2017-03-12 22:10:49 +05:30
Maximilian Hils
d069ba9da5 Merge pull request #2133 from MatthewShao/jest-dev
[web] Fixes all existing tests
2017-03-11 14:48:25 +01:00
Maximilian Hils
74f2a28e41 minor fixes 2017-03-11 14:46:33 +01:00
Matthew Shao
abcfac3c19 [web] remove node tests on travis 2017-03-11 20:21:03 +08:00
Matthew Shao
523cbb02aa [web] Fixes all existing tests 2017-03-11 18:57:32 +08:00
Matthew Shao
c4e0ea27a1 [web] Fixes stopEdit action in ducks/ui/flow.js 2017-03-11 17:43:55 +08:00
Ujjwal Verma
e0382fde2f Changed docs 2017-03-11 11:14:22 +05:30
Maximilian Hils
39154e628e Merge pull request #2118 from lymanZerga11/patch-3
Update proxyauth.py to add proxyauth metadata
2017-03-11 03:04:38 +01:00
Thomas Kriechbaumer
e9746c5182 Merge pull request #2114 from mitmproxy/fix-websocket-serialization
make websocket flows serializable
2017-03-10 21:15:46 +01:00
Thomas Kriechbaumer
49e0f23848 use flow.id to link websocket + handshake flows 2017-03-10 19:38:50 +01:00
Thomas Kriechbaumer
c39b65c06b Merge pull request #2128 from Kriechi/disable-more-h2c
disable h2c prior knowledge connections
2017-03-10 19:30:13 +01:00
Maximilian Hils
f3df4855af fix type hints 2017-03-10 19:15:44 +01:00
Maximilian Hils
db40bdca15 Merge pull request #2131 from MatthewShao/jest-dev
[web] remove the tests for abandoned files.
2017-03-10 18:21:55 +01:00
Thomas Kriechbaumer
45bf1ff64d disable h2c prior knowledge connections 2017-03-10 11:37:25 +01:00
Matthew Shao
da93525d2e [web] remove the tests for abandoned files. 2017-03-10 08:52:11 +08:00
Thomas Kriechbaumer
b1dd86d7ae make websocket flows serializable
fixes #2113
2017-03-09 19:08:59 +01:00
Ujjwal Verma
25703f8c53 Fixed Replacement test 2017-03-09 23:33:11 +05:30
Ujjwal Verma
07a8c4987a Removed replace-from-file option.Made requested changes 2017-03-09 23:20:35 +05:30
lymanZerga11
8e67c6f512 add tests 2017-03-09 09:01:37 +08:00
lymanZerga11
6d53d08b6c Add tests 2017-03-09 09:01:04 +08:00
Aldo Cortesi
98b5893855 Merge pull request #2100 from cortesi/options
Options revamp
2017-03-09 12:27:36 +13:00
Aldo Cortesi
53178f35be Send change event when options are reset 2017-03-09 08:53:36 +13:00
Aldo Cortesi
7ea171de4e Merge remote-tracking branch 'upstream/master' into options 2017-03-09 08:33:25 +13:00
Maximilian Hils
44c3c3ed86 Merge pull request #2124 from mitmproxy/opensuse-docs
Add install docs for openSUSE
2017-03-08 20:06:35 +01:00
Maximilian Hils
7bec0c73cf Merge pull request #2121 from mitmproxy/address-leftovers
fix tcp.Address leftovers
2017-03-08 20:06:11 +01:00
Thomas Kriechbaumer
69f7ad21fe Merge pull request #2090 from mitmproxy/fix-pathod-header-doc
pathoc: fix documentation for headers
2017-03-08 18:21:56 +01:00
Maximilian Hils
ca819fb923 add install docs for openSUSE 2017-03-08 17:42:19 +01:00
Maximilian Hils
ebedd2875f Merge pull request #2117 from MatthewShao/jest-dev
Add Javascript tests to Travis CI. (#2093)
2017-03-08 16:54:02 +01:00
Maximilian Hils
927b5707fe fix tcp.Address leftovers
this fixes the issue described in https://github.com/mitmproxy/mitmproxy/issues/2119#issuecomment-285067292
2017-03-08 16:18:34 +01:00
Maximilian Hils
b345f5d432 typing.Type -> type
`typing.Type` should have been `typing.Type[Any]`,
which in turn is equivalent to `type`
2017-03-08 15:17:45 +01:00
Maximilian Hils
63179d9751 core -> core_option_validation
longer, but much clearer for devs who are unfamiliar with the codebase.
2017-03-08 15:17:07 +01:00
Maximilian Hils
618e0c6265 add type hints 2017-03-08 15:10:06 +01:00
lymanZerga11
716306538f Update proxyauth.py 2017-03-08 13:59:43 +08:00
Matthew Shao
dae7435135 Update .travis.yml 2017-03-08 13:00:09 +08:00
lymanZerga11
cf83cbf2df Update proxyauth.py 2017-03-08 11:34:59 +08:00
Matthew Shao
7ff48b8bec Add javascript tests to Travis CI. (#2093) 2017-03-08 09:57:38 +08:00
Maximilian Hils
f276c7a80d change add_option parameter order
name, type, value matches natural language ($x is a bool with a default value of $y)
and also matches the python type annotation order of name: type = value
2017-03-08 00:52:37 +01:00
Maximilian Hils
8707928b16 unify server spec parsing 2017-03-08 00:18:34 +01:00
Aldo Cortesi
f0d6237a96 Merge remote-tracking branch 'upstream/master' into options 2017-03-08 08:22:49 +13:00
Maximilian Hils
cb6240974d Merge pull request #2108 from ayamamori/patch-1
Update tcpproxy.rst
2017-03-07 14:53:27 +01:00
Maximilian Hils
88d48e38af Revert "clearly identify git-checkout as dev version"
This temporarily reverts commit 3b4d8d6eca,
which broke the snapshot builds.
2017-03-07 14:49:52 +01:00
Maximilian Hils
bda39335b0 fix other occurences of options shortcut 2017-03-07 12:16:23 +01:00
Maximilian Hils
96b798615c Merge pull request #2106 from davidpshaw/python_3_to_master
Updated for python 3
2017-03-07 12:12:47 +01:00
Aldo Cortesi
98ec3b77fe Carefully re-add a small number of short flags
The mechanism for booleans attaches the short flag to whatever the opposite of
the default is.
2017-03-07 21:01:21 +13:00
Aldo Cortesi
aed780bf48 Change the way proxy authentication is specified
We now have one option "proxyauth". If this is "any", we accept any
credentials, if it starts with an @ it's treated as a path to an htpasswd file,
if it is of the form username:password it's a single-user credential.
2017-03-07 20:18:30 +13:00
Aldo Cortesi
b0ba765598 Add type and choices to options dump comments. 2017-03-07 19:29:08 +13:00
Yamamori Akihiro
7ca2913c1a Update tcpproxy.rst
Correction in key short cut for TCP Proxy
2017-03-07 14:15:29 +09:00
Aldo Cortesi
320d8848ab Options tweaks
- Regularise presentation and wording
- Help is mandatory for all options
- Auto-generate wording to say that sequence options can be passed multiple
times on the command-line
2017-03-07 15:23:46 +13:00
Aldo Cortesi
10db254791 Remove less commonly used command-line options
We now have --set, so only options that really deserve it get a dedicated flag.
I'm inclined to strip this back even more. Feel free to argue that YOUR
favourite option deserves special treatment here.
2017-03-07 14:46:28 +13:00
Aldo Cortesi
ac3b0d69cc Add the --set option to set options directly
The --set option is a universal flag for setting options. Some examples:

Turn on a boolean:

mitmdump --set onboarding=false

Add a value to a sequence:

mitmdump --set setheaders=/foo/bar/voing

Zero a sequence:

mitmdump --set setheaders
2017-03-07 14:27:50 +13:00
Aldo Cortesi
79f5883c2f Option spacing and coverage 2017-03-07 13:47:39 +13:00
Aldo Cortesi
99a6b0dbc1 Add --options that dumps annotated option defaults 2017-03-07 13:16:28 +13:00
Maximilian Hils
6a4ca68a1c Merge pull request #2101 from s4chin/add-websocket-filter-to-web
web: Add websocket filter
2017-03-06 23:34:13 +01:00
Maximilian Hils
193c707061 Merge pull request #2104 from lymanZerga11/patch-2
Fixes #2102
2017-03-06 23:33:33 +01:00
Maximilian Hils
8b79892246 Merge pull request #2105 from Kriechi/dev-version
clearly identify git-checkout as dev version
2017-03-06 23:32:59 +01:00
David Shaw
5d82068218 Updated for python 3 2017-03-06 17:30:40 -05:00
Aldo Cortesi
d13df40753 Fix lint 2017-03-07 10:52:28 +13:00
Aldo Cortesi
3d9c2233be Flatten commandline arg structure, extract common run func for tools 2017-03-07 10:51:24 +13:00
Aldo Cortesi
b51df9a0b1 Cleanups and test coverage 2017-03-07 10:28:15 +13:00
Aldo Cortesi
2312cf6fb0 Automate slurping up command-line options
Now that options are completely regular, we can automate reading them out of
arguments, rather than listing them out by hand like savages.
2017-03-07 08:49:19 +13:00
Thomas Kriechbaumer
3b4d8d6eca clearly identify git-checkout as dev version 2017-03-06 18:40:43 +01:00
lymanZerga11
fd5079b8a6 Update flowlist.py 2017-03-06 23:19:39 +08:00
Aldo Cortesi
edfd62e42a Replacements and setheaders are always strings
Instead of having two representations we have one canonical specification.
Fixing the editor in console is left to a further patch.
2017-03-06 20:58:51 +13:00
Aldo Cortesi
82163a1e68 Unify mode specification
We now have:

    --mode regular (the default)
    --mode transparent
    --mode socks5
    --mode reverse:SPEC
    --mode upstream:SPEC

Where SPEC is a host specification.
2017-03-06 20:30:49 +13:00
Sachin Kelkar
fa96a1d1ef web: Add websocket filter 2017-03-06 10:20:11 +05:30
Aldo Cortesi
f5fb6972aa Options unification: certs
- Regularise to Sequence[str]
- Move conversion and checking into proxy config object
2017-03-06 15:36:52 +13:00
Aldo Cortesi
9b1f40da37 Options unification: streamfile
We now have one option to control this. If the path is prefixed with a "+" we
append, otherwise we overwrite.
2017-03-06 13:48:37 +13:00
Aldo Cortesi
71a830c836 Options unification: string choices 2017-03-06 13:20:17 +13:00
Aldo Cortesi
e70b46672c Options unification: simple string sequences 2017-03-06 11:56:11 +13:00
Aldo Cortesi
18a6b66ba4 Options unification: simple strings
Move all simple string options to the new scheme. Also regularise some names.
2017-03-06 11:39:19 +13:00
Aldo Cortesi
201c65960e Options unification: sizes
Start dealing with corner cases:

- Sizes are always stored in options as strings
- Add a new core addon that's responsible for verifying settings that don't
belong to an addon
- Add a _processed scratch space on the Options object for processed core
values to be stored in. This is pretty dirty, but less dirty than re-parsing
values every time. We'll come up with something better down the track.
2017-03-06 09:07:16 +13:00
Maximilian Hils
22154dee5c Merge pull request #2085 from MatthewShao/dev
[web] Add mouse support for FilterInput popover.
2017-03-05 13:07:54 +01:00
Aldo Cortesi
45d18ac8cb Start unifying options and the command-line: ints
Moves all integer options apart from a few tricky ones like verbosity over to
auto generation. Also add a metavar argument to parser generation to support
this.
2017-03-05 20:45:55 +13:00
Aldo Cortesi
f15a628561 Start unifying options and the command-line: booleans
This commit:

- Adds a help field to options
- Adds a function to generate parser definitions from options
- Uses this to migrate all boolean flags over to the new system
- Makes all booleans consistently follow the --foo/--not-foo convention

There are a number of things left to be done here:

- Argparse doesn't give us a nice way to format --foo --not-foo help. Click
does, and moving to click is a goal down the track.
- For now, we remove all short aliases. I want to re-evaluate these
systematically once we have the new structure in place.
2017-03-05 20:16:33 +13:00
Aldo Cortesi
67381ae550 Revamp options
- Options are now explicitly initialized with an add_option method
- We have one canonical Options class - ditch dump.Options
2017-03-05 15:10:03 +13:00
Clemens Brunner
e0644398b6 Merge pull request #2095 from MatthewShao/fix-#2083
[web] fix scroll bug in flow detail. (#2083)
2017-03-04 17:04:44 +01:00
Matthew Shao
6a3497f09c [web] Retain focus on input field after selecting. (#2085) 2017-03-04 22:44:13 +08:00
Maximilian Hils
2a1ed7f3eb Merge pull request #2082 from tarnacious/master
Call error handler and mark flow on HTTPException
2017-03-04 15:12:29 +01:00
Matthew Shao
9f348122ec [web] fix scroll bug in flow detail. (#2083) 2017-03-04 21:14:46 +08:00
Maximilian Hils
1bafe73a94 update release docs 2017-03-04 11:50:40 +01:00
Maximilian Hils
f71c11559c Merge pull request #2081 from s4chin/add-pretty-host
web: Fix #1888
2017-03-04 11:48:05 +01:00
Maximilian Hils
78fd5a9dad Merge pull request #2053 from krsoninikhil/on-issues
Adds --keep-host-header option (#2039)
2017-03-04 11:42:44 +01:00
Maximilian Hils
50ebdf3081 update docs wording 2017-03-04 11:42:30 +01:00
Sachin Kelkar
40943f5618 web: Fix #1888 2017-03-03 16:14:20 +05:30
Nikhil Soni
0081d9b828 Merge branch 'master' into on-issues 2017-03-03 12:58:44 +05:30
Nikhil Soni
3da8532bed Adds test for --keep-host-header 2017-03-03 12:34:36 +05:30
Nikhil Soni
317d183ba4 Changes dns_spoofing example to use --keep-host-header 2017-03-03 12:34:36 +05:30
Nikhil Soni
2e90373e4b Updates help msg and docs for --keep-host-header 2017-03-03 12:34:20 +05:30
Matthew Shao
6479bd4e76 [web] Simplify the code. (#2085) 2017-03-03 09:16:26 +08:00
Maximilian Hils
f1417dd859 pathoc: fix documentation for headers 2017-03-02 19:24:48 +01:00
Maximilian Hils
bae4cdf8d5 docs: fix cert install link 2017-03-02 19:10:11 +01:00
Maximilian Hils
b6eda9ed0a docs: http -> https where possible 2017-03-02 19:09:14 +01:00
Maximilian Hils
182c1adf7d Merge pull request #2087 from TomTasche/patch-1
link to stackoverflow for chrome certificate management
2017-03-02 19:04:36 +01:00
Thomas Taschauer
fc9b152d06 link to stackoverflow for chrome certificate management 2017-03-02 14:31:41 +01:00
Matthew Shao
c397a21980 [web] Add mouse support for FilterInput popover. 2017-03-02 11:09:13 +08:00
tarnacious
fc3848345b Call error handler and mark flow on HTTPException
This allows scripts to handle HTTPExceptions such as "HTTP Body too
large" raised in mitmproxy/net/http/http1/read.py:131
2017-03-01 16:35:54 +01:00
Thomas Kriechbaumer
a505221e1a http2: fix h2 header normalization test (#2080) 2017-03-01 12:46:42 +01:00
David Dworken
99b584ad7d added XSS scanner example (#1907) 2017-02-27 18:22:39 +01:00
Thomas Kriechbaumer
8f13f357ad Merge pull request #2072 from MatthewShao/dev
[web] Fix #2071, remove client_connect.mitmcert when dumping flow into json.
2017-02-27 17:10:17 +01:00
Matthew Shao
7b4e60a2e7 Fix #2071, remove client_connect.mitmcert when dumping flow into json. 2017-02-27 21:45:26 +08:00
Maximilian Hils
2b3093fa1c Merge pull request #2069 from JordanLoehr/iss2068
Added console_ prefix to options in palettepicker to resolve issue #2068
2017-02-27 05:37:26 +01:00
Jordan Loehr
6fd9146d0d Added console_ prefix to options in palettepicker to resolve issue #2068 2017-02-26 17:08:00 -07:00
Maximilian Hils
0fdf2c0f4b Merge pull request #2067 from whackashoe/fix-2065
fix #2065 by setting correct content encoding for brotli
2017-02-26 22:10:12 +01:00
Jett
18a8834209 fix #2065 by setting correct content encoding for brotli 2017-02-26 14:49:50 -06:00
Thomas Kriechbaumer
19b2208c27 Merge pull request #2066 from Kriechi/certs-tests
certs: coverage++
2017-02-26 20:55:56 +01:00
Thomas Kriechbaumer
e0bc1109c0 do not return IP addresses in SSLCert.altnames 2017-02-26 20:54:06 +01:00
Thomas Kriechbaumer
6d228dfb9f certs: coverage++ 2017-02-26 20:54:04 +01:00
Thomas Kriechbaumer
9b6986ea87 connections: coverage++ (#2064) 2017-02-26 20:50:52 +01:00
Maximilian Hils
b33d568e04 Merge pull request #2063 from ujjwal96/hardump-support
Remove pytz in hardump
2017-02-26 20:00:02 +01:00
Ujjwal Verma
e723a58af5 Remove pytz in hardump 2017-02-25 23:07:47 +05:30
Thomas Kriechbaumer
2aecffd39a optmanager: coverage++ (#2062) 2017-02-25 12:12:43 +01:00
Thomas Kriechbaumer
ccd8e1e617 http: coverage++ (#2060) 2017-02-24 22:39:16 +01:00
Thomas Kriechbaumer
3c66eb5b4b move tests around (#2059) 2017-02-24 22:00:41 +01:00
Thomas Kriechbaumer
e89c785719 move tests around (#2058) 2017-02-24 21:39:13 +01:00
Thomas Kriechbaumer
6d9560df1d Merge pull request #2057 from Kriechi/flowfilter-coverage
flowfilter: coverage++
2017-02-24 20:57:54 +01:00
Thomas Kriechbaumer
d17309eda8 flowfilter: coverage++ 2017-02-24 18:47:56 +01:00
Thomas Kriechbaumer
83f1e2eec0 Merge pull request #2050 from Kriechi/create_connection
nuke tcp.Address and add proper IPv6 support
2017-02-24 15:42:52 +01:00
Maximilian Hils
fab3a8dcf4 fix constant definition 2017-02-24 13:39:03 +01:00
Thomas Kriechbaumer
3e9125a3c1 nuke tcp.Address and add proper IPv6 support 2017-02-24 13:39:02 +01:00
Thomas Kriechbaumer
d0d11cec7b Merge pull request #2055 from Kriechi/http2-normalize-headers
http2: normalize headers before sending
2017-02-23 16:04:11 +01:00
Thomas Kriechbaumer
bbe08bb8d9 http2: normalize headers before sending 2017-02-23 12:53:40 +01:00
Nikhil Soni
14d8c91250 Adds --keep-host-header option (#2039) 2017-02-22 21:49:31 +05:30
Maximilian Hils
ba76dbc672 Merge pull request #2048 from ujjwal96/ipv6-addresses
Absolute IPv6 addresses supported
2017-02-22 13:17:53 +01:00
Thomas Kriechbaumer
391f28f78c Merge pull request #2018 from dlenski/save_mitm_cert 2017-02-21 21:08:28 +01:00
Thomas Kriechbaumer
786cd214d6 update compat 2017-02-21 20:57:54 +01:00
Maximilian Hils
6b2383a9d8 minor adjustments 2017-02-21 20:57:29 +01:00
Daniel Lenski
48b3d1af2f store generated cert for each flow
fixes #1935
2017-02-21 20:57:29 +01:00
Thomas Kriechbaumer
cf0bce77a9 lint! 2017-02-21 19:40:06 +01:00
Thomas Kriechbaumer
0b1cdc382c bump dev version 2017-02-21 19:17:33 +01:00
Maximilian Hils
aa6b0f299e minor changes 2017-02-21 13:00:04 +01:00
Ujjwal Verma
11134b669e Absolute IPv6 addresses supported 2017-02-20 21:02:43 +05:30
366 changed files with 31892 additions and 25165 deletions

2
.gitignore vendored
View File

@@ -11,6 +11,7 @@ MANIFEST
.cache/
.tox*/
build/
mitmproxy/contrib/kaitaistruct/*.ksy
# UI
@@ -21,3 +22,4 @@ sslkeylogfile.log
.tox/
.python-version
coverage.xml
web/coverage/

View File

@@ -47,6 +47,19 @@ matrix:
env: TOXENV=individual_coverage
- python: 3.5
env: TOXENV=docs
- language: node_js
node_js: "node"
before_install:
- curl -o- -L https://yarnpkg.com/install.sh | bash
- export PATH=$HOME/.yarn/bin:$PATH
install:
- cd web && yarn
- yarn global add codecov
script: npm test && codecov
cache:
yarn: true
directories:
- web/node_modules
install:
- |

View File

@@ -1,3 +1,19 @@
28 April 2017: mitmproxy 2.0.2
* Fix mitmweb's Content-Security-Policy to work with Chrome 58+
* HTTP/2: actually use header normalization from hyper-h2
15 March 2017: mitmproxy 2.0.1
* bump cryptography dependency
* bump pyparsing dependency
* HTTP/2: use header normalization from hyper-h2
21 February 2017: mitmproxy 2.0
* HTTP/2 is now enabled by default.

View File

@@ -62,7 +62,7 @@ Development Setup
To get started hacking on mitmproxy, please follow the `advanced installation`_ steps to install mitmproxy from source, but stop right before running ``pip3 install mitmproxy``. Instead, do the following:
.. code-block:: text
.. code-block:: bash
git clone https://github.com/mitmproxy/mitmproxy.git
cd mitmproxy
@@ -80,7 +80,7 @@ The main executables for the project - ``mitmdump``, ``mitmproxy``,
virtualenv. After activating the virtualenv, they will be on your $PATH, and
you can run them like any other command:
.. code-block:: text
.. code-block:: bash
. venv/bin/activate # "venv\Scripts\activate" on Windows
mitmdump --version
@@ -91,13 +91,13 @@ Testing
If you've followed the procedure above, you already have all the development
requirements installed, and you can run the full test suite (including tests for code style and documentation) with tox_:
.. code-block:: text
.. code-block:: bash
tox
For speedier testing, we recommend you run `pytest`_ directly on individual test files or folders:
.. code-block:: text
.. code-block:: bash
cd test/mitmproxy/addons
pytest --cov mitmproxy.addons.anticache --looponfail test_anticache.py
@@ -114,7 +114,7 @@ The mitmproxy documentation is built using Sphinx_, which is installed
automatically if you set up a development environment as described above. After
installation, you can render the documentation like this:
.. code-block:: text
.. code-block:: bash
cd docs
make clean
@@ -136,7 +136,7 @@ This is automatically enforced on every PR. If we detect a linting error, the
PR checks will fail and block merging. You can run our lint checks yourself
with the following command:
.. code-block:: text
.. code-block:: bash
tox -e lint

View File

@@ -24,6 +24,9 @@ something like this:
Click on the relevant icon, follow the setup instructions for the platform
you're on and you are good to go.
For iOS version 10.3 or up, you need to make sure ``mitmproxy`` is enabled in
``Certificate Trust Settings``. You can check this by going to
``Settings > General > About > Certificate Trust Settings``.
Installing the mitmproxy CA certificate manually
------------------------------------------------
@@ -42,7 +45,7 @@ iOS
See http://jasdev.me/intercepting-ios-traffic
and http://web.archive.org/web/20150920082614/http://kb.mit.edu/confluence/pages/viewpage.action?pageId=152600377
and https://web.archive.org/web/20150920082614/http://kb.mit.edu/confluence/pages/viewpage.action?pageId=152600377
iOS Simulator
^^^^^^^^^^^^^
@@ -52,7 +55,7 @@ See https://github.com/ADVTOOLS/ADVTrustStore#how-to-use-advtruststore
Java
^^^^
See http://docs.oracle.com/cd/E19906-01/820-4916/geygn/index.html
See https://docs.oracle.com/cd/E19906-01/820-4916/geygn/index.html
Android/Android Simulator
^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -62,7 +65,7 @@ See http://wiki.cacert.org/FAQ/ImportRootCert#Android_Phones_.26_Tablets
Windows
^^^^^^^
See http://windows.microsoft.com/en-ca/windows/import-export-certificates-private-keys#1TC=windows-7
See https://web.archive.org/web/20160612045445/http://windows.microsoft.com/en-ca/windows/import-export-certificates-private-keys#1TC=windows-7
Windows (automated)
^^^^^^^^^^^^^^^^^^^
@@ -79,7 +82,7 @@ See https://support.apple.com/kb/PH7297?locale=en_US
Ubuntu/Debian
^^^^^^^^^^^^^
See http://askubuntu.com/questions/73287/how-do-i-install-a-root-certificate/94861#94861
See https://askubuntu.com/questions/73287/how-do-i-install-a-root-certificate/94861#94861
Mozilla Firefox
^^^^^^^^^^^^^^^
@@ -89,7 +92,7 @@ See https://wiki.mozilla.org/MozillaRootCertificate#Mozilla_Firefox
Chrome on Linux
^^^^^^^^^^^^^^^
See https://code.google.com/p/chromium/wiki/LinuxCertManagement
See https://stackoverflow.com/a/15076602/198996
The mitmproxy certificate authority
@@ -132,7 +135,7 @@ mitmproxy-ca-cert.cer Same file as .pem, but with an extension expected by some
Using a custom certificate
--------------------------
You can use your own certificate by passing the ``--cert [domain=]path_to_certificate`` option to
You can use your own (leaf) certificate by passing the ``--cert [domain=]path_to_certificate`` option to
mitmproxy. Mitmproxy then uses the provided certificate for interception of the
specified domain instead of generating a certificate signed by its own CA.
@@ -205,4 +208,4 @@ directory and uses this as the client cert.
.. _Certificate Pinning: http://security.stackexchange.com/questions/29988/what-is-certificate-pinning/
.. _Certificate Pinning: https://security.stackexchange.com/questions/29988/what-is-certificate-pinning/

View File

@@ -11,5 +11,5 @@ sure you capture an HTTP exchange in its totality. It's also often used during
================== ======================
command-line ``--anticache``
mitmproxy shortcut :kbd:`o` then :kbd:`a`
mitmproxy shortcut :kbd:`O` then :kbd:`a`
================== ======================

View File

@@ -23,7 +23,7 @@ How it works
================== ======================
command-line ``--ignore regex``
mitmproxy shortcut :kbd:`o` then :kbd:`I`
mitmproxy shortcut :kbd:`O` then :kbd:`I`
================== ======================

View File

@@ -48,25 +48,24 @@ In practice, it's pretty common for the replacement literal to be long and
complex. For instance, it might be an XSS exploit that weighs in at hundreds or
thousands of characters. To cope with this, there's a variation of the
replacement hook specifier that lets you load the replacement text from a file.
So, you might start **mitmdump** as follows:
To specify a file as replacement, prefix the file path with ``@``.
You might start **mitmdump** as follows:
>>> mitmdump --replace-from-file :~q:foo:~/xss-exploit
>>> mitmdump --replacements :~q:foo:@~/xss-exploit
This will load the replacement text from the file ``~/xss-exploit``.
Both the ``--replace`` and ``--replace-from-file`` flags can be passed multiple
times.
The ``--replacements`` flag can be passed multiple times.
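As a rough illustration of the ``@`` file syntax above, here is a hypothetical Python sketch that writes a payload file and builds the matching specifier (the path and payload are made up, not taken from the docs):

    # Hypothetical sketch: write a long payload to a file and build the
    # --replacements specifier that loads the replacement text via "@".
    from pathlib import Path

    payload_file = Path.home() / "xss-exploit"              # illustrative location
    payload_file.write_text("<script>alert(0)</script>" * 50)

    # ":" is the separator; the filter (~q), regex (foo) and "@file"
    # replacement follow the pattern shown in the docs above.
    spec = ":~q:foo:@{}".format(payload_file)
    print("mitmdump --replacements '{}'".format(spec))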
Interactively
-------------
The :kbd:`R` shortcut key in the mitmproxy options menu (:kbd:`o`) lets you add and edit
The :kbd:`R` shortcut key in the mitmproxy options menu (:kbd:`O`) lets you add and edit
replacement hooks using a built-in editor. The context-sensitive help (:kbd:`?`) has
complete usage information.
================== =======================
command-line ``--replace``,
``--replace-from-file``
mitmproxy shortcut :kbd:`o` then :kbd:`R`
command-line ``--replacements``
mitmproxy shortcut :kbd:`O` then :kbd:`R`
================== =======================

View File

@@ -31,7 +31,8 @@ Host Header
In reverse proxy mode, mitmproxy automatically rewrites the Host header to match the
upstream server. This allows mitmproxy to easily connect to existing endpoints on the
open web (e.g. ``mitmproxy -R https://example.com``).
open web (e.g. ``mitmproxy -R https://example.com``). You can disable this behaviour
by passing ``--keep-host-header`` on the command line.
However, keep in mind that absolute URLs within the returned document or HTTP redirects will
NOT be rewritten by mitmproxy. This means that if you click on a link for "http://example.com"
@@ -39,4 +40,4 @@ in the returned web page, you will be taken directly to that URL, bypassing mitm
One possible way to address this is to modify the hosts file of your OS so that "example.com"
resolves to your proxy's IP, and then access the proxy by going directly to example.com.
Make sure that your proxy can still resolve the original IP, or specify an IP in mitmproxy.
Make sure that your proxy can still resolve the original IP, or specify an IP in mitmproxy.

View File

@@ -31,7 +31,7 @@ in the past at the time of replay, and vice versa. Cookie expiry times are
updated in a similar way.
You can turn off response refreshing using the ``--norefresh`` argument, or using
the :kbd:`o` options shortcut within :program:`mitmproxy`.
the :kbd:`O` options shortcut within :program:`mitmproxy`.
Replaying a session recorded in Reverse-proxy Mode

View File

@@ -15,5 +15,5 @@ Example: Set the **Host** header to "example.com" for all requests.
================== =======================
command-line ``--setheader PATTERN``
mitmproxy shortcut :kbd:`o` then :kbd:`H`
mitmproxy shortcut :kbd:`O` then :kbd:`H`
================== =======================

View File

@@ -22,7 +22,7 @@ to interact with the secured resources.
================== ======================
command-line ``-t FILTER``
mitmproxy shortcut :kbd:`o` then :kbd:`t`
mitmproxy shortcut :kbd:`O` then :kbd:`t`
================== ======================
@@ -37,5 +37,5 @@ replay of HTTP Digest authentication.
================== ======================
command-line ``-u FILTER``
mitmproxy shortcut :kbd:`o` then :kbd:`A`
mitmproxy shortcut :kbd:`O` then :kbd:`A`
================== ======================

View File

@@ -19,7 +19,7 @@ How it works
================== ======================
command-line ``--tcp HOST``
mitmproxy shortcut :kbd:`o` then :kbd:`T`
mitmproxy shortcut :kbd:`O` then :kbd:`T`
================== ======================
For a detailed description how the hostname pattern works, please look at the :ref:`passthrough`

View File

@@ -19,5 +19,5 @@ Upstream cert sniffing is on by default, and can optionally be turned off.
================== ======================
command-line ``--no-upstream-cert``
mitmproxy shortcut :kbd:`o` then :kbd:`U`
mitmproxy shortcut :kbd:`O` then :kbd:`U`
================== ======================

View File

@@ -43,7 +43,7 @@ client connects to the proxy and makes a request that looks like this:
CONNECT example.com:443 HTTP/1.1
A conventional proxy can neither view nor manipulate an TLS-encrypted data
A conventional proxy can neither view nor manipulate a TLS-encrypted data
stream, so a CONNECT request simply asks the proxy to open a pipe between the
client and server. The proxy here is just a facilitator - it blindly forwards
data in both directions without knowing anything about the contents. The
@@ -63,7 +63,7 @@ exactly this attack, by allowing a trusted third-party to cryptographically sign
a server's certificates to verify that they are legit. If this signature doesn't
match or is from a non-trusted party, a secure client will simply drop the
connection and refuse to proceed. Despite the many shortcomings of the CA system
as it exists today, this is usually fatal to attempts to MITM an TLS connection
as it exists today, this is usually fatal to attempts to MITM a TLS connection
for analysis. Our answer to this conundrum is to become a trusted Certificate
Authority ourselves. Mitmproxy includes a full CA implementation that generates
interception certificates on the fly. To get the client to trust these
@@ -143,7 +143,7 @@ Let's put all of this together into the complete explicitly proxied HTTPS flow.
2. Mitmproxy responds with a ``200 Connection Established``, as if it has set up the CONNECT pipe.
3. The client believes it's talking to the remote server, and initiates the TLS connection.
It uses SNI to indicate the hostname it is connecting to.
4. Mitmproxy connects to the server, and establishes an TLS connection using the SNI hostname
4. Mitmproxy connects to the server, and establishes a TLS connection using the SNI hostname
indicated by the client.
5. The server responds with the matching certificate, which contains the CN and SAN values
needed to generate the interception certificate.
@@ -217,7 +217,7 @@ explicit HTTPS connections to establish the CN and SANs, and cope with SNI.
destination was.
3. The client believes it's talking to the remote server, and initiates the TLS connection.
It uses SNI to indicate the hostname it is connecting to.
4. Mitmproxy connects to the server, and establishes an TLS connection using the SNI hostname
4. Mitmproxy connects to the server, and establishes a TLS connection using the SNI hostname
indicated by the client.
5. The server responds with the matching certificate, which contains the CN and SAN values
needed to generate the interception certificate.

View File

@@ -110,11 +110,25 @@ libraries. This was tested on a fully patched installation of Fedora 24.
Make sure to have an up-to-date version of pip by running ``pip3 install -U pip``.
.. _install-source-opensuse:
Installation from Source on openSUSE
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This was tested on a fully patched installation of openSUSE Tumbleweed.
Please note that openSUSE Leap 42.2 only comes with Python 3.4.x, whereas mitmproxy requires Python 3.5 or above.
You can check your Python version by running ``python3 --version``.
.. code:: bash
sudo zypper install python3-pip python3-devel libffi-devel openssl-devel gcc-c++
sudo pip3 install mitmproxy
.. _install-source-windows:
🐱💻 Installation from Source on Windows
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Installation from Source on Windows
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.. note::
Mitmproxy's console interface is not supported on Windows, but you can use

View File

@@ -66,7 +66,7 @@ At the moment, the Grid Editor is used in four parts of mitmproxy:
- Editing request or response headers (:kbd:`e` for edit, then :kbd:`h` for headers in flow view)
- Editing a query string (:kbd:`e` for edit, then :kbd:`q` for query in flow view)
- Editing a URL-encoded form (:kbd:`e` for edit, then :kbd:`f` for form in flow view)
- Editing replacement patterns (:kbd:`o` for options, then :kbd:`R` for Replacement Patterns)
- Editing replacement patterns (:kbd:`O` for options, then :kbd:`R` for Replacement Patterns)
If there is no data, an empty editor will be started to let you add some.
Here is the editor showing the headers from a request:

View File

@@ -23,7 +23,7 @@ HTTP Request
mode if the server responds correctly. Apart from that, websocket
requests are just like any other, and all aspects of the request
can be over-ridden.
* - h\:\ :ref:`VALUE`\ =\ :ref:`VALUE`\
* - h\ :ref:`VALUE`\ =\ :ref:`VALUE`\
- Set a header.
* - r
- Set the **raw** flag on this response. Pathod will not calculate a
@@ -73,7 +73,7 @@ HTTP Response
* - m\ :ref:`VALUE`
- HTTP Reason message. Automatically chosen according to the response
code if not specified. (HTTP/1 only)
* - h\:\ :ref:`VALUE`\ =\ :ref:`VALUE`\
* - h\ :ref:`VALUE`\ =\ :ref:`VALUE`\
- Set a header.
* - r
- Set the **raw** flag on this response. Pathod will not calculate a

View File

@@ -29,6 +29,12 @@ will be added to all responses passing through the proxy:
>>> mitmdump -s add_header.py
Examples
--------
A collection of addons that demonstrate popular features can be found at :src:`examples/simple`.
Using classes
-------------
@@ -54,24 +60,8 @@ and is replaced by the class instance.
Handling arguments
------------------
Scripts can handle their own command-line arguments, just like any other Python
program. Let's build on the example above to do something slightly more
sophisticated - replace one value with another in all responses. Mitmproxy's
`HTTPRequest <api.html#mitmproxy.models.http.HTTPRequest>`_ and `HTTPResponse
<api.html#mitmproxy.models.http.HTTPResponse>`_ objects have a handy `replace
<api.html#mitmproxy.models.http.HTTPResponse.replace>`_ method that takes care
of all the details for us.
.. literalinclude:: ../../examples/simple/script_arguments.py
:caption: :src:`examples/simple/script_arguments.py`
:language: python
We can now call this script on the command-line like this:
>>> mitmdump -dd -s "./script_arguments.py html faketml"
Whenever a handler is called, mitmproxy rewrites the script environment so that
it sees its own arguments as if it was invoked from the command-line.
FIXME
Logging and the context

View File

@@ -5,7 +5,6 @@
| change_upstream_proxy.py | Dynamically change the upstream proxy. |
| dns_spoofing.py | Use mitmproxy in a DNS spoofing scenario. |
| dup_and_replay.py | Duplicates each request, changes it, and then replays the modified request. |
| flowbasic.py | Basic use of mitmproxy's FlowMaster directly. |
| full_transparency_shim.c | Setuid wrapper that can be used to run mitmproxy in full transparency mode, as a normal user. |
| har_dump.py | Dump flows as HAR files. |
| mitmproxywrapper.py | Bracket mitmproxy run with proxy enable/disable on OS X |
@@ -16,3 +15,4 @@
| stream_modify.py | Modify a streamed response body. |
| tcp_message.py | Modify a raw TCP connection |
| tls_passthrough.py | Use conditional TLS interception based on a user-defined strategy. |
| xss_scanner.py | Scan all visited webpages. |

View File

@@ -1,11 +1,12 @@
"""
This script makes it possible to use mitmproxy in scenarios where IP spoofing has been used to redirect
connections to mitmproxy. The way this works is that we rely on either the TLS Server Name Indication (SNI) or the
Host header of the HTTP request.
Of course, this is not foolproof - if an HTTPS connection comes without SNI, we don't
know the actual target and cannot construct a certificate that looks valid.
Similarly, if there's no Host header or a spoofed Host header, we're out of luck as well.
Using transparent mode is the better option most of the time.
This script makes it possible to use mitmproxy in scenarios where IP spoofing
has been used to redirect connections to mitmproxy. The way this works is that
we rely on either the TLS Server Name Indication (SNI) or the Host header of the
HTTP request. Of course, this is not foolproof - if an HTTPS connection comes
without SNI, we don't know the actual target and cannot construct a certificate
that looks valid. Similarly, if there's no Host header or a spoofed Host header,
we're out of luck as well. Using transparent mode is the better option most of
the time.
Usage:
mitmproxy
@@ -13,6 +14,8 @@ Usage:
-s dns_spoofing.py
# Used as the target location if neither SNI nor host header are present.
-R http://example.com/
# To avoid auto rewriting of host header by the reverse proxy target.
--keep-host-header
mitmdump
-p 80
-R http://localhost:443/
@@ -29,13 +32,6 @@ parse_host_header = re.compile(r"^(?P<host>[^:]+|\[.+\])(?::(?P<port>\d+))?$")
class Rerouter:
def requestheaders(self, flow):
"""
The original host header is retrieved early
before flow.request is replaced by mitmproxy's new outgoing request
"""
flow.metadata["original_host"] = flow.request.host_header
def request(self, flow):
if flow.client_conn.ssl_established:
flow.request.scheme = "https"
@@ -46,7 +42,7 @@ class Rerouter:
sni = None
port = 80
host_header = flow.metadata["original_host"]
host_header = flow.request.host_header
m = parse_host_header.match(host_header)
if m:
host_header = m.group("host").strip("[]")
@@ -58,5 +54,4 @@ class Rerouter:
flow.request.port = port
def start():
return Rerouter()
addons = [Rerouter()]

View File

@@ -1,43 +0,0 @@
#!/usr/bin/env python3
"""
This example shows how to build a proxy based on mitmproxy's Flow
primitives.
Heads Up: In the majority of cases, you want to use inline scripts.
Note that request and response messages are not automatically replied to,
so we need to implement handlers to do this.
"""
from mitmproxy import controller, options, master
from mitmproxy.proxy import ProxyServer, ProxyConfig
class MyMaster(master.Master):
def run(self):
try:
master.Master.run(self)
except KeyboardInterrupt:
self.shutdown()
@controller.handler
def request(self, f):
print("request", f)
@controller.handler
def response(self, f):
print("response", f)
@controller.handler
def error(self, f):
print("error", f)
@controller.handler
def log(self, l):
print("log", l.msg)
opts = options.Options(cadir="~/.mitmproxy/")
config = ProxyConfig(opts)
server = ProxyServer(config)
m = MyMaster(opts, server)
m.run()

View File

@@ -4,17 +4,17 @@ This inline script can be used to dump flows as HAR files.
import json
import sys
import base64
import zlib
import os
from datetime import datetime
import pytz
from datetime import timezone
import mitmproxy
from mitmproxy import version
from mitmproxy import ctx
from mitmproxy.utils import strutils
from mitmproxy.net.http import cookies
@@ -25,17 +25,13 @@ HAR = {}
SERVERS_SEEN = set()
def start():
"""
Called once on script startup before any other events.
"""
if len(sys.argv) != 2:
raise ValueError(
'Usage: -s "har_dump.py filename" '
'(- will output to stdout, filenames ending with .zhar '
'will result in compressed har)'
)
def load(l):
l.add_option(
"hardump", str, "", "HAR dump path.",
)
def configure(updated):
HAR.update({
"log": {
"version": "1.2",
@@ -89,7 +85,7 @@ def response(flow):
# Timings set to -1 will be ignored as per spec.
full_time = sum(v for v in timings.values() if v > -1)
started_date_time = format_datetime(datetime.utcfromtimestamp(flow.request.timestamp_start))
started_date_time = datetime.fromtimestamp(flow.request.timestamp_start, timezone.utc).isoformat()
# Response body size and encoding
response_body_size = len(flow.response.raw_content)
@@ -147,7 +143,7 @@ def response(flow):
}
if flow.server_conn.connected():
entry["serverIPAddress"] = str(flow.server_conn.ip_address.address[0])
entry["serverIPAddress"] = str(flow.server_conn.ip_address[0])
HAR["log"]["entries"].append(entry)
@@ -156,25 +152,20 @@ def done():
"""
Called once on script shutdown, after any other events.
"""
dump_file = sys.argv[1]
if ctx.options.hardump:
json_dump = json.dumps(HAR, indent=2) # type: str
json_dump = json.dumps(HAR, indent=2) # type: str
if ctx.options.hardump == '-':
mitmproxy.ctx.log(json_dump)
else:
raw = json_dump.encode() # type: bytes
if ctx.options.hardump.endswith('.zhar'):
raw = zlib.compress(raw, 9)
if dump_file == '-':
mitmproxy.ctx.log(json_dump)
else:
raw = json_dump.encode() # type: bytes
if dump_file.endswith('.zhar'):
raw = zlib.compress(raw, 9)
with open(os.path.expanduser(ctx.options.hardump), "wb") as f:
f.write(raw)
with open(os.path.expanduser(dump_file), "wb") as f:
f.write(raw)
mitmproxy.ctx.log("HAR dump finished (wrote %s bytes to file)" % len(json_dump))
def format_datetime(dt):
return dt.replace(tzinfo=pytz.timezone("UTC")).isoformat()
mitmproxy.ctx.log("HAR dump finished (wrote %s bytes to file)" % len(json_dump))
def format_cookies(cookie_list):
@@ -198,7 +189,7 @@ def format_cookies(cookie_list):
# Expiration time needs to be formatted
expire_ts = cookies.get_expiration_ts(attrs)
if expire_ts is not None:
cookie_har["expires"] = format_datetime(datetime.fromtimestamp(expire_ts))
cookie_har["expires"] = datetime.fromtimestamp(expire_ts, timezone.utc).isoformat()
rv.append(cookie_har)
@@ -210,7 +201,7 @@ def format_request_cookies(fields):
def format_response_cookies(fields):
return format_cookies((c[0], c[1].value, c[1].attrs) for c in fields)
return format_cookies((c[0], c[1][0], c[1][1]) for c in fields)
def name_value(obj):
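Stepping back from the hunk above: it drops the pytz dependency in favour of the standard library. A minimal sketch of the equivalent conversion (the timestamp value is illustrative):

    from datetime import datetime, timezone

    ts = 1488223200.0  # illustrative epoch timestamp
    # timezone-aware ISO 8601 string, no pytz needed
    started_date_time = datetime.fromtimestamp(ts, timezone.utc).isoformat()
    # -> '2017-02-27T19:20:00+00:00'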

View File

@@ -14,6 +14,6 @@ Usage:
"""
def start():
def load(l):
import pydevd
pydevd.settrace("localhost", port=5678, stdoutToServer=True, stderrToServer=True)

View File

@@ -23,10 +23,10 @@ Authors: Maximilian Hils, Matthew Tuusberg
import collections
import random
import sys
from enum import Enum
import mitmproxy
from mitmproxy import ctx
from mitmproxy.exceptions import TlsProtocolException
from mitmproxy.proxy.protocol import TlsLayer, RawTCPLayer
@@ -112,10 +112,16 @@ class TlsFeedback(TlsLayer):
tls_strategy = None
def start():
def load(l):
l.add_option(
"tlsstrat", int, 0, "TLS passthrough strategy (0-100)",
)
def configure(updated):
global tls_strategy
if len(sys.argv) == 2:
tls_strategy = ProbabilisticStrategy(float(sys.argv[1]))
if ctx.options.tlsstrat > 0:
tls_strategy = ProbabilisticStrategy(float(ctx.options.tlsstrat) / 100.0)
else:
tls_strategy = ConservativeStrategy()

407
examples/complex/xss_scanner.py Executable file
View File

@@ -0,0 +1,407 @@
"""
__ __ _____ _____ _____
\ \ / // ____/ ____| / ____|
\ V /| (___| (___ | (___ ___ __ _ _ __ _ __ ___ _ __
> < \___ \\___ \ \___ \ / __/ _` | '_ \| '_ \ / _ \ '__|
/ . \ ____) |___) | ____) | (_| (_| | | | | | | | __/ |
/_/ \_\_____/_____/ |_____/ \___\__,_|_| |_|_| |_|\___|_|
This script automatically scans all visited webpages for XSS and SQLi vulnerabilities.
Usage: mitmproxy -s xss_scanner.py
This script scans for vulnerabilities by injecting a fuzzing payload (see PAYLOAD below) into 4 different places
and examining the HTML to look for XSS and SQLi injection vulnerabilities. The XSS scanning functionality works by
looking to see whether it is possible to inject HTML based off of where the payload appears in the page and what
characters are escaped. In addition, it also looks for any script tags that load javascript from unclaimed domains.
The SQLi scanning functionality works by using regular expressions to look for errors from a number of different
common databases. Since it is only looking for errors, it will not find blind SQLi vulnerabilities.
The 4 places it injects the payload into are:
1. URLs (e.g. https://example.com/ -> https://example.com/PAYLOAD/)
2. Queries (e.g. https://example.com/index.html?a=b -> https://example.com/index.html?a=PAYLOAD)
3. Referers (e.g. The referer changes from https://example.com to PAYLOAD)
4. User Agents (e.g. The UA changes from Chrome to PAYLOAD)
Reports from this script show up in the event log (viewable by pressing e) and formatted like:
===== XSS Found ====
XSS URL: http://daviddworken.com/vulnerableUA.php
Injection Point: User Agent
Suggested Exploit: <script>alert(0)</script>
Line: 1029zxcs'd"ao<ac>so[sb]po(pc)se;sl/bsl\eq=3847asd
"""
from mitmproxy import ctx
from socket import gaierror, gethostbyname
from urllib.parse import urlparse
import requests
import re
from html.parser import HTMLParser
from mitmproxy import http
from typing import Dict, Union, Tuple, Optional, List, NamedTuple
# The actual payload is put between a frontWall and a backWall to make it easy
# to locate the payload with regular expressions
FRONT_WALL = b"1029zxc"
BACK_WALL = b"3847asd"
PAYLOAD = b"""s'd"ao<ac>so[sb]po(pc)se;sl/bsl\\eq="""
FULL_PAYLOAD = FRONT_WALL + PAYLOAD + BACK_WALL
# A XSSData is a named tuple with the following fields:
# - url -> str
# - injection_point -> str
# - exploit -> str
# - line -> str
XSSData = NamedTuple('XSSData', [('url', str),
('injection_point', str),
('exploit', str),
('line', str)])
# A SQLiData is named tuple with the following fields:
# - url -> str
# - injection_point -> str
# - regex -> str
# - dbms -> str
SQLiData = NamedTuple('SQLiData', [('url', str),
('injection_point', str),
('regex', str),
('dbms', str)])
VulnData = Tuple[Optional[XSSData], Optional[SQLiData]]
Cookies = Dict[str, str]
def get_cookies(flow: http.HTTPFlow) -> Cookies:
""" Return a dict going from cookie names to cookie values
- Note that it includes both the cookies sent in the original request and
the cookies sent by the server """
return {name: value for name, value in flow.request.cookies.fields}
def find_unclaimed_URLs(body: Union[str, bytes], requestUrl: bytes) -> None:
""" Look for unclaimed URLs in script tags and log them if found"""
class ScriptURLExtractor(HTMLParser):
script_URLs = []
def handle_starttag(self, tag, attrs):
if tag == "script" and "src" in [name for name, value in attrs]:
for name, value in attrs:
if name == "src":
self.script_URLs.append(value)
parser = ScriptURLExtractor()
try:
parser.feed(body)
except TypeError:
parser.feed(body.decode('utf-8'))
for url in parser.script_URLs:
parser = urlparse(url)
domain = parser.netloc
try:
gethostbyname(domain)
except gaierror:
ctx.log.error("XSS found in %s due to unclaimed URL \"%s\" in script tag." % (requestUrl, url))
def test_end_of_URL_injection(original_body: str, request_URL: str, cookies: Cookies) -> VulnData:
""" Test the given URL for XSS via injection onto the end of the URL and
log the XSS if found """
parsed_URL = urlparse(request_URL)
path = parsed_URL.path
if path != "" and path[-1] != "/": # ensure the path ends in a /
path += "/"
path += FULL_PAYLOAD.decode('utf-8') # the path must be a string while the payload is bytes
url = parsed_URL._replace(path=path).geturl()
body = requests.get(url, cookies=cookies).text.lower()
xss_info = get_XSS_data(body, url, "End of URL")
sqli_info = get_SQLi_data(body, original_body, url, "End of URL")
return xss_info, sqli_info
def test_referer_injection(original_body: str, request_URL: str, cookies: Cookies) -> VulnData:
""" Test the given URL for XSS via injection into the referer and
log the XSS if found """
body = requests.get(request_URL, headers={'referer': FULL_PAYLOAD}, cookies=cookies).text.lower()
xss_info = get_XSS_data(body, request_URL, "Referer")
sqli_info = get_SQLi_data(body, original_body, request_URL, "Referer")
return xss_info, sqli_info
def test_user_agent_injection(original_body: str, request_URL: str, cookies: Cookies) -> VulnData:
""" Test the given URL for XSS via injection into the user agent and
log the XSS if found """
body = requests.get(request_URL, headers={'User-Agent': FULL_PAYLOAD}, cookies=cookies).text.lower()
xss_info = get_XSS_data(body, request_URL, "User Agent")
sqli_info = get_SQLi_data(body, original_body, request_URL, "User Agent")
return xss_info, sqli_info
def test_query_injection(original_body: str, request_URL: str, cookies: Cookies):
""" Test the given URL for XSS via injection into URL queries and
log the XSS if found """
parsed_URL = urlparse(request_URL)
query_string = parsed_URL.query
# queries is a list of parameters where each parameter is set to the payload
queries = [query.split("=")[0] + "=" + FULL_PAYLOAD.decode('utf-8') for query in query_string.split("&")]
new_query_string = "&".join(queries)
new_URL = parsed_URL._replace(query=new_query_string).geturl()
body = requests.get(new_URL, cookies=cookies).text.lower()
xss_info = get_XSS_data(body, new_URL, "Query")
sqli_info = get_SQLi_data(body, original_body, new_URL, "Query")
return xss_info, sqli_info
def log_XSS_data(xss_info: Optional[XSSData]) -> None:
""" Log information about the given XSS to mitmproxy """
# If it is None, then there is no info to log
if not xss_info:
return
ctx.log.error("===== XSS Found ====")
ctx.log.error("XSS URL: %s" % xss_info.url)
ctx.log.error("Injection Point: %s" % xss_info.injection_point)
ctx.log.error("Suggested Exploit: %s" % xss_info.exploit)
ctx.log.error("Line: %s" % xss_info.line)
def log_SQLi_data(sqli_info: Optional[SQLiData]) -> None:
""" Log information about the given SQLi to mitmproxy """
if not sqli_info:
return
ctx.log.error("===== SQLi Found =====")
ctx.log.error("SQLi URL: %s" % sqli_info.url.decode('utf-8'))
ctx.log.error("Injection Point: %s" % sqli_info.injection_point.decode('utf-8'))
ctx.log.error("Regex used: %s" % sqli_info.regex.decode('utf-8'))
ctx.log.error("Suspected DBMS: %s" % sqli_info.dbms.decode('utf-8'))
def get_SQLi_data(new_body: str, original_body: str, request_URL: str, injection_point: str) -> Optional[SQLiData]:
""" Return a SQLiDict if there is a SQLi otherwise return None
String String URL String -> (SQLiDict or None) """
# Regexes taken from Damn Small SQLi Scanner: https://github.com/stamparm/DSSS/blob/master/dsss.py#L17
DBMS_ERRORS = {
"MySQL": (r"SQL syntax.*MySQL", r"Warning.*mysql_.*", r"valid MySQL result", r"MySqlClient\."),
"PostgreSQL": (r"PostgreSQL.*ERROR", r"Warning.*\Wpg_.*", r"valid PostgreSQL result", r"Npgsql\."),
"Microsoft SQL Server": (r"Driver.* SQL[\-\_\ ]*Server", r"OLE DB.* SQL Server", r"(\W|\A)SQL Server.*Driver",
r"Warning.*mssql_.*", r"(\W|\A)SQL Server.*[0-9a-fA-F]{8}",
r"(?s)Exception.*\WSystem\.Data\.SqlClient\.", r"(?s)Exception.*\WRoadhouse\.Cms\."),
"Microsoft Access": (r"Microsoft Access Driver", r"JET Database Engine", r"Access Database Engine"),
"Oracle": (r"\bORA-[0-9][0-9][0-9][0-9]", r"Oracle error", r"Oracle.*Driver", r"Warning.*\Woci_.*", r"Warning.*\Wora_.*"),
"IBM DB2": (r"CLI Driver.*DB2", r"DB2 SQL error", r"\bdb2_\w+\("),
"SQLite": (r"SQLite/JDBCDriver", r"SQLite.Exception", r"System.Data.SQLite.SQLiteException", r"Warning.*sqlite_.*",
r"Warning.*SQLite3::", r"\[SQLITE_ERROR\]"),
"Sybase": (r"(?i)Warning.*sybase.*", r"Sybase message", r"Sybase.*Server message.*"),
}
for dbms, regexes in DBMS_ERRORS.items():
for regex in regexes:
if re.search(regex, new_body) and not re.search(regex, original_body):
return SQLiData(request_URL,
injection_point,
regex,
dbms)
# A qc is either ' or "
def inside_quote(qc: str, substring: bytes, text_index: int, body: bytes) -> bool:
""" Whether the Numberth occurence of the first string in the second
string is inside quotes as defined by the supplied QuoteChar """
substring = substring.decode('utf-8')
body = body.decode('utf-8')
num_substrings_found = 0
in_quote = False
for index, char in enumerate(body):
# Whether the next chunk of len(substring) chars is the substring
next_part_is_substring = (
(not (index + len(substring) > len(body))) and
(body[index:index + len(substring)] == substring)
)
# Whether this char is escaped with a \
is_not_escaped = (
(index - 1 < 0 or index - 1 > len(body)) or
(body[index - 1] != "\\")
)
if char == qc and is_not_escaped:
in_quote = not in_quote
if next_part_is_substring:
if num_substrings_found == text_index:
return in_quote
num_substrings_found += 1
return False
def paths_to_text(html: str, str: str) -> List[str]:
""" Return list of Paths to a given str in the given HTML tree
- Note that it does a BFS """
def remove_last_occurence_of_sub_string(str: str, substr: str):
""" Delete the last occurence of substr from str
String String -> String
"""
index = str.rfind(substr)
return str[:index] + str[index + len(substr):]
class PathHTMLParser(HTMLParser):
currentPath = ""
paths = []
def handle_starttag(self, tag, attrs):
self.currentPath += ("/" + tag)
def handle_endtag(self, tag):
self.currentPath = remove_last_occurence_of_sub_string(self.currentPath, "/" + tag)
def handle_data(self, data):
if str in data:
self.paths.append(self.currentPath)
parser = PathHTMLParser()
parser.feed(html)
return parser.paths
def get_XSS_data(body: str, request_URL: str, injection_point: str) -> Optional[XSSData]:
""" Return a XSSDict if there is a XSS otherwise return None """
def in_script(text, index, body) -> bool:
""" Whether the Numberth occurence of the first string in the second
string is inside a script tag """
paths = paths_to_text(body.decode('utf-8'), text.decode("utf-8"))
try:
path = paths[index]
return "script" in path
except IndexError:
return False
def in_HTML(text: bytes, index: int, body: bytes) -> bool:
""" Whether the Numberth occurence of the first string in the second
string is inside the HTML but not inside a script tag or part of
a HTML attribute"""
# if there is a < then lxml will interpret that as a tag, so only search for the stuff before it
text = text.split(b"<")[0]
paths = paths_to_text(body.decode('utf-8'), text.decode("utf-8"))
try:
path = paths[index]
return "script" not in path
except IndexError:
return False
def inject_javascript_handler(html: str) -> bool:
""" Whether you can inject a Javascript:alert(0) as a link """
class injectJSHandlerHTMLParser(HTMLParser):
injectJSHandler = False
def handle_starttag(self, tag, attrs):
for name, value in attrs:
if name == "href" and value.startswith(FRONT_WALL.decode('utf-8')):
self.injectJSHandler = True
parser = injectJSHandlerHTMLParser()
parser.feed(html)
return parser.injectJSHandler
# Only convert the body to bytes if needed
if isinstance(body, str):
body = bytes(body, 'utf-8')
# Regex for between 24 and 72 (aka 24*3) characters encapsulated by the walls
regex = re.compile(b"""%s.{24,72}?%s""" % (FRONT_WALL, BACK_WALL))
matches = regex.findall(body)
for index, match in enumerate(matches):
# Where the string is injected into the HTML
in_script = in_script(match, index, body)
in_HTML = in_HTML(match, index, body)
in_tag = not in_script and not in_HTML
in_single_quotes = inside_quote("'", match, index, body)
in_double_quotes = inside_quote('"', match, index, body)
# Whether you can inject:
inject_open_angle = b"ao<ac" in match # open angle brackets
inject_close_angle = b"ac>so" in match # close angle brackets
inject_single_quotes = b"s'd" in match # single quotes
inject_double_quotes = b'd"ao' in match # double quotes
inject_slash = b"sl/bsl" in match # forward slashes
inject_semi = b"se;sl" in match # semicolons
inject_equals = b"eq=" in match # equals sign
if in_script and inject_slash and inject_open_angle and inject_close_angle: # e.g. <script>PAYLOAD</script>
return XSSData(request_URL,
injection_point,
'</script><script>alert(0)</script><script>',
match.decode('utf-8'))
elif in_script and in_single_quotes and inject_single_quotes and inject_semi: # e.g. <script>t='PAYLOAD';</script>
return XSSData(request_URL,
injection_point,
"';alert(0);g='",
match.decode('utf-8'))
elif in_script and in_double_quotes and inject_double_quotes and inject_semi: # e.g. <script>t="PAYLOAD";</script>
return XSSData(request_URL,
injection_point,
'";alert(0);g="',
match.decode('utf-8'))
elif in_tag and in_single_quotes and inject_single_quotes and inject_open_angle and inject_close_angle and inject_slash:
# e.g. <a href='PAYLOAD'>Test</a>
return XSSData(request_URL,
injection_point,
"'><script>alert(0)</script>",
match.decode('utf-8'))
elif in_tag and in_double_quotes and inject_double_quotes and inject_open_angle and inject_close_angle and inject_slash:
# e.g. <a href="PAYLOAD">Test</a>
return XSSData(request_URL,
injection_point,
'"><script>alert(0)</script>',
match.decode('utf-8'))
elif in_tag and not in_double_quotes and not in_single_quotes and inject_open_angle and inject_close_angle and inject_slash:
# e.g. <a href=PAYLOAD>Test</a>
return XSSData(request_URL,
injection_point,
'><script>alert(0)</script>',
match.decode('utf-8'))
elif inject_javascript_handler(body.decode('utf-8')): # e.g. <html><a href=PAYLOAD>Test</a>
return XSSData(request_URL,
injection_point,
'Javascript:alert(0)',
match.decode('utf-8'))
elif in_tag and in_double_quotes and inject_double_quotes and inject_equals: # e.g. <a href="PAYLOAD">Test</a>
return XSSData(request_URL,
injection_point,
'" onmouseover="alert(0)" t="',
match.decode('utf-8'))
elif in_tag and in_single_quotes and inject_single_quotes and inject_equals: # e.g. <a href='PAYLOAD'>Test</a>
return XSSData(request_URL,
injection_point,
"' onmouseover='alert(0)' t='",
match.decode('utf-8'))
elif in_tag and not in_single_quotes and not in_double_quotes and inject_equals: # e.g. <a href=PAYLOAD>Test</a>
return XSSData(request_URL,
injection_point,
" onmouseover=alert(0) t=",
match.decode('utf-8'))
elif in_HTML and not in_script and inject_open_angle and inject_close_angle and inject_slash: # e.g. <html>PAYLOAD</html>
return XSSData(request_URL,
injection_point,
'<script>alert(0)</script>',
match.decode('utf-8'))
else:
return None
# response is mitmproxy's entry point
def response(flow: http.HTTPFlow) -> None:
cookiesDict = get_cookies(flow)
# Example: http://xss.guru/unclaimedScriptTag.html
find_unclaimed_URLs(flow.response.content, flow.request.url)
results = test_end_of_URL_injection(flow.response.content.decode('utf-8'), flow.request.url, cookiesDict)
log_XSS_data(results[0])
log_SQLi_data(results[1])
# Example: https://daviddworken.com/vulnerableReferer.php
results = test_referer_injection(flow.response.content.decode('utf-8'), flow.request.url, cookiesDict)
log_XSS_data(results[0])
log_SQLi_data(results[1])
# Example: https://daviddworken.com/vulnerableUA.php
results = test_user_agent_injection(flow.response.content.decode('utf-8'), flow.request.url, cookiesDict)
log_XSS_data(results[0])
log_SQLi_data(results[1])
if "?" in flow.request.url:
# Example: https://daviddworken.com/vulnerable.php?name=
results = test_query_injection(flow.response.content.decode('utf-8'), flow.request.url, cookiesDict)
log_XSS_data(results[0])
log_SQLi_data(results[1])

View File

@@ -1,2 +1,5 @@
def response(flow):
from mitmproxy import http
def response(flow: http.HTTPFlow) -> None:
flow.response.headers["newheader"] = "foo"

View File

@@ -1,7 +1,9 @@
from mitmproxy import http
class AddHeader:
def response(self, flow):
def response(self, flow: http.HTTPFlow) -> None:
flow.response.headers["newheader"] = "foo"
def start():
return AddHeader()
addons = [AddHeader()]

View File

@@ -3,6 +3,10 @@ This example shows how one can add a custom contentview to mitmproxy.
The content view API is explained in the mitmproxy.contentviews module.
"""
from mitmproxy import contentviews
import typing
CVIEWSWAPCASE = typing.Tuple[str, typing.Iterable[typing.List[typing.Tuple[str, typing.AnyStr]]]]
class ViewSwapCase(contentviews.View):
@@ -13,14 +17,14 @@ class ViewSwapCase(contentviews.View):
prompt = ("swap case text", "z")
content_types = ["text/plain"]
def __call__(self, data: bytes, **metadata):
def __call__(self, data: typing.AnyStr, **metadata) -> CVIEWSWAPCASE:
return "case-swapped text", contentviews.format_text(data.swapcase())
view = ViewSwapCase()
def start():
def load(l):
contentviews.add(view)

View File

@@ -0,0 +1,11 @@
from mitmproxy import ctx
def load(l):
ctx.log.info("Registering option 'custom'")
l.add_option("custom", bool, False, "A custom option")
def configure(updated):
if "custom" in updated:
ctx.log.info("custom option value: %s" % ctx.options.custom)

View File

@@ -1,23 +1,26 @@
"""
This script demonstrates how to use mitmproxy's filter pattern in scripts.
Usage:
mitmdump -s "flowfilter.py FILTER"
"""
import sys
from mitmproxy import flowfilter
from mitmproxy import ctx, http
class Filter:
def __init__(self, spec):
self.filter = flowfilter.parse(spec)
def __init__(self):
self.filter = None # type: flowfilter.TFilter
def response(self, flow):
def configure(self, updated):
self.filter = flowfilter.parse(ctx.options.flowfilter)
def load(self, l):
l.add_option(
"flowfilter", str, "", "Check that flow matches filter."
)
def response(self, flow: http.HTTPFlow) -> None:
if flowfilter.match(self.filter, flow):
print("Flow matches filter:")
print(flow)
def start():
if len(sys.argv) != 2:
raise ValueError("Usage: -s 'filt.py FILTER'")
return Filter(sys.argv[1])
addons = [Filter()]
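For context, a few illustrative filter expressions that ``flowfilter.parse`` accepts (standard mitmproxy filter syntax; the concrete expression below is made up):

    from mitmproxy import flowfilter

    #   ~u <regex>   match the URL
    #   ~m <method>  match the HTTP method
    #   expressions combine with "&" (and), "|" (or) and "!" (not)
    flt = flowfilter.parse("~u /api/ & ~m POST")

    # flowfilter.match(flt, flow) would then return True for POST requests
    # whose URL contains "/api/", mirroring the addon's response() check above.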

View File

@@ -1,13 +1,15 @@
#!/usr/bin/env python
# type: ignore
#
# Simple script showing how to read a mitmproxy dump file
#
from mitmproxy import io
from mitmproxy.exceptions import FlowReadException
import pprint
import sys
with open(sys.argv[1], "rb") as logfile:
freader = io.FlowReader(logfile)
pp = pprint.PrettyPrinter(indent=4)
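The hunk above only shows the setup; as a self-contained sketch of how such a reader is typically consumed (the loop is an assumption about usage, not part of the diff):

    import pprint
    import sys

    from mitmproxy import io
    from mitmproxy.exceptions import FlowReadException

    pp = pprint.PrettyPrinter(indent=4)
    with open(sys.argv[1], "rb") as logfile:
        freader = io.FlowReader(logfile)
        try:
            # stream() yields flows one by one and raises FlowReadException
            # if the dump file is corrupted
            for f in freader.stream():
                print(f)
                pp.pprint(f.get_state())
        except FlowReadException as e:
            print("Flow file corrupted: {}".format(e))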

View File

@@ -7,23 +7,21 @@ to multiple files in parallel.
"""
import random
import sys
from mitmproxy import io
from mitmproxy import io, http
import typing # noqa
class Writer:
def __init__(self, path):
def __init__(self, path: str) -> None:
if path == "-":
f = sys.stdout
f = sys.stdout # type: typing.IO[typing.Any]
else:
f = open(path, "wb")
self.w = io.FlowWriter(f)
def response(self, flow):
def response(self, flow: http.HTTPFlow) -> None:
if random.choice([True, False]):
self.w.add(flow)
def start():
if len(sys.argv) != 2:
raise ValueError('Usage: -s "flowriter.py filename"')
return Writer(sys.argv[1])
addons = [Writer(sys.argv[1])]

View File

@@ -7,6 +7,6 @@ If you want to help us out: https://github.com/mitmproxy/mitmproxy/issues/1530 :
from mitmproxy import ctx
def start():
def load(l):
ctx.log.info("This is some informative text.")
ctx.log.error("This is an error.")

View File

@@ -1,29 +1,26 @@
# Usage: mitmdump -s "iframe_injector.py url"
# (this script works best with --anticache)
import sys
from bs4 import BeautifulSoup
from mitmproxy import ctx, http
class Injector:
def __init__(self, iframe_url):
self.iframe_url = iframe_url
def load(self, loader):
loader.add_option(
"iframe", str, "", "IFrame to inject"
)
def response(self, flow):
if flow.request.host in self.iframe_url:
return
html = BeautifulSoup(flow.response.content, "html.parser")
if html.body:
iframe = html.new_tag(
"iframe",
src=self.iframe_url,
frameborder=0,
height=0,
width=0)
html.body.insert(0, iframe)
flow.response.content = str(html).encode("utf8")
def response(self, flow: http.HTTPFlow) -> None:
if ctx.options.iframe:
html = BeautifulSoup(flow.response.content, "html.parser")
if html.body:
iframe = html.new_tag(
"iframe",
src=ctx.options.iframe,
frameborder=0,
height=0,
width=0)
html.body.insert(0, iframe)
flow.response.content = str(html).encode("utf8")
def start():
if len(sys.argv) != 2:
raise ValueError('Usage: -s "iframe_injector.py url"')
return Injector(sys.argv[1])
addons = [Injector()]

View File

@@ -1,4 +1,7 @@
def request(flow):
from mitmproxy import http
def request(flow: http.HTTPFlow) -> None:
if flow.request.urlencoded_form:
# If there's already a form, one can just add items to the dict:
flow.request.urlencoded_form["mitmproxy"] = "rocks"

View File

@@ -1,2 +1,5 @@
def request(flow):
from mitmproxy import http
def request(flow: http.HTTPFlow) -> None:
flow.request.query["mitmproxy"] = "rocks"

View File

@@ -1,9 +1,10 @@
"""
This example shows two ways to redirect flows to another server.
"""
from mitmproxy import http
def request(flow):
def request(flow: http.HTTPFlow) -> None:
# pretty_host takes the "Host" header of the request into account,
# which is useful in transparent mode where we usually only have the IP
# otherwise.

View File

@@ -1,17 +0,0 @@
import argparse
class Replacer:
def __init__(self, src, dst):
self.src, self.dst = src, dst
def response(self, flow):
flow.response.replace(self.src, self.dst)
def start():
parser = argparse.ArgumentParser()
parser.add_argument("src", type=str)
parser.add_argument("dst", type=str)
args = parser.parse_args()
return Replacer(args.src, args.dst)

View File

@@ -5,7 +5,7 @@ without sending any data to the remote server.
from mitmproxy import http
def request(flow):
def request(flow: http.HTTPFlow) -> None:
# pretty_url takes the "Host" header of the request into account, which
# is useful in transparent mode where we usually only have the IP otherwise.

View File

@@ -2,11 +2,11 @@
This script rotates all images passing through the proxy by 180 degrees.
"""
import io
from PIL import Image
from mitmproxy import http
def response(flow):
def response(flow: http.HTTPFlow) -> None:
if flow.response.headers.get("content-type", "").startswith("image"):
s = io.BytesIO(flow.response.content)
img = Image.open(s).rotate(180)

View File

@@ -10,14 +10,14 @@ app = Flask("proxapp")
@app.route('/')
def hello_world():
def hello_world() -> str:
return 'Hello World!'
def start():
# Host app at the magic domain "proxapp" on port 80. Requests to this
def load(l):
# Host app at the magic domain "proxapp.local" on port 80. Requests to this
# domain and port combination will now be routed to the WSGI app instance.
return wsgiapp.WSGIApp(app, "proxapp", 80)
return wsgiapp.WSGIApp(app, "proxapp.local", 80)
# SSL works too, but the magic domain needs to be resolvable from the mitmproxy machine due to mitmproxy's design.
# mitmproxy will connect to said domain and serve its certificate (unless --no-upstream-cert is set)

View File

@@ -1,4 +1,13 @@
import typing
import traceback
import contextlib
import sys
from mitmproxy import exceptions
from mitmproxy import eventsequence
from mitmproxy import controller
from mitmproxy import flow
from . import ctx
import pprint
@@ -6,18 +15,115 @@ def _get_name(itm):
return getattr(itm, "name", itm.__class__.__name__.lower())
def cut_traceback(tb, func_name):
"""
Cut off a traceback at the function with the given name.
The func_name's frame is excluded.
Args:
tb: traceback object, as returned by sys.exc_info()[2]
func_name: function name
Returns:
Reduced traceback.
"""
tb_orig = tb
for _, _, fname, _ in traceback.extract_tb(tb):
tb = tb.tb_next
if fname == func_name:
break
return tb or tb_orig
class StreamLog:
"""
A class for redirecting output using contextlib.
"""
def __init__(self, log):
self.log = log
def write(self, buf):
if buf.strip():
self.log(buf)
def flush(self): # pragma: no cover
# Click uses flush sometimes, so we dummy it up
pass
@contextlib.contextmanager
def safecall():
stdout_replacement = StreamLog(ctx.log.warn)
try:
with contextlib.redirect_stdout(stdout_replacement):
yield
except (exceptions.AddonHalt, exceptions.OptionsError):
raise
except Exception as e:
etype, value, tb = sys.exc_info()
tb = cut_traceback(tb, "invoke_addon").tb_next
ctx.log.error(
"Addon error: %s" % "".join(
traceback.format_exception(etype, value, tb)
)
)
class Loader:
"""
A loader object is passed to the load() event when addons start up.
"""
def __init__(self, master):
self.master = master
def add_option(
self,
name: str,
typespec: type,
default: typing.Any,
help: str,
choices: typing.Optional[typing.Sequence[str]] = None
) -> None:
if name in self.master.options:
ctx.log.warn("Over-riding existing option %s" % name)
self.master.options.add_option(
name,
typespec,
default,
help,
choices
)
def add_command(self, path: str, func: typing.Callable) -> None:
self.master.commands.add(path, func)
def traverse(chain):
"""
Recursively traverse an addon chain.
"""
for a in chain:
yield a
if hasattr(a, "addons"):
yield from traverse(a.addons)
class AddonManager:
def __init__(self, master):
self.lookup = {}
self.chain = []
self.master = master
master.options.changed.connect(self._options_update)
master.options.changed.connect(self._configure_all)
def _configure_all(self, options, updated):
self.trigger("configure", updated)
def clear(self):
"""
Remove all addons.
"""
self.done()
self.chain = []
for i in self.chain:
self.remove(i)
def get(self, name):
"""
@@ -25,45 +131,58 @@ class AddonManager:
attribute on the instance, or the lower case class name if that
does not exist.
"""
for i in self.chain:
if name == _get_name(i):
return i
return self.lookup.get(name, None)
def _options_update(self, options, updated):
for i in self.chain:
with self.master.handlecontext():
self.invoke_with_context(i, "configure", options, updated)
def register(self, addon):
"""
Register an addon, call its load event, and then register all its
sub-addons. This should be used by addons that dynamically manage
addons.
def startup(self, s):
If the calling addon is already running, it should follow with
running and configure events. Must be called within a current
context.
"""
Run startup events on addon.
"""
self.invoke_with_context(s, "start")
self.invoke_with_context(
s,
"configure",
self.master.options,
self.master.options.keys()
)
for a in traverse([addon]):
name = _get_name(a)
if name in self.lookup:
raise exceptions.AddonManagerError(
"An addon called '%s' already exists." % name
)
l = Loader(self.master)
self.invoke_addon(addon, "load", l)
for a in traverse([addon]):
name = _get_name(a)
self.lookup[name] = a
for a in traverse([addon]):
self.master.commands.collect_commands(a)
return addon
def add(self, *addons):
"""
Add addons to the end of the chain, and run their startup events.
Add addons to the end of the chain, and run their load event.
If any addon has sub-addons, they are registered.
"""
self.chain.extend(addons)
for i in addons:
self.startup(i)
with self.master.handlecontext():
for i in addons:
self.chain.append(self.register(i))
def remove(self, addon):
"""
Remove an addon from the chain, and run its done events.
"""
self.chain = [i for i in self.chain if i is not addon]
self.invoke_with_context(addon, "done")
Remove an addon and all its sub-addons.
def done(self):
for i in self.chain:
self.invoke_with_context(i, "done")
If the addon is not in the chain - that is, if it's managed by a
parent addon - it's the parent's responsibility to remove it from
its own addons attribute.
"""
for a in traverse([addon]):
n = _get_name(a)
if n not in self.lookup:
raise exceptions.AddonManagerError("No such addon: %s" % n)
self.chain = [i for i in self.chain if i is not a]
del self.lookup[_get_name(a)]
with self.master.handlecontext():
self.invoke_addon(a, "done")
def __len__(self):
return len(self.chain)
@@ -71,22 +190,59 @@ class AddonManager:
def __str__(self):
return pprint.pformat([str(i) for i in self.chain])
def invoke_with_context(self, addon, name, *args, **kwargs):
def handle_lifecycle(self, name, message):
"""
Handle a lifecycle event.
"""
if not hasattr(message, "reply"): # pragma: no cover
raise exceptions.ControlException(
"Message %s has no reply attribute" % message
)
# We can use DummyReply objects multiple times. We only clear them up on
# the next handler so that we can access value and state in the
# meantime.
if isinstance(message.reply, controller.DummyReply):
message.reply.reset()
self.trigger(name, message)
if message.reply.state != "taken":
message.reply.take()
if not message.reply.has_message:
message.reply.ack()
message.reply.commit()
if isinstance(message.reply, controller.DummyReply):
message.reply.mark_reset()
if isinstance(message, flow.Flow):
self.trigger("update", [message])
def invoke_addon(self, addon, name, *args, **kwargs):
"""
Invoke an event on an addon and all its children. This method must
run within an established handler context.
"""
if name not in eventsequence.Events:
name = "event_" + name
for a in traverse([addon]):
func = getattr(a, name, None)
if func:
if not callable(func):
raise exceptions.AddonManagerError(
"Addon handler %s not callable" % name
)
func(*args, **kwargs)
def trigger(self, name, *args, **kwargs):
"""
Establish a handler context and trigger an event across all addons
"""
with self.master.handlecontext():
self.invoke(addon, name, *args, **kwargs)
def invoke(self, addon, name, *args, **kwargs):
func = getattr(addon, name, None)
if func:
if not callable(func):
raise exceptions.AddonError(
"Addon handler %s not callable" % name
)
func(*args, **kwargs)
def __call__(self, name, *args, **kwargs):
for i in self.chain:
try:
self.invoke(i, name, *args, **kwargs)
except exceptions.AddonHalt:
return
for i in self.chain:
try:
with safecall():
self.invoke_addon(i, name, *args, **kwargs)
except exceptions.AddonHalt:
return
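To make the register()/traverse() machinery above concrete, a minimal sketch of an addon that carries sub-addons (the class and option names are made up for illustration):

    from mitmproxy import ctx

    class Child:
        def load(self, loader):
            # called via invoke_addon(..., "load", ...) because traverse()
            # walks the parent's "addons" attribute recursively
            loader.add_option("child_flag", bool, False, "Illustrative option")

        def request(self, flow):
            if ctx.options.child_flag:
                flow.request.headers["x-child"] = "seen"

    class Parent:
        def __init__(self):
            # sub-addons are registered together with the parent
            self.addons = [Child()]

    addons = [Parent()]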

View File

@@ -3,7 +3,11 @@ from mitmproxy.addons import anticomp
from mitmproxy.addons import check_alpn
from mitmproxy.addons import check_ca
from mitmproxy.addons import clientplayback
from mitmproxy.addons import disable_h2c_upgrade
from mitmproxy.addons import core_option_validation
from mitmproxy.addons import core
from mitmproxy.addons import cut
from mitmproxy.addons import disable_h2c
from mitmproxy.addons import export
from mitmproxy.addons import onboarding
from mitmproxy.addons import proxyauth
from mitmproxy.addons import replace
@@ -13,28 +17,31 @@ from mitmproxy.addons import setheaders
from mitmproxy.addons import stickyauth
from mitmproxy.addons import stickycookie
from mitmproxy.addons import streambodies
from mitmproxy.addons import streamfile
from mitmproxy.addons import save
from mitmproxy.addons import upstream_auth
def default_addons():
return [
core.Core(),
core_option_validation.CoreOptionValidation(),
anticache.AntiCache(),
anticomp.AntiComp(),
check_alpn.CheckALPN(),
check_ca.CheckCA(),
clientplayback.ClientPlayback(),
disable_h2c_upgrade.DisableH2CleartextUpgrade(),
cut.Cut(),
disable_h2c.DisableH2C(),
export.Export(),
onboarding.Onboarding(),
proxyauth.ProxyAuth(),
replace.Replace(),
replace.ReplaceFile(),
script.ScriptLoader(),
serverplayback.ServerPlayback(),
setheaders.SetHeaders(),
stickyauth.StickyAuth(),
stickycookie.StickyCookie(),
streambodies.StreamBodies(),
streamfile.StreamFile(),
save.Save(),
upstream_auth.UpstreamAuth(),
]

View File

@@ -1,10 +1,7 @@
from mitmproxy import ctx
class AntiCache:
def __init__(self):
self.enabled = False
def configure(self, options, updated):
self.enabled = options.anticache
def request(self, flow):
if self.enabled:
if ctx.options.anticache:
flow.request.anticache()

View File

@@ -1,10 +1,7 @@
from mitmproxy import ctx
class AntiComp:
def __init__(self):
self.enabled = False
def configure(self, options, updated):
self.enabled = options.anticomp
def request(self, flow):
if self.enabled:
if ctx.options.anticomp:
flow.request.anticomp()

View File

@@ -1,17 +1,17 @@
import mitmproxy
from mitmproxy.net import tcp
from mitmproxy import ctx
class CheckALPN:
def __init__(self):
self.failed = False
def configure(self, options, updated):
def configure(self, updated):
self.failed = mitmproxy.ctx.master.options.http2 and not tcp.HAS_ALPN
if self.failed:
mitmproxy.ctx.master.add_log(
ctx.log.warn(
"HTTP/2 is disabled because ALPN support missing!\n"
"OpenSSL 1.0.2+ required to support HTTP/2 connections.\n"
"Use --no-http2 to silence this warning.",
"warn",
"Use --no-http2 to silence this warning."
)

View File

@@ -5,7 +5,7 @@ class CheckCA:
def __init__(self):
self.failed = False
def configure(self, options, updated):
def configure(self, updated):
has_ca = (
mitmproxy.ctx.master.server and
mitmproxy.ctx.master.server.config and

View File

@@ -2,6 +2,7 @@ from mitmproxy import exceptions
from mitmproxy import ctx
from mitmproxy import io
from mitmproxy import flow
from mitmproxy import command
import typing
@@ -10,36 +11,56 @@ class ClientPlayback:
def __init__(self):
self.flows = None
self.current_thread = None
self.keepserving = False
self.has_replayed = False
self.configured = False
def count(self) -> int:
if self.flows:
return len(self.flows)
return 0
def load(self, flows: typing.Sequence[flow.Flow]):
@command.command("replay.client.stop")
def stop_replay(self) -> None:
"""
Stop client replay.
"""
self.flows = []
ctx.master.addons.trigger("update", [])
@command.command("replay.client")
def start_replay(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Replay requests from flows.
"""
self.flows = flows
ctx.master.addons.trigger("update", [])
@command.command("replay.client.file")
def load_file(self, path: str) -> None:
try:
flows = io.read_flows_from_paths([path])
except exceptions.FlowReadException as e:
raise exceptions.CommandError(str(e))
self.flows = flows
def configure(self, options, updated):
if "client_replay" in updated:
if options.client_replay:
ctx.log.info("Client Replay: {}".format(options.client_replay))
try:
flows = io.read_flows_from_paths(options.client_replay)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(str(e))
self.load(flows)
else:
self.flows = None
self.keepserving = options.keepserving
def configure(self, updated):
if not self.configured and ctx.options.client_replay:
self.configured = True
ctx.log.info("Client Replay: {}".format(ctx.options.client_replay))
try:
flows = io.read_flows_from_paths(ctx.options.client_replay)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(str(e))
self.start_replay(flows)
def tick(self):
if self.current_thread and not self.current_thread.is_alive():
self.current_thread = None
if self.flows and not self.current_thread:
self.current_thread = ctx.master.replay_request(self.flows.pop(0))
f = self.flows.pop(0)
self.current_thread = ctx.master.replay_request(f)
ctx.master.addons.trigger("update", [f])
self.has_replayed = True
if self.has_replayed:
if not self.flows and not self.current_thread and not self.keepserving:
ctx.master.shutdown()
if not self.flows and not self.current_thread:
ctx.master.addons.trigger("processing_complete")

259
mitmproxy/addons/core.py Normal file
View File

@@ -0,0 +1,259 @@
import typing
from mitmproxy import ctx
from mitmproxy import exceptions
from mitmproxy import command
from mitmproxy import flow
from mitmproxy import optmanager
from mitmproxy.net.http import status_codes
class Core:
@command.command("set")
def set(self, spec: str) -> None:
"""
Set an option of the form "key[=value]". When the value is omitted,
booleans are set to true, strings and integers are set to None (if
permitted), and sequences are emptied. Boolean values can be true,
false or toggle.
"""
try:
ctx.options.set(spec)
except exceptions.OptionsError as e:
raise exceptions.CommandError(e) from e
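As a hedged sketch, a script addon can drive this command through the command system, using call_args the same way cut() does below; anticache and flow_detail are existing options used here only as examples.

from mitmproxy import ctx

def running():
    # "key" alone sets a boolean option to true; "key=value" assigns a value.
    ctx.master.commands.call_args("set", ["anticache"])
    ctx.master.commands.call_args("set", ["flow_detail=3"])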
@command.command("flow.resume")
def resume(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Resume flows if they are intercepted.
"""
intercepted = [i for i in flows if i.intercepted]
for f in intercepted:
f.resume()
ctx.master.addons.trigger("update", intercepted)
# FIXME: this will become view.mark later
@command.command("flow.mark")
def mark(self, flows: typing.Sequence[flow.Flow], val: bool) -> None:
"""
Mark flows.
"""
updated = []
for i in flows:
if i.marked != val:
i.marked = val
updated.append(i)
ctx.master.addons.trigger("update", updated)
# FIXME: this will become view.mark.toggle later
@command.command("flow.mark.toggle")
def mark_toggle(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Toggle mark for flows.
"""
for i in flows:
i.marked = not i.marked
ctx.master.addons.trigger("update", flows)
@command.command("flow.kill")
def kill(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Kill running flows.
"""
updated = []
for f in flows:
if f.killable:
f.kill()
updated.append(f)
ctx.log.alert("Killed %s flows." % len(updated))
ctx.master.addons.trigger("update", updated)
# FIXME: this will become view.revert later
@command.command("flow.revert")
def revert(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Revert flow changes.
"""
updated = []
for f in flows:
if f.modified():
f.revert()
updated.append(f)
ctx.log.alert("Reverted %s flows." % len(updated))
ctx.master.addons.trigger("update", updated)
@command.command("flow.set.options")
def flow_set_options(self) -> typing.Sequence[str]:
return [
"host",
"status_code",
"method",
"path",
"url",
"reason",
]
@command.command("flow.set")
def flow_set(
self,
flows: typing.Sequence[flow.Flow], spec: str, sval: str
) -> None:
"""
Quickly set a number of common values on flows.
"""
opts = self.flow_set_options()
if spec not in opts:
raise exceptions.CommandError(
"Set spec must be one of: %s." % ", ".join(opts)
)
val = sval # type: typing.Union[int, str]
if spec == "status_code":
try:
val = int(val)
except ValueError as v:
raise exceptions.CommandError(
"Status code is not an integer: %s" % val
) from v
updated = []
for f in flows:
req = getattr(f, "request", None)
rupdate = True
if req:
if spec == "method":
req.method = val
elif spec == "host":
req.host = val
elif spec == "path":
req.path = val
elif spec == "url":
try:
req.url = val
except ValueError as e:
raise exceptions.CommandError(
"URL %s is invalid: %s" % (repr(val), e)
) from e
else:
rupdate = False
resp = getattr(f, "response", None)
supdate = True
if resp:
if spec == "status_code":
resp.status_code = val
if val in status_codes.RESPONSES:
resp.reason = status_codes.RESPONSES[int(val)]
elif spec == "reason":
resp.reason = val
else:
supdate = False
if rupdate or supdate:
updated.append(f)
ctx.master.addons.trigger("update", updated)
ctx.log.alert("Set %s on %s flows." % (spec, len(updated)))
@command.command("flow.decode")
def decode(self, flows: typing.Sequence[flow.Flow], part: str) -> None:
"""
Decode flows.
"""
updated = []
for f in flows:
p = getattr(f, part, None)
if p:
p.decode()
updated.append(f)
ctx.master.addons.trigger("update", updated)
ctx.log.alert("Decoded %s flows." % len(updated))
@command.command("flow.encode.toggle")
def encode_toggle(self, flows: typing.Sequence[flow.Flow], part: str) -> None:
"""
Toggle flow encoding on and off, using deflate for encoding.
"""
updated = []
for f in flows:
p = getattr(f, part, None)
if p:
current_enc = p.headers.get("content-encoding", "identity")
if current_enc == "identity":
p.encode("deflate")
else:
p.decode()
updated.append(f)
ctx.master.addons.trigger("update", updated)
ctx.log.alert("Toggled encoding on %s flows." % len(updated))
@command.command("flow.encode")
def encode(self, flows: typing.Sequence[flow.Flow], part: str, enc: str) -> None:
"""
Encode flows with a specified encoding.
"""
if enc not in self.encode_options():
raise exceptions.CommandError("Invalid encoding format: %s" % enc)
updated = []
for f in flows:
p = getattr(f, part, None)
if p:
current_enc = p.headers.get("content-encoding", "identity")
if current_enc == "identity":
p.encode(enc)
updated.append(f)
ctx.master.addons.trigger("update", updated)
ctx.log.alert("Encoded %s flows." % len(updated))
@command.command("flow.encode.options")
def encode_options(self) -> typing.Sequence[str]:
"""
The possible values for an encoding specification.
"""
return ["gzip", "deflate", "br"]
@command.command("options.load")
def options_load(self, path: str) -> None:
"""
Load options from a file.
"""
try:
optmanager.load_paths(ctx.options, path)
except (OSError, exceptions.OptionsError) as e:
raise exceptions.CommandError(
"Could not load options - %s" % e
) from e
@command.command("options.save")
def options_save(self, path: str) -> None:
"""
Save options to a file.
"""
try:
optmanager.save(ctx.options, path)
except OSError as e:
raise exceptions.CommandError(
"Could not save options - %s" % e
) from e
@command.command("options.reset")
def options_reset(self) -> None:
"""
Reset all options to defaults.
"""
ctx.options.reset()
@command.command("options.reset.one")
def options_reset_one(self, name: str) -> None:
"""
Reset one option to its default value.
"""
if name not in ctx.options:
raise exceptions.CommandError("No such option: %s" % name)
setattr(
ctx.options,
name,
ctx.options.default(name),
)

View File

@@ -0,0 +1,47 @@
"""
The core addon is responsible for verifying core settings that are not
checked by other addons.
"""
from mitmproxy import exceptions
from mitmproxy import platform
from mitmproxy import ctx
from mitmproxy.net import server_spec
from mitmproxy.utils import human
class CoreOptionValidation:
def configure(self, updated):
opts = ctx.options
if opts.add_upstream_certs_to_client_chain and not opts.upstream_cert:
raise exceptions.OptionsError(
"The no-upstream-cert and add-upstream-certs-to-client-chain "
"options are mutually exclusive. If no-upstream-cert is enabled "
"then the upstream certificate is not retrieved before generating "
"the client certificate chain."
)
if "body_size_limit" in updated and opts.body_size_limit:
try:
opts._processed["body_size_limit"] = human.parse_size(
opts.body_size_limit
)
except ValueError as e:
raise exceptions.OptionsError(
"Invalid body size limit specification: %s" %
opts.body_size_limit
)
if "mode" in updated:
mode = opts.mode
if mode.startswith("reverse:") or mode.startswith("upstream:"):
try:
server_spec.parse_with_mode(mode)
except ValueError as e:
raise exceptions.OptionsError(str(e)) from e
elif mode == "transparent":
if not platform.original_addr:
raise exceptions.OptionsError(
"Transparent mode not supported on this platform."
)
elif mode not in ["regular", "socks5"]:
raise exceptions.OptionsError(
"Invalid mode specification: %s" % mode
)
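A hedged illustration of values that pass the checks above, using the helpers imported at the top of the file; the host names are made up.

from mitmproxy.utils import human
from mitmproxy.net import server_spec

size = human.parse_size("3m")  # an accepted body_size_limit value (3 MiB)
server_spec.parse_with_mode("reverse:https://example.com")               # valid mode
server_spec.parse_with_mode("upstream:http://upstream-proxy.local:8080")
# "regular", "socks5" and "transparent" take no server spec; transparent is
# additionally checked against platform.original_addr.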

151
mitmproxy/addons/cut.py Normal file
View File

@@ -0,0 +1,151 @@
import io
import csv
import typing
from mitmproxy import command
from mitmproxy import exceptions
from mitmproxy import flow
from mitmproxy import ctx
from mitmproxy import certs
from mitmproxy.utils import strutils
import pyperclip
def headername(spec: str):
if not (spec.startswith("header[") and spec.endswith("]")):
raise exceptions.CommandError("Invalid header spec: %s" % spec)
return spec[len("header["):-1].strip()
flow_shortcuts = {
"q": "request",
"s": "response",
"cc": "client_conn",
"sc": "server_conn",
}
def is_addr(v):
return isinstance(v, tuple) and len(v) > 1
def extract(cut: str, f: flow.Flow) -> typing.Union[str, bytes]:
path = cut.split(".")
current = f # type: typing.Any
for i, spec in enumerate(path):
if spec.startswith("_"):
raise exceptions.CommandError("Can't access internal attribute %s" % spec)
if isinstance(current, flow.Flow):
spec = flow_shortcuts.get(spec, spec)
part = getattr(current, spec, None)
if i == len(path) - 1:
if spec == "port" and is_addr(current):
return str(current[1])
if spec == "host" and is_addr(current):
return str(current[0])
elif spec.startswith("header["):
return current.headers.get(headername(spec), "")
elif isinstance(part, bytes):
return part
elif isinstance(part, bool):
return "true" if part else "false"
elif isinstance(part, certs.SSLCert):
return part.to_pem().decode("ascii")
current = part
return str(current or "")
def parse_cutspec(s: str) -> typing.Tuple[str, typing.Sequence[str]]:
"""
Returns (flowspec, [cuts]).
Raises exceptions.CommandError if input is invalid.
"""
parts = s.split("|", maxsplit=1)
flowspec = "@all"
if len(parts) == 2:
flowspec = parts[1].strip()
cuts = parts[0]
cutparts = [i.strip() for i in cuts.split(",") if i.strip()]
if len(cutparts) == 0:
raise exceptions.CommandError("Invalid cut specification.")
return flowspec, cutparts
class Cut:
@command.command("cut")
def cut(self, cutspec: str) -> command.Cuts:
"""
Resolve a cut specification of the form "cuts|flowspec". The cuts
are a comma-separated list of cut snippets. Cut snippets are
attribute paths from the base of the flow object, with a few
conveniences - "q", "s", "cc" and "sc" are shortcuts for request,
response, client_conn and server_conn, "port" and "host" retrieve
parts of an address tuple, ".header[key]" retrieves a header value.
Return values are converted sensibly: SSL certificates are converted to PEM
format, bools are "true" or "false", "bytes" are preserved, and all
other values are converted to strings. The flowspec is optional, and
if it is not specified, it is assumed to be @all.
"""
flowspec, cuts = parse_cutspec(cutspec)
flows = ctx.master.commands.call_args("view.resolve", [flowspec])
ret = []
for f in flows:
ret.append([extract(c, f) for c in cuts])
return ret
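A small, self-contained example of the specification grammar described above; the filter expression and the snippets in the comments are illustrative only.

flowspec, cuts = parse_cutspec("q.url, s.status_code | ~u login")
assert flowspec == "~u login"
assert cuts == ["q.url", "s.status_code"]
# Other snippets under the same grammar:
#   "q.header[user-agent]"  -> request User-Agent header
#   "cc.address.port"       -> client connection port, via the address tuple
#   "sc.cert"               -> server certificate, rendered as PEM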
@command.command("cut.save")
def save(self, cuts: command.Cuts, path: str) -> None:
"""
Save cuts to file. If there are multiple rows or columns, the format
is UTF-8 encoded CSV. If there is exactly one row and one column,
the data is written to file as-is, with raw bytes preserved. If the
path is prefixed with a "+", values are appended if there is an
existing file.
"""
append = False
if path.startswith("+"):
append = True
path = path[1:]
if len(cuts) == 1 and len(cuts[0]) == 1:
with open(path, "ab" if append else "wb") as fp:
if fp.tell() > 0:
# We're appending to a file that already exists and has content
fp.write(b"\n")
v = cuts[0][0]
if isinstance(v, bytes):
fp.write(v)
else:
fp.write(v.encode("utf8"))
ctx.log.alert("Saved single cut.")
else:
with open(path, "a" if append else "w", newline='', encoding="utf8") as fp:
writer = csv.writer(fp)
for r in cuts:
writer.writerow(
[strutils.always_str(c) or "" for c in r] # type: ignore
)
ctx.log.alert("Saved %s cuts as CSV." % len(cuts))
@command.command("cut.clip")
def clip(self, cuts: command.Cuts) -> None:
"""
Send cuts to the system clipboard.
"""
fp = io.StringIO(newline="")
if len(cuts) == 1 and len(cuts[0]) == 1:
v = cuts[0][0]
if isinstance(v, bytes):
fp.write(strutils.always_str(v))
else:
fp.write("utf8")
ctx.log.alert("Clipped single cut.")
else:
writer = csv.writer(fp)
for r in cuts:
writer.writerow(
[strutils.always_str(c) or "" for c in r] # type: ignore
)
ctx.log.alert("Clipped %s cuts as CSV." % len(cuts))
pyperclip.copy(fp.getvalue())

View File

@@ -0,0 +1,38 @@
import mitmproxy
class DisableH2C:
"""
We currently only support HTTP/2 over a TLS connection.
Some clients try to upgrade a connection from HTTP/1.1 to h2c. We need to
remove those headers to avoid protocol errors if one endpoint suddenly
starts sending HTTP/2 frames.
Some clients might use HTTP/2 Prior Knowledge to directly initiate a session
by sending the connection preface. We just kill those flows.
"""
def process_flow(self, f):
if f.request.headers.get('upgrade', '') == 'h2c':
mitmproxy.ctx.log.warn("HTTP/2 cleartext connections (h2c upgrade requests) are currently not supported.")
del f.request.headers['upgrade']
if 'connection' in f.request.headers:
del f.request.headers['connection']
if 'http2-settings' in f.request.headers:
del f.request.headers['http2-settings']
is_connection_preface = (
f.request.method == 'PRI' and
f.request.path == '*' and
f.request.http_version == 'HTTP/2.0'
)
if is_connection_preface:
f.kill()
mitmproxy.ctx.log.warn("Initiating HTTP/2 connections with prior knowledge are currently not supported.")
# Handlers
def request(self, f):
self.process_flow(f)
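A self-contained sketch of the clean-up above, with a plain dict standing in for the real header object.

headers = {
    "upgrade": "h2c",
    "connection": "Upgrade, HTTP2-Settings",
    "http2-settings": "AAMAAABkAARAAAAAAAIAAAAA",
    "host": "example.com",
}
if headers.get("upgrade", "") == "h2c":
    # Drop the upgrade attempt so neither endpoint switches to HTTP/2 frames.
    for name in ("upgrade", "connection", "http2-settings"):
        headers.pop(name, None)
assert headers == {"host": "example.com"}  # forwarded as plain HTTP/1.1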

View File

@@ -1,21 +0,0 @@
class DisableH2CleartextUpgrade:
"""
We currently only support HTTP/2 over a TLS connection. Some clients try
to upgrade a connection from HTTP/1.1 to h2c, so we need to remove those
headers to avoid protocol errors if one endpoint suddenly starts sending
HTTP/2 frames.
"""
def process_flow(self, f):
if f.request.headers.get('upgrade', '') == 'h2c':
del f.request.headers['upgrade']
if 'connection' in f.request.headers:
del f.request.headers['connection']
if 'http2-settings' in f.request.headers:
del f.request.headers['http2-settings']
# Handlers
def request(self, f):
self.process_flow(f)

View File

@@ -29,24 +29,18 @@ def colorful(line, styles):
class Dumper:
def __init__(self, outfile=sys.stdout):
self.filter = None # type: flowfilter.TFilter
self.flow_detail = None # type: int
self.outfp = outfile # type: typing.io.TextIO
self.showhost = None # type: bool
self.default_contentview = "auto" # type: str
def configure(self, options, updated):
if "filtstr" in updated:
if options.filtstr:
self.filter = flowfilter.parse(options.filtstr)
def configure(self, updated):
if "view_filter" in updated:
if ctx.options.view_filter:
self.filter = flowfilter.parse(ctx.options.view_filter)
if not self.filter:
raise exceptions.OptionsError(
"Invalid filter expression: %s" % options.filtstr
"Invalid filter expression: %s" % ctx.options.view_filter
)
else:
self.filter = None
self.flow_detail = options.flow_detail
self.showhost = options.showhost
self.default_contentview = options.default_contentview
def echo(self, text, ident=None, **style):
if ident:
@@ -67,13 +61,13 @@ class Dumper:
def _echo_message(self, message):
_, lines, error = contentviews.get_message_content_view(
self.default_contentview,
ctx.options.default_contentview,
message
)
if error:
ctx.log.debug(error)
if self.flow_detail == 3:
if ctx.options.flow_detail == 3:
lines_to_echo = itertools.islice(lines, 70)
else:
lines_to_echo = lines
@@ -95,14 +89,14 @@ class Dumper:
if next(lines, None):
self.echo("(cut off)", ident=4, dim=True)
if self.flow_detail >= 2:
if ctx.options.flow_detail >= 2:
self.echo("")
def _echo_request_line(self, flow):
if flow.client_conn:
client = click.style(
strutils.escape_control_characters(
repr(flow.client_conn.address)
human.format_address(flow.client_conn.address)
)
)
elif flow.request.is_replay:
@@ -121,12 +115,12 @@ class Dumper:
fg=method_color,
bold=True
)
if self.showhost:
if ctx.options.showhost:
url = flow.request.pretty_url
else:
url = flow.request.url
terminalWidthLimit = max(shutil.get_terminal_size()[0] - 25, 50)
if self.flow_detail < 1 and len(url) > terminalWidthLimit:
if ctx.options.flow_detail < 1 and len(url) > terminalWidthLimit:
url = url[:terminalWidthLimit] + "…"
url = click.style(strutils.escape_control_characters(url), bold=True)
@@ -176,7 +170,7 @@ class Dumper:
size = click.style(size, bold=True)
arrows = click.style(" <<", bold=True)
if self.flow_detail == 1:
if ctx.options.flow_detail == 1:
# This aligns the HTTP response code with the HTTP request method:
# 127.0.0.1:59519: GET http://example.com/
# << 304 Not Modified 0b
@@ -194,16 +188,16 @@ class Dumper:
def echo_flow(self, f):
if f.request:
self._echo_request_line(f)
if self.flow_detail >= 2:
if ctx.options.flow_detail >= 2:
self._echo_headers(f.request.headers)
if self.flow_detail >= 3:
if ctx.options.flow_detail >= 3:
self._echo_message(f.request)
if f.response:
self._echo_response_line(f)
if self.flow_detail >= 2:
if ctx.options.flow_detail >= 2:
self._echo_headers(f.response.headers)
if self.flow_detail >= 3:
if ctx.options.flow_detail >= 3:
self._echo_message(f.response)
if f.error:
@@ -211,7 +205,7 @@ class Dumper:
self.echo(" << {}".format(msg), bold=True, fg="red")
def match(self, f):
if self.flow_detail == 0:
if ctx.options.flow_detail == 0:
return False
if not self.filter:
return True
@@ -239,7 +233,7 @@ class Dumper:
if self.match(f):
message = f.messages[-1]
self.echo(f.message_info(message))
if self.flow_detail >= 3:
if ctx.options.flow_detail >= 3:
self._echo_message(message)
def websocket_end(self, f):
@@ -267,5 +261,5 @@ class Dumper:
server=repr(f.server_conn.address),
direction=direction,
))
if self.flow_detail >= 3:
if ctx.options.flow_detail >= 3:
self._echo_message(message)

View File

@@ -0,0 +1,75 @@
import typing
from mitmproxy import command
from mitmproxy import flow
from mitmproxy import exceptions
from mitmproxy.utils import strutils
from mitmproxy.net.http.http1 import assemble
import pyperclip
def curl_command(f: flow.Flow) -> str:
if not hasattr(f, "request"):
raise exceptions.CommandError("Can't export flow with no request.")
data = "curl "
request = f.request.copy() # type: ignore
request.decode(strict=False)
for k, v in request.headers.items(multi=True):
data += "-H '%s:%s' " % (k, v)
if request.method != "GET":
data += "-X %s " % request.method
data += "'%s'" % request.url
if request.content:
data += " --data-binary '%s'" % strutils.bytes_to_escaped_str(
request.content,
escape_single_quotes=True
)
return data
def raw(f: flow.Flow) -> bytes:
if not hasattr(f, "request"):
raise exceptions.CommandError("Can't export flow with no request.")
return assemble.assemble_request(f.request) # type: ignore
formats = dict(
curl = curl_command,
raw = raw,
)
class Export():
@command.command("export.formats")
def formats(self) -> typing.Sequence[str]:
"""
Return a list of the supported export formats.
"""
return list(sorted(formats.keys()))
@command.command("export.file")
def file(self, fmt: str, f: flow.Flow, path: str) -> None:
"""
Export a flow to path.
"""
if fmt not in formats:
raise exceptions.CommandError("No such export format: %s" % fmt)
func = formats[fmt] # type: typing.Any
v = func(f)
with open(path, "wb") as fp:
if isinstance(v, bytes):
fp.write(v)
else:
fp.write(v.encode("utf-8"))
@command.command("export.clip")
def clip(self, fmt: str, f: flow.Flow) -> None:
"""
Export a flow to the system clipboard.
"""
if fmt not in formats:
raise exceptions.CommandError("No such export format: %s" % fmt)
func = formats[fmt] # type: typing.Any
v = strutils.always_str(func(f))
pyperclip.copy(v)
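A hedged sketch showing the two exporters above on a synthetic flow, assuming mitmproxy's test helpers are available as they are in the test suite; the exact output depends on the dummy request tflow builds.

from mitmproxy.test import tflow

f = tflow.tflow(resp=True)
print(curl_command(f))  # e.g. curl -H 'header:qvalue' ... 'http://address:22/path'
print(raw(f))           # the same request re-assembled as raw HTTP/1 bytes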

View File

@@ -1,20 +1,21 @@
from mitmproxy import flowfilter
from mitmproxy import exceptions
from mitmproxy import ctx
class Intercept:
def __init__(self):
self.filt = None
def configure(self, opts, updated):
def configure(self, updated):
if "intercept" in updated:
if not opts.intercept:
if not ctx.options.intercept:
self.filt = None
return
self.filt = flowfilter.parse(opts.intercept)
self.filt = flowfilter.parse(ctx.options.intercept)
if not self.filt:
raise exceptions.OptionsError(
"Invalid interception filter: %s" % opts.intercept
"Invalid interception filter: %s" % ctx.options.intercept
)
def process_flow(self, f):

View File

@@ -0,0 +1,7 @@
from mitmproxy import ctx
class KeepServing:
def event_processing_complete(self):
if not ctx.master.options.keepserving:
ctx.master.shutdown()

View File

@@ -1,17 +1,18 @@
from mitmproxy.addons import wsgiapp
from mitmproxy.addons.onboardingapp import app
from mitmproxy import ctx
class Onboarding(wsgiapp.WSGIApp):
name = "onboarding"
def __init__(self):
super().__init__(app.Adapter(app.application), None, None)
self.enabled = False
def configure(self, options, updated):
self.host = options.onboarding_host
self.port = options.onboarding_port
self.enabled = options.onboarding
def configure(self, updated):
self.host = ctx.options.onboarding_host
self.port = ctx.options.onboarding_port
def request(self, f):
if self.enabled:
if ctx.options.onboarding:
super().request(f)

View File

@@ -1,7 +1,8 @@
import binascii
import weakref
import ldap3
from typing import Optional
from typing import Set # noqa
from typing import MutableMapping # noqa
from typing import Tuple
import passlib.apache
@@ -10,6 +11,7 @@ import mitmproxy.net.http
from mitmproxy import connections # noqa
from mitmproxy import exceptions
from mitmproxy import http
from mitmproxy import ctx
from mitmproxy.net.http import status_codes
REALM = "mitmproxy"
@@ -45,12 +47,12 @@ class ProxyAuth:
self.nonanonymous = False
self.htpasswd = None
self.singleuser = None
self.mode = None
self.authenticated = weakref.WeakSet() # type: Set[connections.ClientConnection]
self.ldapserver = None
self.authenticated = weakref.WeakKeyDictionary() # type: MutableMapping[connections.ClientConnection, Tuple[str, str]]
"""Contains all connections that are permanently authenticated after an HTTP CONNECT"""
def enabled(self) -> bool:
return any([self.nonanonymous, self.htpasswd, self.singleuser])
return any([self.nonanonymous, self.htpasswd, self.singleuser, self.ldapserver])
def is_proxy_auth(self) -> bool:
"""
@@ -58,7 +60,7 @@ class ProxyAuth:
- True, if authentication is done as if mitmproxy is a proxy
- False, if authentication is done as if mitmproxy is a HTTP server
"""
return self.mode in ("regular", "upstream")
return ctx.options.mode in ("regular", "upstream")
def which_auth_header(self) -> str:
if self.is_proxy_auth():
@@ -99,7 +101,20 @@ class ProxyAuth:
elif self.htpasswd:
if self.htpasswd.check_password(username, password):
return username, password
elif self.ldapserver:
if not username or not password:
return None
dn = ctx.options.proxyauth.split(":")[2]
parts = dn.split("?")
conn = ldap3.Connection(
self.ldapserver,
parts[0] + username + parts[1],
password,
auto_bind=True)
if conn:
conn.search(parts[1][1:], '(' + parts[0] + username + ')', attributes=['objectclass'])
if ctx.options.proxyauth.split(":")[3] in conn.entries[0]['objectclass']:
return username, password
return None
def authenticate(self, f: http.HTTPFlow) -> bool:
@@ -113,39 +128,51 @@ class ProxyAuth:
return False
# Handlers
def configure(self, options, updated):
if "auth_nonanonymous" in updated:
self.nonanonymous = options.auth_nonanonymous
if "auth_singleuser" in updated:
if options.auth_singleuser:
parts = options.auth_singleuser.split(':')
if len(parts) != 2:
raise exceptions.OptionsError(
"Invalid single-user auth specification."
)
self.singleuser = parts
else:
self.singleuser = None
if "auth_htpasswd" in updated:
if options.auth_htpasswd:
try:
self.htpasswd = passlib.apache.HtpasswdFile(
options.auth_htpasswd
)
except (ValueError, OSError) as v:
raise exceptions.OptionsError(
"Could not open htpasswd file: %s" % v
)
else:
self.htpasswd = None
if "mode" in updated:
self.mode = options.mode
def configure(self, updated):
if "proxyauth" in updated:
self.nonanonymous = False
self.singleuser = None
self.htpasswd = None
self.ldapserver = None
if ctx.options.proxyauth:
if ctx.options.proxyauth == "any":
self.nonanonymous = True
elif ctx.options.proxyauth.startswith("@"):
p = ctx.options.proxyauth[1:]
try:
self.htpasswd = passlib.apache.HtpasswdFile(p)
except (ValueError, OSError) as v:
raise exceptions.OptionsError(
"Could not open htpasswd file: %s" % p
)
elif ctx.options.proxyauth.startswith("ldap"):
parts = ctx.options.proxyauth.split(":")
if len(parts) != 4:
raise exceptions.OptionsError(
"Invalid ldap specification"
)
if parts[0] == "ldaps":
server = ldap3.Server(parts[1], use_ssl=True)
elif parts[0] == "ldap":
server = ldap3.Server(parts[1])
else:
raise exceptions.OptionsError(
"Invalid ldap specfication on the first part"
)
self.ldapserver = server
else:
parts = ctx.options.proxyauth.split(':')
if len(parts) != 2:
raise exceptions.OptionsError(
"Invalid single-user auth specification."
)
self.singleuser = parts
if self.enabled():
if options.mode == "transparent":
if ctx.options.mode == "transparent":
raise exceptions.OptionsError(
"Proxy Authentication not supported in transparent mode."
)
if options.mode == "socks5":
if ctx.options.mode == "socks5":
raise exceptions.OptionsError(
"Proxy Authentication not supported in SOCKS mode. "
"https://github.com/mitmproxy/mitmproxy/issues/738"
@@ -155,11 +182,12 @@ class ProxyAuth:
def http_connect(self, f: http.HTTPFlow) -> None:
if self.enabled():
if self.authenticate(f):
self.authenticated.add(f.client_conn)
self.authenticated[f.client_conn] = f.metadata["proxyauth"]
def requestheaders(self, f: http.HTTPFlow) -> None:
if self.enabled():
# Is this connection authenticated by a previous HTTP CONNECT?
if f.client_conn in self.authenticated:
f.metadata["proxyauth"] = self.authenticated[f.client_conn]
return
self.authenticate(f)

View File

@@ -0,0 +1,56 @@
import os.path
import sys
import typing
from mitmproxy import ctx
from mitmproxy import exceptions
from mitmproxy import io
class ReadFile:
"""
An addon that handles reading from file on startup.
"""
def load_flows(self, fo: typing.IO[bytes]) -> int:
cnt = 0
freader = io.FlowReader(fo)
try:
for flow in freader.stream():
ctx.master.load_flow(flow)
cnt += 1
except (IOError, exceptions.FlowReadException) as e:
if cnt:
ctx.log.warn("Flow file corrupted - loaded %i flows." % cnt)
else:
ctx.log.error("Flow file corrupted.")
raise exceptions.FlowReadException(str(e)) from e
else:
return cnt
def load_flows_from_path(self, path: str) -> int:
path = os.path.expanduser(path)
try:
with open(path, "rb") as f:
return self.load_flows(f)
except IOError as e:
ctx.log.error("Cannot load flows: {}".format(e))
raise exceptions.FlowReadException(str(e)) from e
def running(self):
if ctx.options.rfile:
try:
self.load_flows_from_path(ctx.options.rfile)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(e) from e
finally:
ctx.master.addons.trigger("processing_complete")
class ReadFileStdin(ReadFile):
"""Support the special case of "-" for reading from stdin"""
def load_flows_from_path(self, path: str) -> int:
if path == "-":
return self.load_flows(sys.stdin.buffer)
else:
return super().load_flows_from_path(path)

View File

@@ -1,3 +1,4 @@
import os
import re
from mitmproxy import exceptions
@@ -42,25 +43,22 @@ def parse_hook(s):
return patt, a, b
class _ReplaceBase:
class Replace:
def __init__(self):
self.lst = []
def configure(self, options, updated):
def configure(self, updated):
"""
.replacements is a list of tuples (fpat, rex, s):
fpatt: a string specifying a filter pattern.
rex: a regular expression, as bytes.
s: the replacement string, as bytes
rex: a regular expression, as string.
s: the replacement string
"""
if self.optionName in updated:
if "replacements" in updated:
lst = []
for rep in getattr(options, self.optionName):
if isinstance(rep, str):
fpatt, rex, s = parse_hook(rep)
else:
fpatt, rex, s = rep
for rep in ctx.options.replacements:
fpatt, rex, s = parse_hook(rep)
flt = flowfilter.parse(fpatt)
if not flt:
@@ -68,11 +66,16 @@ class _ReplaceBase:
"Invalid filter pattern: %s" % fpatt
)
try:
# We should ideally escape here before trying to compile
re.compile(rex)
except re.error as e:
raise exceptions.OptionsError(
"Invalid regular expression: %s - %s" % (rex, str(e))
)
if s.startswith("@") and not os.path.isfile(s[1:]):
raise exceptions.OptionsError(
"Invalid file path: {}".format(s[1:])
)
lst.append((rex, s, flt))
self.lst = lst
@@ -92,21 +95,13 @@ class _ReplaceBase:
if not flow.reply.has_message:
self.execute(flow)
class Replace(_ReplaceBase):
optionName = "replacements"
def replace(self, obj, rex, s):
if s.startswith("@"):
s = os.path.expanduser(s[1:])
try:
with open(s, "rb") as f:
s = f.read()
except IOError:
ctx.log.warn("Could not read replacement file: %s" % s)
return
obj.replace(rex, s, flags=re.DOTALL)
class ReplaceFile(_ReplaceBase):
optionName = "replacement_files"
def replace(self, obj, rex, s):
try:
v = open(s, "rb").read()
except IOError as e:
ctx.log.warn("Could not read replacement file: %s" % s)
return
obj.replace(rex, v, flags=re.DOTALL)

93
mitmproxy/addons/save.py Normal file
View File

@@ -0,0 +1,93 @@
import os.path
import typing
from mitmproxy import exceptions
from mitmproxy import flowfilter
from mitmproxy import io
from mitmproxy import ctx
from mitmproxy import flow
class Save:
def __init__(self):
self.stream = None
self.filt = None
self.active_flows = set() # type: Set[flow.Flow]
def open_file(self, path):
if path.startswith("+"):
path = path[1:]
mode = "ab"
else:
mode = "wb"
path = os.path.expanduser(path)
return open(path, mode)
def start_stream_to_path(self, path, flt):
try:
f = self.open_file(path)
except IOError as v:
raise exceptions.OptionsError(str(v))
self.stream = io.FilteredFlowWriter(f, flt)
self.active_flows = set()
def configure(self, updated):
# We're already streaming - stop the previous stream and restart
if "save_stream_filter" in updated:
if ctx.options.save_stream_filter:
self.filt = flowfilter.parse(ctx.options.save_stream_filter)
if not self.filt:
raise exceptions.OptionsError(
"Invalid filter specification: %s" % ctx.options.save_stream_filter
)
else:
self.filt = None
if "save_stream_file" in updated:
if self.stream:
self.done()
if ctx.options.save_stream_file:
self.start_stream_to_path(ctx.options.save_stream_file, self.filt)
def save(self, flows: typing.Sequence[flow.Flow], path: str) -> None:
"""
Save flows to a file. If the path starts with a +, flows are
appended to the file, otherwise it is over-written.
"""
try:
f = self.open_file(path)
except IOError as v:
raise exceptions.CommandError(v) from v
stream = io.FlowWriter(f)
for i in flows:
stream.add(i)
f.close()
ctx.log.alert("Saved %s flows." % len(flows))
def load(self, l):
l.add_command("save.file", self.save)
def tcp_start(self, flow):
if self.stream:
self.active_flows.add(flow)
def tcp_end(self, flow):
if self.stream:
self.stream.add(flow)
self.active_flows.discard(flow)
def response(self, flow):
if self.stream:
self.stream.add(flow)
self.active_flows.discard(flow)
def request(self, flow):
if self.stream:
self.active_flows.add(flow)
def done(self):
if self.stream:
for f in self.active_flows:
self.stream.add(f)
self.active_flows = set([])
self.stream.fo.close()
self.stream = None

View File

@@ -1,269 +1,145 @@
import contextlib
import os
import shlex
import importlib
import time
import sys
import threading
import traceback
import types
import typing
from mitmproxy import addonmanager
from mitmproxy import exceptions
from mitmproxy import ctx
from mitmproxy import flow
from mitmproxy import command
from mitmproxy import eventsequence
from mitmproxy import ctx
import watchdog.events
from watchdog.observers import polling
def parse_command(command):
"""
Returns a (path, args) tuple.
"""
if not command or not command.strip():
raise ValueError("Empty script command.")
# Windows: escape all backslashes in the path.
if os.name == "nt": # pragma: no cover
backslashes = shlex.split(command, posix=False)[0].count("\\")
command = command.replace("\\", "\\\\", backslashes)
args = shlex.split(command) # pragma: no cover
args[0] = os.path.expanduser(args[0])
if not os.path.exists(args[0]):
raise ValueError(
("Script file not found: %s.\r\n"
"If your script path contains spaces, "
"make sure to wrap it in additional quotes, e.g. -s \"'./foo bar/baz.py' --args\".") %
args[0])
elif os.path.isdir(args[0]):
raise ValueError("Not a file: %s" % args[0])
return args[0], args[1:]
def cut_traceback(tb, func_name):
"""
Cut off a traceback at the function with the given name.
The func_name's frame is excluded.
Args:
tb: traceback object, as returned by sys.exc_info()[2]
func_name: function name
Returns:
Reduced traceback.
"""
tb_orig = tb
for _, _, fname, _ in traceback.extract_tb(tb):
tb = tb.tb_next
if fname == func_name:
break
if tb is None:
# We could not find the method, take the full stack trace.
# This may happen on some Python interpreters/flavors (e.g. PyInstaller).
return tb_orig
else:
return tb
@contextlib.contextmanager
def scriptenv(path, args):
oldargs = sys.argv
sys.argv = [path] + args
script_dir = os.path.dirname(os.path.abspath(path))
sys.path.append(script_dir)
def load_script(actx, path):
if not os.path.exists(path):
ctx.log.info("No such file: %s" % path)
return
loader = importlib.machinery.SourceFileLoader(os.path.basename(path), path)
try:
yield
except SystemExit as v:
ctx.log.error("Script exited with code %s" % v.code)
except Exception:
etype, value, tb = sys.exc_info()
tb = cut_traceback(tb, "scriptenv").tb_next
ctx.log.error(
"Script error: %s" % "".join(
traceback.format_exception(etype, value, tb)
)
)
oldpath = sys.path
sys.path.insert(0, os.path.dirname(path))
with addonmanager.safecall():
m = loader.load_module()
if not getattr(m, "name", None):
m.name = path
return m
finally:
sys.argv = oldargs
sys.path.pop()
def load_script(path, args):
with open(path, "rb") as f:
try:
code = compile(f.read(), path, 'exec')
except SyntaxError as e:
ctx.log.error(
"Script error: %s line %s: %s" % (
e.filename, e.lineno, e.msg
)
)
return
ns = {'__file__': os.path.abspath(path)}
with scriptenv(path, args):
exec(code, ns)
return types.SimpleNamespace(**ns)
class ReloadHandler(watchdog.events.FileSystemEventHandler):
def __init__(self, callback):
self.callback = callback
def filter(self, event):
"""
Returns True only when .py file is changed
"""
if event.is_directory:
return False
if os.path.basename(event.src_path).startswith("."):
return False
if event.src_path.endswith(".py"):
return True
return False
def on_modified(self, event):
if self.filter(event):
self.callback()
def on_created(self, event):
if self.filter(event):
self.callback()
sys.path[:] = oldpath
class Script:
"""
An addon that manages a single script.
"""
def __init__(self, command):
self.name = command
ReloadInterval = 2
self.command = command
self.path, self.args = parse_command(command)
def __init__(self, path):
self.name = "scriptmanager:" + path
self.path = path
self.fullpath = os.path.expanduser(path)
self.ns = None
self.observer = None
self.dead = False
self.last_options = None
self.should_reload = threading.Event()
self.last_load = 0
self.last_mtime = 0
if not os.path.isfile(self.fullpath):
raise exceptions.OptionsError("No such script: %s" % path)
for i in eventsequence.Events:
if not hasattr(self, i):
def mkprox():
evt = i
def prox(*args, **kwargs):
self.run(evt, *args, **kwargs)
return prox
setattr(self, i, mkprox())
def run(self, name, *args, **kwargs):
# It's possible for ns to be un-initialised if we failed during
# configure
if self.ns is not None and not self.dead:
func = getattr(self.ns, name, None)
if func:
with scriptenv(self.path, self.args):
return func(*args, **kwargs)
def reload(self):
self.should_reload.set()
def load_script(self):
self.ns = load_script(self.path, self.args)
ret = self.run("start")
if ret:
self.ns = ret
self.run("start")
@property
def addons(self):
return [self.ns] if self.ns else []
def tick(self):
if self.should_reload.is_set():
self.should_reload.clear()
ctx.log.info("Reloading script: %s" % self.name)
self.ns = load_script(self.path, self.args)
self.start()
self.configure(self.last_options, self.last_options.keys())
else:
self.run("tick")
def start(self):
self.load_script()
def configure(self, options, updated):
self.last_options = options
if not self.observer:
self.observer = polling.PollingObserver()
# Bind the handler to the real underlying master object
self.observer.schedule(
ReloadHandler(self.reload),
os.path.dirname(self.path) or "."
)
self.observer.start()
self.run("configure", options, updated)
def done(self):
self.run("done")
self.dead = True
if time.time() - self.last_load > self.ReloadInterval:
mtime = os.stat(self.fullpath).st_mtime
if mtime > self.last_mtime:
ctx.log.info("Loading script: %s" % self.path)
if self.ns:
ctx.master.addons.remove(self.ns)
self.ns = load_script(ctx, self.fullpath)
if self.ns:
# We're already running, so we have to explicitly register and
# configure the addon
ctx.master.addons.register(self.ns)
ctx.master.addons.invoke_addon(self.ns, "running")
ctx.master.addons.invoke_addon(
self.ns,
"configure",
ctx.options.keys()
)
self.last_load = time.time()
self.last_mtime = mtime
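A hedged example of a script file the Script addon above would load, register and hot-reload when its modification time changes; the file name and the header it sets are made up.

# example_addon.py
from mitmproxy import ctx

name = "example_addon"  # used instead of the file path if present

def load(l):
    ctx.log.info("example_addon loaded")

def request(flow):
    flow.request.headers["x-example"] = "1"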
class ScriptLoader:
"""
An addon that manages loading scripts from options.
"""
def run_once(self, command, flows):
try:
sc = Script(command)
except ValueError as e:
raise ValueError(str(e))
sc.load_script()
for f in flows:
for evt, o in eventsequence.iterate(f):
sc.run(evt, o)
sc.done()
return sc
def __init__(self):
self.is_running = False
self.addons = []
def configure(self, options, updated):
def running(self):
self.is_running = True
@command.command("script.run")
def script_run(self, flows: typing.Sequence[flow.Flow], path: str) -> None:
"""
Run a script on the specified flows. The script is loaded with
default options, and all lifecycle events for each flow are
simulated.
"""
try:
s = Script(path)
l = addonmanager.Loader(ctx.master)
ctx.master.addons.invoke_addon(s, "load", l)
ctx.master.addons.invoke_addon(s, "configure", ctx.options.keys())
# Script is loaded on the first tick
ctx.master.addons.invoke_addon(s, "tick")
for f in flows:
for evt, arg in eventsequence.iterate(f):
ctx.master.addons.invoke_addon(s, evt, arg)
except exceptions.OptionsError as e:
raise exceptions.CommandError("Error running script: %s" % e) from e
def configure(self, updated):
if "scripts" in updated:
for s in options.scripts:
if options.scripts.count(s) > 1:
for s in ctx.options.scripts:
if ctx.options.scripts.count(s) > 1:
raise exceptions.OptionsError("Duplicate script: %s" % s)
for a in ctx.master.addons.chain[:]:
if isinstance(a, Script) and a.name not in options.scripts:
for a in self.addons[:]:
if a.path not in ctx.options.scripts:
ctx.log.info("Un-loading script: %s" % a.name)
ctx.master.addons.remove(a)
self.addons.remove(a)
# The machinations below are to ensure that:
# - Scripts remain in the same order
# - Scripts are listed directly after the script addon. This is
# needed to ensure that interactions with, for instance, flow
# serialization remains correct.
# - Scripts are not initialized un-necessarily. If only a
# script's order in the script list has changed, it should simply
# be moved.
# script's order in the script list has changed, it is just
# moved.
current = {}
for a in ctx.master.addons.chain[:]:
if isinstance(a, Script):
current[a.name] = a
ctx.master.addons.chain.remove(a)
for a in self.addons:
current[a.path] = a
ordered = []
newscripts = []
for s in options.scripts:
for s in ctx.options.scripts:
if s in current:
ordered.append(current[s])
else:
ctx.log.info("Loading script: %s" % s)
try:
sc = Script(s)
except ValueError as e:
raise exceptions.OptionsError(str(e))
sc = Script(s)
ordered.append(sc)
newscripts.append(sc)
ochain = ctx.master.addons.chain
pos = ochain.index(self)
ctx.master.addons.chain = ochain[:pos + 1] + ordered + ochain[pos + 1:]
self.addons = ordered
for s in newscripts:
ctx.master.addons.startup(s)
ctx.master.addons.register(s)
if self.is_running:
# If we're already running, we configure and tell the addon
# we're up and running.
ctx.master.addons.invoke_addon(s, "running")

View File

@@ -1,29 +1,50 @@
import hashlib
import urllib
import typing
from typing import Any # noqa
from typing import List # noqa
from mitmproxy import ctx
from mitmproxy import flow
from mitmproxy import exceptions
from mitmproxy import io
from mitmproxy import command
class ServerPlayback:
def __init__(self):
self.options = None
self.flowmap = {}
self.stop = False
self.final_flow = None
self.configured = False
def load(self, flows):
@command.command("replay.server")
def load_flows(self, flows: typing.Sequence[flow.Flow]) -> None:
"""
Replay server responses from flows.
"""
self.flowmap = {}
for i in flows:
if i.response:
if i.response: # type: ignore
l = self.flowmap.setdefault(self._hash(i), [])
l.append(i)
ctx.master.addons.trigger("update", [])
def clear(self):
@command.command("replay.server.file")
def load_file(self, path: str) -> None:
try:
flows = io.read_flows_from_paths([path])
except exceptions.FlowReadException as e:
raise exceptions.CommandError(str(e))
self.load_flows(flows)
@command.command("replay.server.stop")
def clear(self) -> None:
"""
Stop server replay.
"""
self.flowmap = {}
ctx.master.addons.trigger("update", [])
def count(self):
return sum([len(i) for i in self.flowmap.values()])
@@ -38,27 +59,27 @@ class ServerPlayback:
queriesArray = urllib.parse.parse_qsl(query, keep_blank_values=True)
key = [str(r.port), str(r.scheme), str(r.method), str(path)] # type: List[Any]
if not self.options.server_replay_ignore_content:
if self.options.server_replay_ignore_payload_params and r.multipart_form:
if not ctx.options.server_replay_ignore_content:
if ctx.options.server_replay_ignore_payload_params and r.multipart_form:
key.extend(
(k, v)
for k, v in r.multipart_form.items(multi=True)
if k.decode(errors="replace") not in self.options.server_replay_ignore_payload_params
if k.decode(errors="replace") not in ctx.options.server_replay_ignore_payload_params
)
elif self.options.server_replay_ignore_payload_params and r.urlencoded_form:
elif ctx.options.server_replay_ignore_payload_params and r.urlencoded_form:
key.extend(
(k, v)
for k, v in r.urlencoded_form.items(multi=True)
if k not in self.options.server_replay_ignore_payload_params
if k not in ctx.options.server_replay_ignore_payload_params
)
else:
key.append(str(r.raw_content))
if not self.options.server_replay_ignore_host:
if not ctx.options.server_replay_ignore_host:
key.append(r.host)
filtered = []
ignore_params = self.options.server_replay_ignore_params or []
ignore_params = ctx.options.server_replay_ignore_params or []
for p in queriesArray:
if p[0] not in ignore_params:
filtered.append(p)
@@ -66,9 +87,9 @@ class ServerPlayback:
key.append(p[0])
key.append(p[1])
if self.options.server_replay_use_headers:
if ctx.options.server_replay_use_headers:
headers = []
for i in self.options.server_replay_use_headers:
for i in ctx.options.server_replay_use_headers:
v = r.headers.get(i)
headers.append((i, v))
key.append(headers)
@@ -83,7 +104,7 @@ class ServerPlayback:
"""
hsh = self._hash(request)
if hsh in self.flowmap:
if self.options.server_replay_nopop:
if ctx.options.server_replay_nopop:
return self.flowmap[hsh][0]
else:
ret = self.flowmap[hsh].pop(0)
@@ -91,20 +112,18 @@ class ServerPlayback:
del self.flowmap[hsh]
return ret
def configure(self, options, updated):
self.options = options
if "server_replay" in updated:
self.clear()
if options.server_replay:
try:
flows = io.read_flows_from_paths(options.server_replay)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(str(e))
self.load(flows)
def configure(self, updated):
if not self.configured and ctx.options.server_replay:
self.configured = True
try:
flows = io.read_flows_from_paths(ctx.options.server_replay)
except exceptions.FlowReadException as e:
raise exceptions.OptionsError(str(e))
self.load_flows(flows)
def tick(self):
if self.stop and not self.final_flow.live:
ctx.master.shutdown()
ctx.master.addons.trigger("processing_complete")
def request(self, f):
if self.flowmap:
@@ -112,13 +131,13 @@ class ServerPlayback:
if rflow:
response = rflow.response.copy()
response.is_replay = True
if self.options.refresh_server_playback:
if ctx.options.refresh_server_playback:
response.refresh()
f.response = response
if not self.flowmap and not self.options.keepserving:
if not self.flowmap:
self.final_flow = f
self.stop = True
elif self.options.replay_kill_extra:
elif ctx.options.replay_kill_extra:
ctx.log.warn(
"server_playback: killed non-replay request {}".format(
f.request.url

View File

@@ -1,5 +1,6 @@
from mitmproxy import exceptions
from mitmproxy import flowfilter
from mitmproxy import ctx
def parse_setheader(s):
@@ -43,21 +44,11 @@ class SetHeaders:
def __init__(self):
self.lst = []
def configure(self, options, updated):
"""
options.setheaders is a tuple of (fpatt, header, value)
fpatt: String specifying a filter pattern.
header: Header name.
value: Header value string
"""
def configure(self, updated):
if "setheaders" in updated:
self.lst = []
for shead in options.setheaders:
if isinstance(shead, str):
fpatt, header, value = parse_setheader(shead)
else:
fpatt, header, value = shead
for shead in ctx.options.setheaders:
fpatt, header, value = parse_setheader(shead)
flt = flowfilter.parse(fpatt)
if not flt:

View File

@@ -1,5 +1,6 @@
from mitmproxy import exceptions
from mitmproxy import flowfilter
from mitmproxy import ctx
class StickyAuth:
@@ -7,13 +8,13 @@ class StickyAuth:
self.flt = None
self.hosts = {}
def configure(self, options, updated):
def configure(self, updated):
if "stickyauth" in updated:
if options.stickyauth:
flt = flowfilter.parse(options.stickyauth)
if ctx.options.stickyauth:
flt = flowfilter.parse(ctx.options.stickyauth)
if not flt:
raise exceptions.OptionsError(
"stickyauth: invalid filter expression: %s" % options.stickyauth
"stickyauth: invalid filter expression: %s" % ctx.options.stickyauth
)
self.flt = flt
else:

View File

@@ -1,13 +1,14 @@
import collections
from http import cookiejar
from typing import List, Tuple, Dict, Optional # noqa
from mitmproxy import http, flowfilter, ctx, exceptions
from mitmproxy.net.http import cookies
from mitmproxy import exceptions
from mitmproxy import flowfilter
TOrigin = Tuple[str, int, str]
def ckey(attrs, f):
def ckey(attrs: Dict[str, str], f: http.HTTPFlow) -> TOrigin:
"""
Returns a (domain, port, path) tuple.
"""
@@ -20,32 +21,32 @@ def ckey(attrs, f):
return (domain, f.request.port, path)
def domain_match(a, b):
if cookiejar.domain_match(a, b):
def domain_match(a: str, b: str) -> bool:
if cookiejar.domain_match(a, b): # type: ignore
return True
elif cookiejar.domain_match(a, b.strip(".")):
elif cookiejar.domain_match(a, b.strip(".")): # type: ignore
return True
return False
class StickyCookie:
def __init__(self):
self.jar = collections.defaultdict(dict)
self.flt = None
self.jar = collections.defaultdict(dict) # type: Dict[TOrigin, Dict[str, str]]
self.flt = None # type: Optional[flowfilter.TFilter]
def configure(self, options, updated):
def configure(self, updated):
if "stickycookie" in updated:
if options.stickycookie:
flt = flowfilter.parse(options.stickycookie)
if ctx.options.stickycookie:
flt = flowfilter.parse(ctx.options.stickycookie)
if not flt:
raise exceptions.OptionsError(
"stickycookie: invalid filter expression: %s" % options.stickycookie
"stickycookie: invalid filter expression: %s" % ctx.options.stickycookie
)
self.flt = flt
else:
self.flt = None
def response(self, flow):
def response(self, flow: http.HTTPFlow):
if self.flt:
for name, (value, attrs) in flow.response.cookies.items(multi=True):
# FIXME: We now know that Cookie.py screws up some cookies with
@@ -62,24 +63,21 @@ class StickyCookie:
if not self.jar[dom_port_path]:
self.jar.pop(dom_port_path, None)
else:
b = attrs.copy()
b.insert(0, name, value)
self.jar[dom_port_path][name] = b
self.jar[dom_port_path][name] = value
def request(self, flow):
def request(self, flow: http.HTTPFlow):
if self.flt:
l = []
cookie_list = [] # type: List[Tuple[str,str]]
if flowfilter.match(self.flt, flow):
for domain, port, path in self.jar.keys():
for (domain, port, path), c in self.jar.items():
match = [
domain_match(flow.request.host, domain),
flow.request.port == port,
flow.request.path.startswith(path)
]
if all(match):
c = self.jar[(domain, port, path)]
l.extend([cookies.format_cookie_header(c[name].items(multi=True)) for name in c.keys()])
if l:
cookie_list.extend(c.items())
if cookie_list:
# FIXME: we need to formalise this...
flow.request.stickycookie = True
flow.request.headers["cookie"] = "; ".join(l)
flow.metadata["stickycookie"] = True
flow.request.headers["cookie"] = cookies.format_cookie_header(cookie_list)

View File

@@ -1,14 +1,19 @@
from mitmproxy.net.http import http1
from mitmproxy import exceptions
from mitmproxy import ctx
from mitmproxy.utils import human
class StreamBodies:
def __init__(self):
self.max_size = None
def configure(self, options, updated):
self.max_size = options.stream_large_bodies
def configure(self, updated):
if "stream_large_bodies" in updated and ctx.options.stream_large_bodies:
try:
self.max_size = human.parse_size(ctx.options.stream_large_bodies)
except ValueError as e:
raise exceptions.OptionsError(e)
def run(self, f, is_request):
if self.max_size:

View File

@@ -1,68 +0,0 @@
import os.path
from mitmproxy import exceptions
from mitmproxy import flowfilter
from mitmproxy import io
class StreamFile:
def __init__(self):
self.stream = None
self.filt = None
self.active_flows = set() # type: Set[flow.Flow]
def start_stream_to_path(self, path, mode, flt):
path = os.path.expanduser(path)
try:
f = open(path, mode)
except IOError as v:
raise exceptions.OptionsError(str(v))
self.stream = io.FilteredFlowWriter(f, flt)
self.active_flows = set()
def configure(self, options, updated):
# We're already streaming - stop the previous stream and restart
if "filtstr" in updated:
if options.filtstr:
self.filt = flowfilter.parse(options.filtstr)
if not self.filt:
raise exceptions.OptionsError(
"Invalid filter specification: %s" % options.filtstr
)
else:
self.filt = None
if "streamfile" in updated:
if self.stream:
self.done()
if options.streamfile:
if options.streamfile_append:
mode = "ab"
else:
mode = "wb"
self.start_stream_to_path(options.streamfile, mode, self.filt)
def tcp_start(self, flow):
if self.stream:
self.active_flows.add(flow)
def tcp_end(self, flow):
if self.stream:
self.stream.add(flow)
self.active_flows.discard(flow)
def response(self, flow):
if self.stream:
self.stream.add(flow)
self.active_flows.discard(flow)
def request(self, flow):
if self.stream:
self.active_flows.add(flow)
def done(self):
if self.stream:
for flow in self.active_flows:
self.stream.add(flow)
self.active_flows = set([])
self.stream.fo.close()
self.stream = None

View File

@@ -2,23 +2,25 @@ import sys
import click
from mitmproxy import log
from mitmproxy import ctx
# These get over-ridden by the save execution context. Keep them around so we
# can log directly.
realstdout = sys.stdout
realstderr = sys.stderr
class TermLog:
def __init__(self, outfile=None):
self.options = None
self.outfile = outfile
def configure(self, options, updated):
self.options = options
def log(self, e):
if log.log_tier(e.level) == log.log_tier("error"):
outfile = self.outfile or sys.stderr
outfile = self.outfile or realstderr
else:
outfile = self.outfile or sys.stdout
outfile = self.outfile or realstdout
if self.options.verbosity >= log.log_tier(e.level):
if ctx.options.verbosity >= log.log_tier(e.level):
click.secho(
e.msg,
file=outfile,

View File

@@ -0,0 +1,17 @@
from mitmproxy import ctx
from mitmproxy.utils import human
"""
A tiny addon to print the proxy status to terminal. Eventually this could
also print some stats on exit.
"""
class TermStatus:
def running(self):
if ctx.options.server:
ctx.log.info(
"Proxy server listening at http://{}".format(
human.format_address(ctx.master.server.address)
)
)

View File

@@ -2,6 +2,7 @@ import re
import base64
from mitmproxy import exceptions
from mitmproxy import ctx
from mitmproxy.utils import strutils
@@ -26,20 +27,17 @@ class UpstreamAuth():
"""
def __init__(self):
self.auth = None
self.root_mode = None
def configure(self, options, updated):
def configure(self, updated):
# FIXME: We're doing this because our proxy core is terminally confused
# at the moment. Ideally, we should be able to check if we're in
# reverse proxy mode at the HTTP layer, so that scripts can put the
# proxy in reverse proxy mode for specific requests.
if "mode" in updated:
self.root_mode = options.mode
if "upstream_auth" in updated:
if options.upstream_auth is None:
if ctx.options.upstream_auth is None:
self.auth = None
else:
self.auth = parse_upstream_auth(options.upstream_auth)
self.auth = parse_upstream_auth(ctx.options.upstream_auth)
def http_connect(self, f):
if self.auth and f.mode == "upstream":
@@ -49,5 +47,5 @@ class UpstreamAuth():
if self.auth:
if f.mode == "upstream" and not f.server_conn.via:
f.request.headers["Proxy-Authorization"] = self.auth
elif self.root_mode == "reverse":
elif ctx.options.mode == "reverse":
f.request.headers["Proxy-Authorization"] = self.auth
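For orientation, a rough sketch (not part of the diff) of what the upstream_auth value becomes: the option is a "username:password" string, and parse_upstream_auth (defined earlier in this file, outside the hunk) turns it into a Proxy-Authorization header value, presumably HTTP Basic given the base64 import above. The helper below is hypothetical and only illustrates that header format.

import base64

def make_proxy_auth(credentials: str) -> bytes:
    # credentials is "username:password", as expected by the upstream_auth option
    return b"Basic " + base64.b64encode(credentials.encode("utf8"))

print(make_proxy_auth("user:hunter2"))  # b'Basic dXNlcjpodW50ZXIy'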

View File

@@ -18,6 +18,11 @@ import sortedcontainers
import mitmproxy.flow
from mitmproxy import flowfilter
from mitmproxy import exceptions
from mitmproxy import command
from mitmproxy import connections
from mitmproxy import ctx
from mitmproxy import io
from mitmproxy import http # noqa
# The underlying sorted list implementation expects the sort key to be stable
# for the lifetime of the object. However, if we sort by size, for instance,
@@ -34,7 +39,7 @@ class _OrderKey:
def __init__(self, view):
self.view = view
def generate(self, f: mitmproxy.flow.Flow) -> typing.Any: # pragma: no cover
def generate(self, f: http.HTTPFlow) -> typing.Any: # pragma: no cover
pass
def refresh(self, f):
@@ -64,22 +69,22 @@ class _OrderKey:
class OrderRequestStart(_OrderKey):
def generate(self, f: mitmproxy.flow.Flow) -> datetime.datetime:
def generate(self, f: http.HTTPFlow) -> datetime.datetime:
return f.request.timestamp_start or 0
class OrderRequestMethod(_OrderKey):
def generate(self, f: mitmproxy.flow.Flow) -> str:
def generate(self, f: http.HTTPFlow) -> str:
return f.request.method
class OrderRequestURL(_OrderKey):
def generate(self, f: mitmproxy.flow.Flow) -> str:
def generate(self, f: http.HTTPFlow) -> str:
return f.request.url
class OrderKeySize(_OrderKey):
def generate(self, f: mitmproxy.flow.Flow) -> int:
def generate(self, f: http.HTTPFlow) -> int:
s = 0
if f.request.raw_content:
s += len(f.request.raw_content)
@@ -109,16 +114,16 @@ class View(collections.Sequence):
self.default_order = OrderRequestStart(self)
self.orders = dict(
time = self.default_order,
method = OrderRequestMethod(self),
url = OrderRequestURL(self),
size = OrderKeySize(self),
time = OrderRequestStart(self), method = OrderRequestMethod(self),
url = OrderRequestURL(self), size = OrderKeySize(self),
)
self.order_key = self.default_order
self.order_reversed = False
self.focus_follow = False
self._view = sortedcontainers.SortedListWithKey(key = self.order_key)
self._view = sortedcontainers.SortedListWithKey(
key = self.order_key
)
# The sig_view* signals broadcast events that affect the view. That is,
# an update to a flow in the store but not in the view does not trigger
@@ -165,7 +170,7 @@ class View(collections.Sequence):
def __len__(self):
return len(self._view)
def __getitem__(self, offset) -> mitmproxy.flow.Flow:
def __getitem__(self, offset) -> typing.Any:
return self._view[self._rev(offset)]
# Reflect some methods to the efficient underlying implementation
@@ -177,7 +182,7 @@ class View(collections.Sequence):
def index(self, f: mitmproxy.flow.Flow, start: int = 0, stop: typing.Optional[int] = None) -> int:
return self._rev(self._view.index(f, start, stop))
def __contains__(self, f: mitmproxy.flow.Flow) -> bool:
def __contains__(self, f: typing.Any) -> bool:
return self._view.__contains__(f)
def _order_key_name(self):
@@ -197,7 +202,36 @@ class View(collections.Sequence):
self.sig_view_refresh.send(self)
# API
def toggle_marked(self):
@command.command("view.focus.next")
def focus_next(self) -> None:
"""
Set focus to the next flow.
"""
idx = self.focus.index + 1
if self.inbounds(idx):
self.focus.flow = self[idx]
@command.command("view.focus.prev")
def focus_prev(self) -> None:
"""
Set focus to the previous flow.
"""
idx = self.focus.index - 1
if self.inbounds(idx):
self.focus.flow = self[idx]
@command.command("view.order.options")
def order_options(self) -> typing.Sequence[str]:
"""
A list of all the orders we support.
"""
return list(sorted(self.orders.keys()))
@command.command("view.marked.toggle")
def toggle_marked(self) -> None:
"""
Toggle whether to show marked views only.
"""
self.show_marked = not self.show_marked
self._refilter()
@@ -221,7 +255,7 @@ class View(collections.Sequence):
self.filter = flt or matchall
self._refilter()
def clear(self):
def clear(self) -> None:
"""
Clears both the store and view.
"""
@@ -241,55 +275,19 @@ class View(collections.Sequence):
self._refilter()
self.sig_store_refresh.send(self)
def add(self, f: mitmproxy.flow.Flow) -> bool:
def add(self, flows: typing.Sequence[mitmproxy.flow.Flow]) -> None:
"""
Adds a flow to the state. If the flow already exists, it is
ignored.
"""
if f.id not in self._store:
self._store[f.id] = f
if self.filter(f):
self._base_add(f)
if self.focus_follow:
self.focus.flow = f
self.sig_view_add.send(self, flow=f)
def remove(self, f: mitmproxy.flow.Flow):
"""
Removes the flow from the underlying store and the view.
"""
if f.id in self._store:
if f in self._view:
self._view.remove(f)
self.sig_view_remove.send(self, flow=f)
del self._store[f.id]
self.sig_store_remove.send(self, flow=f)
def update(self, f: mitmproxy.flow.Flow):
"""
Updates a flow. If the flow is not in the state, it's ignored.
"""
if f.id in self._store:
if self.filter(f):
if f not in self._view:
for f in flows:
if f.id not in self._store:
self._store[f.id] = f
if self.filter(f):
self._base_add(f)
if self.focus_follow:
self.focus.flow = f
self.sig_view_add.send(self, flow=f)
else:
# This is a tad complicated. The sortedcontainers
# implementation assumes that the order key is stable. If
# it changes mid-way Very Bad Things happen. We detect when
# this happens, and refresh the item.
self.order_key.refresh(f)
self.sig_view_update.send(self, flow=f)
else:
try:
self._view.remove(f)
self.sig_view_remove.send(self, flow=f)
except ValueError:
# The value was not in the view
pass
def get_by_id(self, flow_id: str) -> typing.Optional[mitmproxy.flow.Flow]:
"""
@@ -298,48 +296,199 @@ class View(collections.Sequence):
"""
return self._store.get(flow_id)
@command.command("view.getval")
def getvalue(self, f: mitmproxy.flow.Flow, key: str, default: str) -> str:
"""
Get a value from the settings store for the specified flow.
"""
return self.settings[f].get(key, default)
@command.command("view.setval.toggle")
def setvalue_toggle(
self,
flows: typing.Sequence[mitmproxy.flow.Flow],
key: str
) -> None:
"""
Toggle a boolean value in the settings store, setting the value to
the string "true" or "false".
"""
updated = []
for f in flows:
current = self.settings[f].get(key, "false")
self.settings[f][key] = "false" if current == "true" else "true"
updated.append(f)
ctx.master.addons.trigger("update", updated)
@command.command("view.setval")
def setvalue(
self,
flows: typing.Sequence[mitmproxy.flow.Flow],
key: str, value: str
) -> None:
"""
Set a value in the settings store for the specified flows.
"""
updated = []
for f in flows:
self.settings[f][key] = value
updated.append(f)
ctx.master.addons.trigger("update", updated)
@command.command("view.load")
def load_file(self, path: str) -> None:
"""
Load flows into the view, without processing them with addons.
"""
for i in io.FlowReader(open(path, "rb")).stream():
# Do this to get a new ID, so we can load the same file N times and
# get new flows each time. It would be more efficient to just have a
# .newid() method or something.
self.add([i.copy()])
@command.command("view.go")
def go(self, dst: int) -> None:
"""
Go to a specified offset. Positive offsets are from the beginning of
the view, negative from the end of the view, so that 0 is the first
flow, -1 is the last flow.
"""
if len(self) == 0:
return
if dst < 0:
dst = len(self) + dst
if dst < 0:
dst = 0
if dst > len(self) - 1:
dst = len(self) - 1
self.focus.flow = self[dst]
@command.command("view.duplicate")
def duplicate(self, flows: typing.Sequence[mitmproxy.flow.Flow]) -> None:
"""
Duplicates the specified flows, and sets the focus to the first
duplicate.
"""
dups = [f.copy() for f in flows]
if dups:
self.add(dups)
self.focus.flow = dups[0]
ctx.log.alert("Duplicated %s flows" % len(dups))
@command.command("view.remove")
def remove(self, flows: typing.Sequence[mitmproxy.flow.Flow]) -> None:
"""
Removes the flow from the underlying store and the view.
"""
for f in flows:
if f.id in self._store:
if f.killable:
f.kill()
if f in self._view:
self._view.remove(f)
self.sig_view_remove.send(self, flow=f)
del self._store[f.id]
self.sig_store_remove.send(self, flow=f)
@command.command("view.resolve")
def resolve(self, spec: str) -> typing.Sequence[mitmproxy.flow.Flow]:
"""
Resolve a flow list specification to an actual list of flows.
"""
if spec == "@all":
return [i for i in self._store.values()]
if spec == "@focus":
return [self.focus.flow] if self.focus.flow else []
elif spec == "@shown":
return [i for i in self]
elif spec == "@hidden":
return [i for i in self._store.values() if i not in self._view]
elif spec == "@marked":
return [i for i in self._store.values() if i.marked]
elif spec == "@unmarked":
return [i for i in self._store.values() if not i.marked]
else:
filt = flowfilter.parse(spec)
if not filt:
raise exceptions.CommandError("Invalid flow filter: %s" % spec)
return [i for i in self._store.values() if filt(i)]
@command.command("view.create")
def create(self, method: str, url: str) -> None:
req = http.HTTPRequest.make(method.upper(), url)
c = connections.ClientConnection.make_dummy(("", 0))
s = connections.ServerConnection.make_dummy((req.host, req.port))
f = http.HTTPFlow(c, s)
f.request = req
f.request.headers["Host"] = req.host
self.add([f])
# Event handlers
def configure(self, opts, updated):
if "filter" in updated:
def configure(self, updated):
if "view_filter" in updated:
filt = None
if opts.filter:
filt = flowfilter.parse(opts.filter)
if ctx.options.view_filter:
filt = flowfilter.parse(ctx.options.view_filter)
if not filt:
raise exceptions.OptionsError(
"Invalid interception filter: %s" % opts.filter
"Invalid interception filter: %s" % ctx.options.view_filter
)
self.set_filter(filt)
if "console_order" in updated:
if opts.console_order is None:
self.set_order(self.default_order)
else:
if opts.console_order not in self.orders:
raise exceptions.OptionsError(
"Unknown flow order: %s" % opts.console_order
)
self.set_order(self.orders[opts.console_order])
if ctx.options.console_order not in self.orders:
raise exceptions.OptionsError(
"Unknown flow order: %s" % ctx.options.console_order
)
self.set_order(self.orders[ctx.options.console_order])
if "console_order_reversed" in updated:
self.set_reversed(opts.console_order_reversed)
self.set_reversed(ctx.options.console_order_reversed)
if "console_focus_follow" in updated:
self.focus_follow = opts.console_focus_follow
self.focus_follow = ctx.options.console_focus_follow
def request(self, f):
self.add(f)
self.add([f])
def error(self, f):
self.update(f)
self.update([f])
def response(self, f):
self.update(f)
self.update([f])
def intercept(self, f):
self.update(f)
self.update([f])
def resume(self, f):
self.update(f)
self.update([f])
def kill(self, f):
self.update(f)
self.update([f])
def update(self, flows: typing.Sequence[mitmproxy.flow.Flow]) -> None:
"""
Updates a list of flows. If a flow is not in the state, it's ignored.
"""
for f in flows:
if f.id in self._store:
if self.filter(f):
if f not in self._view:
self._base_add(f)
if self.focus_follow:
self.focus.flow = f
self.sig_view_add.send(self, flow=f)
else:
# This is a tad complicated. The sortedcontainers
# implementation assumes that the order key is stable. If
# it changes mid-way Very Bad Things happen. We detect when
# this happens, and refresh the item.
self.order_key.refresh(f)
self.sig_view_update.send(self, flow=f)
else:
try:
self._view.remove(f)
self.sig_view_remove.send(self, flow=f)
except ValueError:
# The value was not in the view
pass
class Focus:
@@ -371,6 +520,7 @@ class Focus:
def index(self) -> typing.Optional[int]:
if self.flow:
return self.view.index(self.flow)
return None
@index.setter
def index(self, idx):
@@ -404,7 +554,7 @@ class Focus:
class Settings(collections.Mapping):
def __init__(self, view: View) -> None:
self.view = view
self._values = {} # type: typing.MutableMapping[str, mitmproxy.flow.Flow]
self._values = {} # type: typing.MutableMapping[str, typing.Dict]
view.sig_store_remove.connect(self._sig_store_remove)
view.sig_store_refresh.connect(self._sig_store_refresh)
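To make the reworked view API above easier to follow, here is a rough usage sketch. The module path (mitmproxy.addons.view) and the no-argument View() constructor are assumptions; the command names, flow specifications (@all, @focus, @shown, @hidden, @marked, @unmarked, or a filter expression) and the new sequence-based add/update/remove signatures come from the hunks above.

from mitmproxy.addons import view

v = view.View()
v.create("GET", "http://example.com/")   # builds a dummy flow and add()s it
flows = v.resolve("@all")                # resolve a flow spec to concrete flows
v.update(flows)                          # add/update/remove now take sequences of flows
print(len(v))                            # 1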

View File

@@ -13,6 +13,10 @@ class WSGIApp:
def __init__(self, app, host, port):
self.app, self.host, self.port = app, host, port
@property
def name(self):
return "wsgiapp:%s:%s" % (self.host, self.port)
def serve(self, app, flow):
"""
Serves app on flow, and prevents further handling of the flow.

View File

@@ -93,9 +93,9 @@ def dummy_cert(privkey, cacert, commonname, sans):
try:
ipaddress.ip_address(i.decode("ascii"))
except ValueError:
ss.append(b"DNS: %s" % i)
ss.append(b"DNS:%s" % i)
else:
ss.append(b"IP: %s" % i)
ss.append(b"IP:%s" % i)
ss = b", ".join(ss)
cert = OpenSSL.crypto.X509()
@@ -356,14 +356,14 @@ class CertStore:
class _GeneralName(univ.Choice):
# We are only interested in dNSNames. We use a default handler to ignore
# other types.
# TODO: We should also handle iPAddresses.
# We only care about dNSName and iPAddress
componentType = namedtype.NamedTypes(
namedtype.NamedType('dNSName', char.IA5String().subtype(
implicitTag=tag.Tag(tag.tagClassContext, tag.tagFormatSimple, 2)
)
),
)),
namedtype.NamedType('iPAddress', univ.OctetString().subtype(
implicitTag=tag.Tag(tag.tagClassContext, tag.tagFormatSimple, 7)
)),
)
@@ -384,9 +384,6 @@ class SSLCert(serializable.Serializable):
def __eq__(self, other):
return self.digest("sha256") == other.digest("sha256")
def __ne__(self, other):
return not self.__eq__(other)
def get_state(self):
return self.to_pem()
@@ -477,5 +474,10 @@ class SSLCert(serializable.Serializable):
except PyAsn1Error:
continue
for i in dec[0]:
altnames.append(i[0].asOctets())
if i[0] is None and isinstance(i[1], univ.OctetString) and not isinstance(i[1], char.IA5String):
# This would give back the IP address: b'.'.join([str(e).encode() for e in i[1].asNumbers()])
continue
else:
e = i[0].asOctets()
altnames.append(e)
return altnames
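The dummy_cert hunk above drops the space after "DNS:"/"IP:" in the subjectAltName string. A small hypothetical helper illustrating that logic (format_sans is not part of mitmproxy):

import ipaddress

def format_sans(sans):
    # sans is a list of bytes hostnames/addresses, as in dummy_cert() above
    ss = []
    for name in sans:
        try:
            ipaddress.ip_address(name.decode("ascii"))
        except ValueError:
            ss.append(b"DNS:" + name)
        else:
            ss.append(b"IP:" + name)
    return b", ".join(ss)

print(format_sans([b"example.com", b"192.0.2.1"]))
# b'DNS:example.com, IP:192.0.2.1'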

195
mitmproxy/command.py Normal file
View File

@@ -0,0 +1,195 @@
"""
This module manages and invokes typed commands.
"""
import inspect
import typing
import shlex
import textwrap
import functools
import sys
from mitmproxy.utils import typecheck
from mitmproxy import exceptions
from mitmproxy import flow
Cuts = typing.Sequence[
typing.Sequence[typing.Union[str, bytes]]
]
def typename(t: type, ret: bool) -> str:
"""
Translates a type to an explanatory string. If ret is True, we're
looking at a return type, else we're looking at a parameter type.
"""
if issubclass(t, (str, int, bool)):
return t.__name__
elif t == typing.Sequence[flow.Flow]:
return "[flow]" if ret else "flowspec"
elif t == typing.Sequence[str]:
return "[str]"
elif t == Cuts:
return "[cuts]" if ret else "cutspec"
elif t == flow.Flow:
return "flow"
else: # pragma: no cover
raise NotImplementedError(t)
class Command:
def __init__(self, manager, path, func) -> None:
self.path = path
self.manager = manager
self.func = func
sig = inspect.signature(self.func)
self.help = None
if func.__doc__:
txt = func.__doc__.strip()
self.help = "\n".join(textwrap.wrap(txt))
self.has_positional = False
for i in sig.parameters.values():
# This is the kind for *args parameters
if i.kind == i.VAR_POSITIONAL:
self.has_positional = True
self.paramtypes = [v.annotation for v in sig.parameters.values()]
self.returntype = sig.return_annotation
def paramnames(self) -> typing.Sequence[str]:
v = [typename(i, False) for i in self.paramtypes]
if self.has_positional:
v[-1] = "*" + v[-1][1:-1]
return v
def retname(self) -> str:
return typename(self.returntype, True) if self.returntype else ""
def signature_help(self) -> str:
params = " ".join(self.paramnames())
ret = self.retname()
if ret:
ret = " -> " + ret
return "%s %s%s" % (self.path, params, ret)
def call(self, args: typing.Sequence[str]):
"""
Call the command with a set of arguments. At this point, all arguments are strings.
"""
if not self.has_positional and (len(self.paramtypes) != len(args)):
raise exceptions.CommandError("Usage: %s" % self.signature_help())
remainder = [] # type: typing.Sequence[str]
if self.has_positional:
remainder = args[len(self.paramtypes) - 1:]
args = args[:len(self.paramtypes) - 1]
pargs = []
for i in range(len(args)):
if typecheck.check_command_type(args[i], self.paramtypes[i]):
pargs.append(args[i])
else:
pargs.append(parsearg(self.manager, args[i], self.paramtypes[i]))
if remainder:
if typecheck.check_command_type(remainder, self.paramtypes[-1]):
pargs.extend(remainder)
else:
raise exceptions.CommandError("Invalid value type.")
with self.manager.master.handlecontext():
ret = self.func(*pargs)
if not typecheck.check_command_type(ret, self.returntype):
raise exceptions.CommandError("Command returned unexpected data")
return ret
class CommandManager:
def __init__(self, master):
self.master = master
self.commands = {}
def collect_commands(self, addon):
for i in dir(addon):
if not i.startswith("__"):
o = getattr(addon, i)
if hasattr(o, "command_path"):
self.add(o.command_path, o)
def add(self, path: str, func: typing.Callable):
self.commands[path] = Command(self, path, func)
def call_args(self, path, args):
"""
Call a command using a list of string arguments. May raise CommandError.
"""
if path not in self.commands:
raise exceptions.CommandError("Unknown command: %s" % path)
return self.commands[path].call(args)
def call(self, cmdstr: str):
"""
Call a command using a string. May raise CommandError.
"""
parts = shlex.split(cmdstr)
if not len(parts) >= 1:
raise exceptions.CommandError("Invalid command: %s" % cmdstr)
return self.call_args(parts[0], parts[1:])
def dump(self, out=sys.stdout) -> None:
cmds = list(self.commands.values())
cmds.sort(key=lambda x: x.signature_help())
for c in cmds:
for hl in (c.help or "").splitlines():
print("# " + hl, file=out)
print(c.signature_help(), file=out)
print(file=out)
def parsearg(manager: CommandManager, spec: str, argtype: type) -> typing.Any:
"""
Convert a string to an argument of the appropriate type.
"""
if issubclass(argtype, str):
return spec
elif argtype == bool:
if spec == "true":
return True
elif spec == "false":
return False
else:
raise exceptions.CommandError(
"Booleans are 'true' or 'false', got %s" % spec
)
elif issubclass(argtype, int):
try:
return int(spec)
except ValueError as e:
raise exceptions.CommandError("Expected an integer, got %s." % spec)
elif argtype == typing.Sequence[flow.Flow]:
return manager.call_args("view.resolve", [spec])
elif argtype == Cuts:
return manager.call_args("cut", [spec])
elif argtype == flow.Flow:
flows = manager.call_args("view.resolve", [spec])
if len(flows) != 1:
raise exceptions.CommandError(
"Command requires one flow, specification matched %s." % len(flows)
)
return flows[0]
elif argtype == typing.Sequence[str]:
return [i.strip() for i in spec.split(",")]
else:
raise exceptions.CommandError("Unsupported argument type: %s" % argtype)
def command(path):
def decorator(function):
@functools.wraps(function)
def wrapper(*args, **kwargs):
return function(*args, **kwargs)
wrapper.__dict__["command_path"] = path
return wrapper
return decorator
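To tie the new module together, a minimal sketch of how an addon might use the decorator above and how a CommandManager invokes the result. The addon class and command path below are hypothetical; only command.command, CommandManager.collect_commands and CommandManager.call come from this file.

from mitmproxy import command

class MyAddon:
    @command.command("myaddon.echo")
    def echo(self, text: str) -> str:
        """Return the given text unchanged."""
        return text

# With a CommandManager bound to a running master (cm = CommandManager(master)):
#   cm.collect_commands(MyAddon())
#   cm.call("myaddon.echo hello")   # -> "hello"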

View File

@@ -1,6 +1,7 @@
import time
import os
import uuid
from mitmproxy import stateobject
from mitmproxy import certs
@@ -17,6 +18,7 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
address: Remote address
ssl_established: True if TLS is established, False otherwise
clientcert: The TLS client certificate
mitmcert: The MITM'ed TLS server certificate presented to the client
timestamp_start: Connection start timestamp
timestamp_ssl_setup: TLS established timestamp
timestamp_end: Connection end timestamp
@@ -40,6 +42,8 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
self.clientcert = None
self.ssl_established = None
self.id = str(uuid.uuid4())
self.mitmcert = None
self.timestamp_start = time.time()
self.timestamp_end = None
self.timestamp_ssl_setup = None
@@ -52,26 +56,47 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
return bool(self.connection) and not self.finished
def __repr__(self):
if self.ssl_established:
tls = "[{}] ".format(self.tls_version)
else:
tls = ""
if self.alpn_proto_negotiated:
alpn = "[ALPN: {}] ".format(
strutils.bytes_to_escaped_str(self.alpn_proto_negotiated)
)
else:
alpn = ""
return "<ClientConnection: {ssl}{alpn}{address}>".format(
ssl="[ssl] " if self.ssl_established else "",
return "<ClientConnection: {tls}{alpn}{host}:{port}>".format(
tls=tls,
alpn=alpn,
address=repr(self.address)
host=self.address[0],
port=self.address[1],
)
def __eq__(self, other):
if isinstance(other, ClientConnection):
return self.id == other.id
return False
def __hash__(self):
return hash(self.id)
@property
def tls_established(self):
return self.ssl_established
@tls_established.setter
def tls_established(self, value):
self.ssl_established = value
_stateobject_attributes = dict(
address=tcp.Address,
id=str,
address=tuple,
ssl_established=bool,
clientcert=certs.SSLCert,
mitmcert=certs.SSLCert,
timestamp_start=float,
timestamp_ssl_setup=float,
timestamp_end=float,
@@ -96,8 +121,10 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
@classmethod
def make_dummy(cls, address):
return cls.from_state(dict(
address=dict(address=address, use_ipv6=False),
id=str(uuid.uuid4()),
address=address,
clientcert=None,
mitmcert=None,
ssl_established=False,
timestamp_start=None,
timestamp_end=None,
@@ -108,9 +135,10 @@ class ClientConnection(tcp.BaseHandler, stateobject.StateObject):
tls_version=None,
))
def convert_to_ssl(self, *args, **kwargs):
super().convert_to_ssl(*args, **kwargs)
def convert_to_ssl(self, cert, *args, **kwargs):
super().convert_to_ssl(cert, *args, **kwargs)
self.timestamp_ssl_setup = time.time()
self.mitmcert = cert
sni = self.connection.get_servername()
if sni:
self.sni = sni.decode("idna")
@@ -138,6 +166,7 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
cert: The certificate presented by the remote during the TLS handshake
sni: Server Name Indication sent by the proxy during the TLS handshake
alpn_proto_negotiated: The negotiated application protocol
tls_version: TLS version
via: The underlying server connection (e.g. the connection to the upstream proxy in upstream proxy mode)
timestamp_start: Connection start timestamp
timestamp_tcp_setup: TCP ACK received timestamp
@@ -148,7 +177,9 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
def __init__(self, address, source_address=None, spoof_source_address=None):
tcp.TCPClient.__init__(self, address, source_address, spoof_source_address)
self.id = str(uuid.uuid4())
self.alpn_proto_negotiated = None
self.tls_version = None
self.via = None
self.timestamp_start = None
self.timestamp_end = None
@@ -160,35 +191,50 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
def __repr__(self):
if self.ssl_established and self.sni:
ssl = "[ssl: {0}] ".format(self.sni)
tls = "[{}: {}] ".format(self.tls_version or "TLS", self.sni)
elif self.ssl_established:
ssl = "[ssl] "
tls = "[{}] ".format(self.tls_version or "TLS")
else:
ssl = ""
tls = ""
if self.alpn_proto_negotiated:
alpn = "[ALPN: {}] ".format(
strutils.bytes_to_escaped_str(self.alpn_proto_negotiated)
)
else:
alpn = ""
return "<ServerConnection: {ssl}{alpn}{address}>".format(
ssl=ssl,
return "<ServerConnection: {tls}{alpn}{host}:{port}>".format(
tls=tls,
alpn=alpn,
address=repr(self.address)
host=self.address[0],
port=self.address[1],
)
def __eq__(self, other):
if isinstance(other, ServerConnection):
return self.id == other.id
return False
def __hash__(self):
return hash(self.id)
@property
def tls_established(self):
return self.ssl_established
@tls_established.setter
def tls_established(self, value):
self.ssl_established = value
_stateobject_attributes = dict(
address=tcp.Address,
ip_address=tcp.Address,
source_address=tcp.Address,
id=str,
address=tuple,
ip_address=tuple,
source_address=tuple,
ssl_established=bool,
cert=certs.SSLCert,
sni=str,
alpn_proto_negotiated=bytes,
tls_version=str,
timestamp_start=float,
timestamp_tcp_setup=float,
timestamp_ssl_setup=float,
@@ -204,12 +250,14 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
@classmethod
def make_dummy(cls, address):
return cls.from_state(dict(
address=dict(address=address, use_ipv6=False),
ip_address=dict(address=address, use_ipv6=False),
id=str(uuid.uuid4()),
address=address,
ip_address=address,
cert=None,
sni=None,
alpn_proto_negotiated=None,
source_address=dict(address=('', 0), use_ipv6=False),
tls_version=None,
source_address=('', 0),
ssl_established=False,
timestamp_start=None,
timestamp_tcp_setup=None,
@@ -239,13 +287,14 @@ class ServerConnection(tcp.TCPClient, stateobject.StateObject):
else:
path = os.path.join(
clientcerts,
self.address.host.encode("idna").decode()) + ".pem"
self.address[0].encode("idna").decode()) + ".pem"
if os.path.exists(path):
clientcert = path
self.convert_to_ssl(cert=clientcert, sni=sni, **kwargs)
self.sni = sni
self.alpn_proto_negotiated = self.get_alpn_proto_negotiated()
self.tls_version = self.connection.get_protocol_version_name()
self.timestamp_ssl_setup = time.time()
def finish(self):
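One theme of this hunk is that addresses are now plain (host, port) tuples rather than tcp.Address objects. A quick sketch using make_dummy, which the new view.create command above also relies on (exact repr output is approximate):

from mitmproxy import connections

c = connections.ClientConnection.make_dummy(("example.com", 80))
s = connections.ServerConnection.make_dummy(("example.com", 80))
print(c.address[0], c.address[1])   # example.com 80
print(repr(s))                      # roughly "<ServerConnection: example.com:80>"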

View File

@@ -36,12 +36,14 @@ def get(name: str) -> Optional[View]:
for i in views:
if i.name.lower() == name.lower():
return i
return None
def get_by_shortcut(c: str) -> Optional[View]:
for i in views:
if i.prompt[1] == c:
return i
return None
def add(view: View) -> None:

View File

@@ -37,7 +37,7 @@ def parse_gif(data: bytes) -> Metadata:
descriptor = img.logical_screen_descriptor
parts = [
('Format', 'Compuserve GIF'),
('Version', "GIF{}".format(img.header.version.decode('ASCII'))),
('Version', "GIF{}".format(img.hdr.version)),
('Size', "{} x {} px".format(descriptor.screen_width, descriptor.screen_height)),
('background', str(descriptor.bg_color_index))
]

View File

@@ -128,6 +128,7 @@ def is_inline_text(a: Token, b: Token, c: Token) -> bool:
if isinstance(a, Tag) and isinstance(b, Text) and isinstance(c, Tag):
if a.is_opening and "\n" not in b.data and c.is_closing and a.tag == c.tag:
return True
return False
def is_inline(prev2: Token, prev1: Token, t: Token, next1: Token, next2: Token) -> bool:
@@ -140,6 +141,7 @@ def is_inline(prev2: Token, prev1: Token, t: Token, next1: Token, next2: Token)
return True # <div></div> (start tag)
if isinstance(prev1, Tag) and prev1.is_opening and t.is_closing and prev1.tag == t.tag:
return True # <div></div> (end tag)
return False
class ElementStack:

View File

@@ -1,16 +1,20 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was exif.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/24e2d00048b8084ceec30a187a79cb87a79a48ba/image/exif.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
from .exif_le import ExifLe
from .exif_be import ExifBe
class Exif(KaitaiStruct):
def __init__(self, _io, _parent=None, _root=None):
self._io = _io

View File

@@ -1,14 +1,17 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was exif_be.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/24e2d00048b8084ceec30a187a79cb87a79a48ba/image/exif_be.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
class ExifBe(KaitaiStruct):
def __init__(self, _io, _parent=None, _root=None):
self._io = _io
@@ -569,3 +572,5 @@ class ExifBe(KaitaiStruct):
self._m_ifd0 = self._root.Ifd(self._io, self, self._root)
self._io.seek(_pos)
return self._m_ifd0 if hasattr(self, '_m_ifd0') else None

View File

@@ -1,14 +1,17 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was exif_le.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/24e2d00048b8084ceec30a187a79cb87a79a48ba/image/exif_le.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
class ExifLe(KaitaiStruct):
def __init__(self, _io, _parent=None, _root=None):
self._io = _io
@@ -569,3 +572,5 @@ class ExifLe(KaitaiStruct):
self._m_ifd0 = self._root.Ifd(self._io, self, self._root)
self._io.seek(_pos)
return self._m_ifd0 if hasattr(self, '_m_ifd0') else None

View File

@@ -1,14 +1,17 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was gif.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/562154250bea0081fed4e232751b934bc270a0c7/image/gif.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
class Gif(KaitaiStruct):
class BlockType(Enum):
@@ -24,8 +27,8 @@ class Gif(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.header = self._root.Header(self._io, self, self._root)
self.logical_screen_descriptor = self._root.LogicalScreenDescriptor(self._io, self, self._root)
self.hdr = self._root.Header(self._io, self, self._root)
self.logical_screen_descriptor = self._root.LogicalScreenDescriptorStruct(self._io, self, self._root)
if self.logical_screen_descriptor.has_color_table:
self._raw_global_color_table = self._io.read_bytes((self.logical_screen_descriptor.color_table_size * 3))
io = KaitaiStream(BytesIO(self._raw_global_color_table))
@@ -55,7 +58,7 @@ class Gif(KaitaiStruct):
self.blue = self._io.read_u1()
class LogicalScreenDescriptor(KaitaiStruct):
class LogicalScreenDescriptorStruct(KaitaiStruct):
def __init__(self, _io, _parent=None, _root=None):
self._io = _io
self._parent = _parent
@@ -163,7 +166,7 @@ class Gif(KaitaiStruct):
self._parent = _parent
self._root = _root if _root else self
self.magic = self._io.ensure_fixed_contents(struct.pack('3b', 71, 73, 70))
self.version = self._io.read_bytes(3)
self.version = (self._io.read_bytes(3)).decode(u"ASCII")
class ExtGraphicControl(KaitaiStruct):
@@ -245,3 +248,6 @@ class Gif(KaitaiStruct):
self.body = self._root.ExtGraphicControl(self._io, self, self._root)
else:
self.body = self._root.Subblocks(self._io, self, self._root)

View File

@@ -1,15 +1,19 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was jpeg.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/24e2d00048b8084ceec30a187a79cb87a79a48ba/image/jpeg.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
from .exif import Exif
class Jpeg(KaitaiStruct):
class ComponentId(Enum):
@@ -127,7 +131,7 @@ class Jpeg(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.magic = self._io.read_strz("ASCII", 0, False, True, True)
self.magic = (self._io.read_bytes_term(0, False, True, True)).decode(u"ASCII")
_on = self.magic
if _on == u"Exif":
self.body = self._root.ExifInJpeg(self._io, self, self._root)
@@ -195,7 +199,7 @@ class Jpeg(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.magic = self._io.read_str_byte_limit(5, "ASCII")
self.magic = (self._io.read_bytes(5)).decode(u"ASCII")
self.version_major = self._io.read_u1()
self.version_minor = self._io.read_u1()
self.density_units = self._root.SegmentApp0.DensityUnit(self._io.read_u1())

View File

@@ -0,0 +1,10 @@
#!/usr/bin/env bash
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/exif_be.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/exif_le.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/exif.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/gif.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/jpeg.ksy
wget -N https://raw.githubusercontent.com/kaitai-io/kaitai_struct_formats/master/image/png.ksy
kaitai-struct-compiler --target python --opaque-types=true *.ksy

View File

@@ -1,14 +1,17 @@
# This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
# The source was png.ksy from here - https://github.com/kaitai-io/kaitai_struct_formats/blob/9370c720b7d2ad329102d89bdc880ba6a706ef26/image/png.ksy
import array
import struct
import zlib
from enum import Enum
from pkg_resources import parse_version
from kaitaistruct import KaitaiStruct, KaitaiStream, BytesIO
from kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
class Png(KaitaiStruct):
class ColorType(Enum):
@@ -51,7 +54,7 @@ class Png(KaitaiStruct):
self._parent = _parent
self._root = _root if _root else self
self.len = self._io.read_u4be()
self.type = self._io.read_str_byte_limit(4, "UTF-8")
self.type = (self._io.read_bytes(4)).decode(u"UTF-8")
_on = self.type
if _on == u"iTXt":
self._raw_body = self._io.read_bytes(self.len)
@@ -194,7 +197,7 @@ class Png(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.keyword = self._io.read_strz("UTF-8", 0, False, True, True)
self.keyword = (self._io.read_bytes_term(0, False, True, True)).decode(u"UTF-8")
self.compression_method = self._io.read_u1()
self._raw_text_datastream = self._io.read_bytes_full()
self.text_datastream = zlib.decompress(self._raw_text_datastream)
@@ -259,12 +262,12 @@ class Png(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.keyword = self._io.read_strz("UTF-8", 0, False, True, True)
self.keyword = (self._io.read_bytes_term(0, False, True, True)).decode(u"UTF-8")
self.compression_flag = self._io.read_u1()
self.compression_method = self._io.read_u1()
self.language_tag = self._io.read_strz("ASCII", 0, False, True, True)
self.translated_keyword = self._io.read_strz("UTF-8", 0, False, True, True)
self.text = self._io.read_str_eos("UTF-8")
self.language_tag = (self._io.read_bytes_term(0, False, True, True)).decode(u"ASCII")
self.translated_keyword = (self._io.read_bytes_term(0, False, True, True)).decode(u"UTF-8")
self.text = (self._io.read_bytes_full()).decode(u"UTF-8")
class TextChunk(KaitaiStruct):
@@ -272,8 +275,8 @@ class Png(KaitaiStruct):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.keyword = self._io.read_strz("iso8859-1", 0, False, True, True)
self.text = self._io.read_str_eos("iso8859-1")
self.keyword = (self._io.read_bytes_term(0, False, True, True)).decode(u"iso8859-1")
self.text = (self._io.read_bytes_full()).decode(u"iso8859-1")
class TimeChunk(KaitaiStruct):
@@ -287,3 +290,6 @@ class Png(KaitaiStruct):
self.hour = self._io.read_u1()
self.minute = self._io.read_u1()
self.second = self._io.read_u1()

View File

@@ -41,7 +41,7 @@ class ASCommandResponse:
raise ValueError("Empty WBXML body passed")
except Exception as e:
self.xmlString = None
raise ValueError("Error: {0}".format(e.message))
raise ValueError("Error: {0}".format(e))
def getWBXMLBytes(self):
return self.wbxmlBytes

View File

@@ -40,7 +40,7 @@ class ASWBXMLByteQueue(Queue):
Queue.__init__(self)
for byte in wbxmlBytes:
self.put(ord(byte))
self.put(byte)
self.bytesEnqueued += 1

View File

@@ -39,12 +39,12 @@ class ASWBXMLCodePage:
self.tagLookup[tag] = token
def getToken(self, tag):
if self.tagLookup.has_key(tag):
if tag in self.tagLookup:
return self.tagLookup[tag]
return 0xFF
def getTag(self, token):
if self.tokenLookup.has_key(token):
if token in self.tokenLookup:
return self.tokenLookup[token]
return None

View File

@@ -1,4 +1,3 @@
import functools
import queue
from mitmproxy import exceptions
@@ -14,8 +13,8 @@ class Channel:
def ask(self, mtype, m):
"""
Decorate a message with a reply attribute, and send it to the
master. Then wait for a response.
Decorate a message with a reply attribute, and send it to the master.
Then wait for a response.
Raises:
exceptions.Kill: All connections should be closed immediately.
@@ -36,84 +35,42 @@ class Channel:
def tell(self, mtype, m):
"""
Decorate a message with a dummy reply attribute, send it to the
master, then return immediately.
Decorate a message with a dummy reply attribute, send it to the master,
then return immediately.
"""
m.reply = DummyReply()
self.q.put((mtype, m))
def handler(f):
@functools.wraps(f)
def wrapper(master, message):
if not hasattr(message, "reply"):
raise exceptions.ControlException("Message %s has no reply attribute" % message)
# DummyReplys may be reused multiple times.
# We only clear them up on the next handler so that we can access value and
# state in the meantime.
if isinstance(message.reply, DummyReply):
message.reply.reset()
# The following ensures that inheritance with wrapped handlers in the
# base class works. If we're the first handler, then responsibility for
# acking is ours. If not, it's someone else's and we ignore it.
handling = False
# We're the first handler - ack responsibility is ours
if message.reply.state == "unhandled":
handling = True
message.reply.handle()
with master.handlecontext():
ret = f(master, message)
if handling:
master.addons(f.__name__, message)
# Reset the handled flag - it's common for us to feed the same object
# through handlers repeatedly, so we don't want this to persist across
# calls.
if handling and message.reply.state == "handled":
message.reply.take()
if not message.reply.has_message:
message.reply.ack()
message.reply.commit()
# DummyReplys may be reused multiple times.
if isinstance(message.reply, DummyReply):
message.reply.mark_reset()
return ret
# Mark this function as a handler wrapper
wrapper.__dict__["__handler"] = True
return wrapper
NO_REPLY = object() # special object we can distinguish from a valid "None" reply.
class Reply:
"""
Messages sent through a channel are decorated with a "reply" attribute.
This object is used to respond to the message through the return
channel.
Messages sent through a channel are decorated with a "reply" attribute. This
object is used to respond to the message through the return channel.
"""
def __init__(self, obj):
self.obj = obj
self.q = queue.Queue() # type: queue.Queue
self._state = "unhandled" # "unhandled" -> "handled" -> "taken" -> "committed"
self.value = NO_REPLY # holds the reply value. May change before things are actually committed.
self._state = "start" # "start" -> "taken" -> "committed"
# Holds the reply value. May change before things are actually committed.
self.value = NO_REPLY
@property
def state(self):
"""
The state the reply is currently in. A normal reply object goes sequentially through the following lifecycle:
The state the reply is currently in. A normal reply object goes
sequentially through the following lifecycle:
1. unhandled: Initial State.
2. handled: The reply object has been handled by the topmost handler function.
3. taken: The reply object has been taken to be commited.
4. committed: The reply has been sent back to the requesting party.
1. start: Initial State.
2. taken: The reply object has been taken to be commited.
3. committed: The reply has been sent back to the requesting party.
This attribute is read-only and can only be modified by calling one of the state transition functions.
This attribute is read-only and can only be modified by calling one of the
state transition functions.
"""
return self._state
@@ -121,47 +78,43 @@ class Reply:
def has_message(self):
return self.value != NO_REPLY
def handle(self):
"""
Reply are handled by controller.handlers, which may be nested. The first handler takes
responsibility and handles the reply.
"""
if self.state != "unhandled":
raise exceptions.ControlException("Reply is {}, but expected it to be unhandled.".format(self.state))
self._state = "handled"
def take(self):
"""
Scripts or other parties may "take" a reply out of a normal flow.
For example, intercepted flows are taken out so that the connection thread does not proceed.
"""
if self.state != "handled":
raise exceptions.ControlException("Reply is {}, but expected it to be handled.".format(self.state))
if self.state != "start":
raise exceptions.ControlException(
"Reply is {}, but expected it to be start.".format(self.state)
)
self._state = "taken"
def commit(self):
"""
Ultimately, messages are committed. This is done either automatically by the handler
if the message is not taken or manually by the entity which called .take().
Ultimately, messages are committed. This is done either automatically
if the message is not taken or manually by the entity which called
.take().
"""
if self.state != "taken":
raise exceptions.ControlException("Reply is {}, but expected it to be taken.".format(self.state))
raise exceptions.ControlException(
"Reply is {}, but expected it to be taken.".format(self.state)
)
if not self.has_message:
raise exceptions.ControlException("There is no reply message.")
self._state = "committed"
self.q.put(self.value)
def ack(self, force=False):
if self.state not in {"start", "taken"}:
raise exceptions.ControlException(
"Reply is {}, but expected it to be start or taken.".format(self.state)
)
self.send(self.obj, force)
def kill(self, force=False):
self.send(exceptions.Kill, force)
def send(self, msg, force=False):
if self.state not in ("handled", "taken"):
raise exceptions.ControlException(
"Reply is {}, did not expect a call to .send().".format(self.state)
)
if self.has_message and not force:
raise exceptions.ControlException("There is already a reply message.")
self.value = msg
@@ -175,7 +128,7 @@ class Reply:
class DummyReply(Reply):
"""
A reply object that is not connected to anything. In contrast to regular
Reply objects, DummyReply objects are reset to "unhandled" at the end of an
Reply objects, DummyReply objects are reset to "start" at the end of an
handler so that they can be used multiple times. Useful when we need an
object to seem like it has a channel, and during testing.
"""
@@ -190,7 +143,7 @@ class DummyReply(Reply):
def reset(self):
if self._should_reset:
self._state = "unhandled"
self._state = "start"
self.value = NO_REPLY
def __del__(self):
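A small sketch of the simplified lifecycle described above ("start" -> "taken" -> "committed"), assuming this file is mitmproxy.controller; take(), send() and commit() and their preconditions come from the hunks above.

from mitmproxy import controller

reply = controller.Reply(obj="a message")
assert reply.state == "start"
reply.take()            # taken out of the normal flow, e.g. by a script
reply.send("a value")   # stage the reply value
reply.commit()          # commit puts the value on the reply channel
assert reply.state == "committed"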

View File

@@ -1,2 +1,7 @@
import mitmproxy.master # noqa
import mitmproxy.log # noqa
import mitmproxy.options # noqa
master = None # type: "mitmproxy.master.Master"
log = None # type: "mitmproxy.log.Log"
options = None # type: "mitmproxy.options.Options"
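These globals are what the configure(updated) rewrites throughout this diff rely on. A minimal sketch of the new addon style (ExampleAddon and its option handling are hypothetical; view_filter is an option referenced in the view addon above):

from mitmproxy import ctx

class ExampleAddon:
    def __init__(self):
        self.filter_expr = None

    def configure(self, updated):
        # configure() now receives only the set of changed option names and
        # reads current values from the global ctx.options.
        if "view_filter" in updated:
            self.filter_expr = ctx.options.view_filter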

View File

@@ -32,8 +32,10 @@ Events = frozenset([
"configure",
"done",
"log",
"start",
"load",
"running",
"tick",
"update",
])

View File

@@ -93,15 +93,22 @@ class SetServerNotAllowedException(MitmproxyException):
pass
class CommandError(Exception):
pass
class OptionsError(MitmproxyException):
pass
class AddonError(MitmproxyException):
class AddonManagerError(MitmproxyException):
pass
class AddonHalt(MitmproxyException):
"""
Raised by addons to signal that no further handlers should handle this event.
"""
pass

View File

@@ -1,188 +0,0 @@
import io
import json
import pprint
import re
import textwrap
from typing import Any
from mitmproxy import http
def _native(s):
if isinstance(s, bytes):
return s.decode()
return s
def dictstr(items, indent: str) -> str:
lines = []
for k, v in items:
lines.append(indent + "%s: %s,\n" % (repr(_native(k)), repr(_native(v))))
return "{\n%s}\n" % "".join(lines)
def curl_command(flow: http.HTTPFlow) -> str:
data = "curl "
request = flow.request.copy()
request.decode(strict=False)
for k, v in request.headers.items(multi=True):
data += "-H '%s:%s' " % (k, v)
if request.method != "GET":
data += "-X %s " % request.method
data += "'%s'" % request.url
if request.content:
data += " --data-binary '%s'" % _native(request.content)
return data
def python_arg(arg: str, val: Any) -> str:
if not val:
return ""
if arg:
arg += "="
arg_str = "{}{},\n".format(
arg,
pprint.pformat(val, 79 - len(arg))
)
return textwrap.indent(arg_str, " " * 4)
def python_code(flow: http.HTTPFlow):
code = io.StringIO()
def writearg(arg, val):
code.write(python_arg(arg, val))
code.write("import requests\n")
code.write("\n")
if flow.request.method.lower() in ("get", "post", "put", "head", "delete", "patch"):
code.write("response = requests.{}(\n".format(flow.request.method.lower()))
else:
code.write("response = requests.request(\n")
writearg("", flow.request.method)
url_without_query = flow.request.url.split("?", 1)[0]
writearg("", url_without_query)
writearg("params", list(flow.request.query.fields))
headers = flow.request.headers.copy()
# requests adds those by default.
for x in (":authority", "host", "content-length"):
headers.pop(x, None)
writearg("headers", dict(headers))
try:
if "json" not in flow.request.headers.get("content-type", ""):
raise ValueError()
writearg("json", json.loads(flow.request.text))
except ValueError:
writearg("data", flow.request.content)
code.seek(code.tell() - 2) # remove last comma
code.write("\n)\n")
code.write("\n")
code.write("print(response.text)")
return code.getvalue()
def locust_code(flow):
code = textwrap.dedent("""
from locust import HttpLocust, TaskSet, task
class UserBehavior(TaskSet):
def on_start(self):
''' on_start is called when a Locust start before any task is scheduled '''
self.{name}()
@task()
def {name}(self):
url = self.locust.host + '{path}'
{headers}{params}{data}
self.response = self.client.request(
method='{method}',
url=url,{args}
)
### Additional tasks can go here ###
class WebsiteUser(HttpLocust):
task_set = UserBehavior
min_wait = 1000
max_wait = 3000
""").strip()
name = re.sub('\W|^(?=\d)', '_', flow.request.path.strip("/").split("?", 1)[0])
if not name:
new_name = "_".join([str(flow.request.host), str(flow.request.timestamp_start)])
name = re.sub('\W|^(?=\d)', '_', new_name)
path_without_query = flow.request.path.split("?")[0]
args = ""
headers = ""
if flow.request.headers:
lines = [
(_native(k), _native(v)) for k, v in flow.request.headers.fields
if _native(k).lower() not in [":authority", "host", "cookie"]
]
lines = [" '%s': '%s',\n" % (k, v) for k, v in lines]
headers += "\n headers = {\n%s }\n" % "".join(lines)
args += "\n headers=headers,"
params = ""
if flow.request.query:
lines = [
" %s: %s,\n" % (repr(k), repr(v))
for k, v in
flow.request.query.collect()
]
params = "\n params = {\n%s }\n" % "".join(lines)
args += "\n params=params,"
data = ""
if flow.request.content:
data = "\n data = '''%s'''\n" % _native(flow.request.content)
args += "\n data=data,"
code = code.format(
name=name,
path=path_without_query,
headers=headers,
params=params,
data=data,
method=flow.request.method,
args=args,
)
return code
def locust_task(flow):
code = locust_code(flow)
start_task = len(code.split('@task')[0]) - 4
end_task = -19 - len(code.split('### Additional')[1])
task_code = code[start_task:end_task]
return task_code
def url(flow):
return flow.request.url
EXPORTERS = [
("content", "c", None),
("headers+content", "h", None),
("url", "u", url),
("as curl command", "r", curl_command),
("as python code", "p", python_code),
("as locust code", "l", locust_code),
("as locust task", "t", locust_task),
]

View File

@@ -78,7 +78,7 @@ class Flow(stateobject.StateObject):
self._backup = None # type: typing.Optional[Flow]
self.reply = None # type: typing.Optional[controller.Reply]
self.marked = False # type: bool
self.metadata = dict() # type: typing.Dict[str, str]
self.metadata = dict() # type: typing.Dict[str, typing.Any]
_stateobject_attributes = dict(
id=str,
@@ -93,7 +93,7 @@ class Flow(stateobject.StateObject):
def get_state(self):
d = super().get_state()
d.update(version=version.IVERSION)
d.update(version=version.FLOW_FORMAT_VERSION)
if self._backup and self._backup != d:
d.update(backup=self._backup)
return d
@@ -112,8 +112,9 @@ class Flow(stateobject.StateObject):
def copy(self):
f = super().copy()
f.id = str(uuid.uuid4())
f.live = False
if self.reply is not None:
f.reply = controller.DummyReply()
return f
def modified(self):
@@ -143,7 +144,7 @@ class Flow(stateobject.StateObject):
@property
def killable(self):
return self.reply and self.reply.state in {"handled", "taken"}
return self.reply and self.reply.state == "taken"
def kill(self):
"""
@@ -151,8 +152,9 @@ class Flow(stateobject.StateObject):
"""
self.error = Error("Connection killed")
self.intercepted = False
# reply.state should only be "handled" or "taken" here.
# if none of this is the case, .take() will raise an exception.
# reply.state should be "taken" here, or .take() will raise an
# exception.
if self.reply.state != "taken":
self.reply.take()
self.reply.kill(force=True)

View File

@@ -44,7 +44,7 @@ from mitmproxy import flow
from mitmproxy.utils import strutils
import pyparsing as pp
from typing import Callable
from typing import Callable, Sequence, Type # noqa
def only(*types):
@@ -69,6 +69,8 @@ class _Token:
class _Action(_Token):
code = None # type: str
help = None # type: str
@classmethod
def make(klass, s, loc, toks):
@@ -162,15 +164,14 @@ def _check_content_type(rex, message):
class FAsset(_Action):
code = "a"
help = "Match asset in response: CSS, Javascript, Flash, images."
ASSET_TYPES = [
ASSET_TYPES = [re.compile(x) for x in [
b"text/javascript",
b"application/x-javascript",
b"application/javascript",
b"text/css",
b"image/.*",
b"application/x-shockwave-flash"
]
ASSET_TYPES = [re.compile(x) for x in ASSET_TYPES]
]]
@only(http.HTTPFlow)
def __call__(self, f):
@@ -319,10 +320,14 @@ class FDomain(_Rex):
code = "d"
help = "Domain"
flags = re.IGNORECASE
is_binary = False
@only(http.HTTPFlow)
def __call__(self, f):
return bool(self.re.search(f.request.data.host))
return bool(
self.re.search(f.request.host) or
self.re.search(f.request.pretty_host)
)
class FUrl(_Rex):
@@ -339,7 +344,7 @@ class FUrl(_Rex):
@only(http.HTTPFlow)
def __call__(self, f):
return self.re.search(f.request.url)
return self.re.search(f.request.pretty_url)
class FSrc(_Rex):
@@ -348,7 +353,10 @@ class FSrc(_Rex):
is_binary = False
def __call__(self, f):
return f.client_conn.address and self.re.search(repr(f.client_conn.address))
if not f.client_conn or not f.client_conn.address:
return False
r = "{}:{}".format(f.client_conn.address[0], f.client_conn.address[1])
return f.client_conn.address and self.re.search(r)
class FDst(_Rex):
@@ -357,7 +365,10 @@ class FDst(_Rex):
is_binary = False
def __call__(self, f):
return f.server_conn.address and self.re.search(repr(f.server_conn.address))
if not f.server_conn or not f.server_conn.address:
return False
r = "{}:{}".format(f.server_conn.address[0], f.server_conn.address[1])
return f.server_conn.address and self.re.search(r)
class _Int(_Action):
@@ -425,7 +436,8 @@ filter_unary = [
FReq,
FResp,
FTCP,
]
FWebSocket,
] # type: Sequence[Type[_Action]]
filter_rex = [
FBod,
FBodRequest,
@@ -441,7 +453,7 @@ filter_rex = [
FMethod,
FSrc,
FUrl,
]
] # type: Sequence[Type[_Rex]]
filter_int = [
FCode
]
@@ -527,17 +539,17 @@ def match(flt, flow):
help = []
for i in filter_unary:
for a in filter_unary:
help.append(
("~%s" % i.code, i.help)
("~%s" % a.code, a.help)
)
for i in filter_rex:
for b in filter_rex:
help.append(
("~%s regex" % i.code, i.help)
("~%s regex" % b.code, b.help)
)
for i in filter_int:
for c in filter_int:
help.append(
("~%s int" % i.code, i.help)
("~%s int" % c.code, c.help)
)
help.sort()
help.extend(
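For reference, a rough sketch of using the filter language these classes implement (filter codes taken from this file; the exact expression below is illustrative):

from mitmproxy import flowfilter

# ~d matches the request domain, ~m the method; parse() returns a callable
# matcher, or a falsy value for an invalid expression.
flt = flowfilter.parse("~d example\\.com & ~m GET")
if not flt:
    raise ValueError("invalid filter expression")
# flt(flow) -> True when the flow matches, as used by the view and
# streaming addons above.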

View File

@@ -5,7 +5,6 @@ from mitmproxy import flow
from mitmproxy.net import http
from mitmproxy import version
from mitmproxy.net import tcp
from mitmproxy import connections # noqa
@@ -53,9 +52,7 @@ class HTTPRequest(http.Request):
def get_state(self):
state = super().get_state()
state.update(
is_replay=self.is_replay
)
state["is_replay"] = self.is_replay
return state
def set_state(self, state):
@@ -168,11 +165,12 @@ class HTTPFlow(flow.Flow):
""" What mode was the proxy layer in when receiving this request? """
_stateobject_attributes = flow.Flow._stateobject_attributes.copy()
_stateobject_attributes.update(
# mypy doesn't support update with kwargs
_stateobject_attributes.update(dict(
request=HTTPRequest,
response=HTTPResponse,
mode=str
)
))
def __repr__(self):
s = "<HTTPFlow"
@@ -224,8 +222,7 @@ def make_error_response(
status_code=status_code,
reason=reason,
message=html.escape(message),
)
body = body.encode("utf8", "replace")
).encode("utf8", "replace")
if not headers:
headers = http.Headers(
@@ -245,9 +242,8 @@ def make_error_response(
def make_connect_request(address):
address = tcp.Address.wrap(address)
return HTTPRequest(
"authority", b"CONNECT", None, address.host, address.port, None, b"HTTP/1.1",
"authority", b"CONNECT", None, address[0], address[1], None, b"HTTP/1.1",
http.Headers(), b""
)

7
mitmproxy/io/__init__.py Normal file
View File

@@ -0,0 +1,7 @@
from .io import FlowWriter, FlowReader, FilteredFlowWriter, read_flows_from_paths
__all__ = [
"FlowWriter", "FlowReader", "FilteredFlowWriter", "read_flows_from_paths"
]

Some files were not shown because too many files have changed in this diff.