forked from extern/httpie-cli
Fix encoding error with non-prettified encoded responses (#1168)
* Fix encoding error with non-prettified encoded responses
Removed `--format-option response.as` and promoted `--response-as`: using
the format option would be misleading as it is now also used by non-prettified
responses.
* Encoding refactoring
* split --response-as into --response-mime and --response-charset
* add support for Content-Type charset for requests printed to terminal
* add support for charset detection for requests printed to terminal without a Content-Type charset
* etc.
* `test_unicode.py` → `test_encoding.py`
* Drop sequence length check
* Clean-up tests
* [skip ci] Tweaks
* Use the compatible release clause for `charset_normalizer` requirement
Cf. https://www.python.org/dev/peps/pep-0440/#version-specifiers
* Clean-up
* Partially revert d52a4833e4
* Changelog
* Tweak tests
* [skip ci] Better test name
* Cleanup tests and add request body charset detection
* More test suite cleanups
* Cleanup
* Fix code style in test
* Improve detect_encoding() docstring
* Uniformize pytest.mark.parametrize() calls
* [skip ci] Comment out TODOs (will be tackled in a specific PR)
Co-authored-by: Jakub Roztocil <jakub@roztocil.co>
parent 7989e438d2
commit 4f1c9441c5
@@ -6,9 +6,9 @@ This project adheres to [Semantic Versioning](https://semver.org/).

 ## [2.6.0.dev0](https://github.com/httpie/httpie/compare/2.5.0...master) (unreleased)

 - Added support for formatting & coloring of JSON bodies preceded by non-JSON data (e.g., an XXSI prefix). ([#1130](https://github.com/httpie/httpie/issues/1130))
-- Added `--format-options=response.as:CONTENT_TYPE` to allow overriding the response `Content-Type`. ([#1134](https://github.com/httpie/httpie/issues/1134))
+- Added `--response-charset` to allow overriding the response encoding for terminal display purposes. ([#1168](https://github.com/httpie/httpie/issues/1168))
-- Added `--response-as` shortcut for setting the response `Content-Type`-related `--format-options`. ([#1134](https://github.com/httpie/httpie/issues/1134))
+- Added `--response-mime` to allow overriding the response mime type for coloring and formatting for the terminal. ([#1168](https://github.com/httpie/httpie/issues/1168))
-- Improved handling of prettified responses without correct `Content-Type` encoding. ([#1110](https://github.com/httpie/httpie/issues/1110))
+- Improved handling of responses with incorrect `Content-Type`. ([#1110](https://github.com/httpie/httpie/issues/1110), [#1168](https://github.com/httpie/httpie/issues/1168))
 - Installed plugins are now listed in `--debug` output. ([#1165](https://github.com/httpie/httpie/issues/1165))
 - Fixed duplicate keys preservation of JSON data. ([#1163](https://github.com/httpie/httpie/issues/1163))

@@ -1413,6 +1413,8 @@ HTTPie does several things by default in order to make its terminal output easy

 ### Colors and formatting

+<!-- TODO: mention body colors/formatting are based on content-type + --response-mime (heuristics for JSON content-type) -->
+
 Syntax highlighting is applied to HTTP headers and bodies (where it makes sense).
 You can choose your preferred color scheme via the `--style` option if you don’t like the default one.
 There are dozens of styles available, here are just a few notable ones:
@@ -1449,12 +1451,11 @@ The `--format-options=opt1:value,opt2:value` option allows you to control how th
 when formatting is applied. The following options are available:

 | Option           | Default value | Shortcuts                |
-| ---------------: | :-----------: | ----------------------------------------- |
+| ---------------: | :-----------: | ------------------------ |
 | `headers.sort`   | `true`        | `--sorted`, `--unsorted` |
 | `json.format`    | `true`        | N/A                      |
 | `json.indent`    | `4`           | N/A                      |
 | `json.sort_keys` | `true`        | `--sorted`, `--unsorted` |
-| `response.as`    | `''`          | [`--response-as`](#response-content-type) |
 | `xml.format`     | `true`        | N/A                      |
 | `xml.indent`     | `2`           | N/A                      |

@@ -1471,11 +1472,10 @@ sorting-related format options (currently it means JSON keys and headers):

 This is something you will typically store as one of the default options in your [config](#config) file.

-#### Response `Content-Type`
+### Response `Content-Type`

-The `--response-as=value` option is a shortcut for `--format-options response.as:value`,
-and it allows you to override the response `Content-Type` sent by the server.
-That makes it possible for HTTPie to pretty-print the response even when the server specifies the type incorrectly.
+The `--response-as=value` option allows you to override the response `Content-Type` sent by the server.
+That makes it possible for HTTPie to print the response even when the server specifies the type incorrectly.

 For example, the following request will force the response to be treated as XML:

@@ -1495,27 +1495,6 @@ $ http --response-as='text/plain; charset=big5' pie.dev/get

 Given the encoding is not sent by the server, HTTPie will auto-detect it.

-### Binary data
-
-Binary data is suppressed for terminal output, which makes it safe to perform requests to URLs that send back binary data.
-Binary data is also suppressed in redirected but prettified output.
-The connection is closed as soon as we know that the response body is binary,
-
-```bash
-$ http pie.dev/bytes/2000
-```
-
-You will nearly instantly see something like this:
-
-```http
-HTTP/1.1 200 OK
-Content-Type: application/octet-stream
-
-+-----------------------------------------+
-| NOTE: binary data not shown in terminal |
-+-----------------------------------------+
-```
-
 ### Redirected output

 HTTPie uses a different set of defaults for redirected output than for [terminal output](#terminal-output).
@@ -1557,6 +1536,42 @@ function httpless {
 }
 ```

+### Binary data
+
+Binary data is suppressed for terminal output, which makes it safe to perform requests to URLs that send back binary data.
+Binary data is also suppressed in redirected but prettified output.
+The connection is closed as soon as we know that the response body is binary,
+
+```bash
+$ http pie.dev/bytes/2000
+```
+
+You will nearly instantly see something like this:
+
+```http
+HTTP/1.1 200 OK
+Content-Type: application/octet-stream
+
++-----------------------------------------+
+| NOTE: binary data not shown in terminal |
++-----------------------------------------+
+```
+
+<!--
+### Display encoding
+
+TODO:
+(both request/response)
+
+- we look at content-type
+- else we detect
+- short texts default to utf8
+
+(only response)
+
+- --response-charset allows overwriting
+- -->
+
 ## Download mode

 HTTPie features a download mode in which it acts similarly to `wget`.
@@ -458,8 +458,6 @@ class HTTPieArgumentParser(argparse.ArgumentParser):

     def _process_format_options(self):
         format_options = self.args.format_options or []
-        if self.args.response_as is not None:
-            format_options.append('response.as:' + self.args.response_as)
         parsed_options = PARSED_DEFAULT_FORMAT_OPTIONS
         for options_group in format_options:
             parsed_options = parse_format_options(options_group, defaults=parsed_options)
@@ -242,3 +242,19 @@ PARSED_DEFAULT_FORMAT_OPTIONS = parse_format_options(
     s=','.join(DEFAULT_FORMAT_OPTIONS),
     defaults=None,
 )
+
+
+def response_charset_type(encoding: str) -> str:
+    try:
+        ''.encode(encoding)
+    except LookupError:
+        raise argparse.ArgumentTypeError(
+            f'{encoding!r} is not a supported encoding')
+    return encoding
+
+
+def response_mime_type(mime_type: str) -> str:
+    if mime_type.count('/') != 1:
+        raise argparse.ArgumentTypeError(
+            f'{mime_type!r} doesn’t look like a mime type; use type/subtype')
+    return mime_type
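For reference, a minimal sketch (not part of the diff) of how these two validators behave when wired into `argparse`; the import path `httpie.cli.argtypes` is the one used elsewhere in this commit, and invalid values surface as normal CLI argument errors:

```python
import argparse

from httpie.cli.argtypes import response_charset_type, response_mime_type

parser = argparse.ArgumentParser()
parser.add_argument('--response-charset', type=response_charset_type)
parser.add_argument('--response-mime', type=response_mime_type)

args = parser.parse_args(['--response-charset', 'big5', '--response-mime', 'text/xml'])
print(args.response_charset, args.response_mime)  # big5 text/xml

# Rejected before HTTPie runs:
#   --response-charset=foobar -> "'foobar' is not a supported encoding"
#   --response-mime=foobar    -> "'foobar' doesn’t look like a mime type; use type/subtype"
```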
@@ -85,13 +85,11 @@ PRETTY_MAP = {
 PRETTY_STDOUT_TTY_ONLY = object()


-EMPTY_FORMAT_OPTION = "''"
 DEFAULT_FORMAT_OPTIONS = [
     'headers.sort:true',
     'json.format:true',
     'json.indent:4',
     'json.sort_keys:true',
-    'response.as:' + EMPTY_FORMAT_OPTION,
     'xml.format:true',
     'xml.indent:2',
 ]
@@ -9,7 +9,7 @@ from .. import __doc__, __version__
 from .argparser import HTTPieArgumentParser
 from .argtypes import (
     KeyValueArgType, SessionNameValidator,
-    readable_file_arg,
+    readable_file_arg, response_charset_type, response_mime_type,
 )
 from .constants import (
     DEFAULT_FORMAT_OPTIONS, OUTPUT_OPTIONS,
@@ -310,21 +310,30 @@ output_processing.add_argument(
 )

 output_processing.add_argument(
-    '--response-as',
-    metavar='CONTENT_TYPE',
+    '--response-charset',
+    metavar='ENCODING',
+    type=response_charset_type,
     help='''
-    Override the response Content-Type for formatting purposes, e.g.:
+    Override the response encoding for terminal display purposes, e.g.:

-        --response-as=application/xml
-        --response-as=charset=utf-8
-        --response-as='application/xml; charset=utf-8'
-
-    It is a shortcut for:
-
-        --format-options=response.as:CONTENT_TYPE
+        --response-charset=utf8
+        --response-charset=big5

     '''
 )

+output_processing.add_argument(
+    '--response-mime',
+    metavar='MIME_TYPE',
+    type=response_mime_type,
+    help='''
+    Override the response mime type for coloring and formatting for the terminal, e.g.:
+
+        --response-mime=application/json
+        --response-mime=text/xml
+
+    '''
+)
+
 output_processing.add_argument(
     '--format-options',
@@ -12,7 +12,7 @@ import requests
 import urllib3

 from . import __version__
 from .cli.dicts import RequestHeadersDict
-from .constants import UTF8
+from .encoding import UTF8
 from .plugins.registry import plugin_manager
 from .sessions import get_httpie_session
 from .ssl import AVAILABLE_SSL_VERSION_ARG_MAPPING, HTTPieHTTPSAdapter
@@ -1,37 +0,0 @@
-from typing import Union
-
-from charset_normalizer import from_bytes
-
-from .constants import UTF8
-
-Bytes = Union[bytearray, bytes]
-
-
-def detect_encoding(content: Bytes) -> str:
-    """Detect the `content` encoding.
-
-    Fallback to UTF-8 when no suitable encoding found.
-
-    """
-    match = from_bytes(bytes(content)).best()
-    return match.encoding if match else UTF8
-
-
-def decode(content: Bytes, encoding: str) -> str:
-    """Decode `content` using the given `encoding`.
-
-    If no `encoding` is provided, the best effort is to guess it from `content`.
-
-    Unicode errors are replaced.
-
-    """
-    if not encoding:
-        encoding = detect_encoding(content)
-    return content.decode(encoding, 'replace')
-
-
-def encode(content: str, encoding: str) -> bytes:
-    """Encode `content` using the given `encoding`.
-
-    Unicode errors are replaced.
-
-    """
-    return content.encode(encoding, 'replace')
@@ -2,3 +2,53 @@ import sys

 is_windows = 'win32' in str(sys.platform).lower()
+
+
+try:
+    from functools import cached_property
+except ImportError:
+    # Can be removed once we drop Python <3.8 support.
+    # Taken from `django.utils.functional.cached_property`.
+    class cached_property:
+        """
+        Decorator that converts a method with a single self argument into a
+        property cached on the instance.
+
+        A cached property can be made out of an existing method:
+        (e.g. ``url = cached_property(get_absolute_url)``).
+        The optional ``name`` argument is obsolete as of Python 3.6 and will be
+        deprecated in Django 4.0 (#30127).
+        """
+        name = None
+
+        @staticmethod
+        def func(instance):
+            raise TypeError(
+                'Cannot use cached_property instance without calling '
+                '__set_name__() on it.'
+            )
+
+        def __init__(self, func, name=None):
+            self.real_func = func
+            self.__doc__ = getattr(func, '__doc__')
+
+        def __set_name__(self, owner, name):
+            if self.name is None:
+                self.name = name
+                self.func = self.real_func
+            elif name != self.name:
+                raise TypeError(
+                    "Cannot assign the same cached_property to two different names "
+                    "(%r and %r)." % (self.name, name)
+                )
+
+        def __get__(self, instance, cls=None):
+            """
+            Call the function and put the return value in instance.__dict__ so that
+            subsequent attribute access on the instance returns the cached value
+            instead of calling cached_property.__get__().
+            """
+            if instance is None:
+                return self
+            res = instance.__dict__[self.name] = self.func(instance)
+            return res
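A small usage sketch of the backported decorator (not part of the diff; it behaves like `functools.cached_property` on Python ≥ 3.8, and the class and attribute names below are made up for illustration):

```python
class Message:
    def __init__(self, payload):
        self._payload = payload

    @cached_property
    def decoded(self):
        print('decoding once')  # runs only on first access
        return self._payload.decode('utf-8', 'replace')


m = Message(b'hello')
m.decoded  # prints 'decoding once' and caches the result in m.__dict__
m.decoded  # served from the instance dict; the method is not called again
```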
@@ -5,7 +5,7 @@ from typing import Union

 from . import __version__
 from .compat import is_windows
-from .constants import UTF8
+from .encoding import UTF8


 ENV_XDG_CONFIG_HOME = 'XDG_CONFIG_HOME'
@@ -1,2 +0,0 @@
-# UTF-8 encoding name
-UTF8 = 'utf-8'
@@ -11,7 +11,7 @@ except ImportError:

 from .compat import is_windows
 from .config import DEFAULT_CONFIG_DIR, Config, ConfigFileError
-from .constants import UTF8
+from .encoding import UTF8

 from .utils import repr_dict

httpie/encoding.py (new file, 50 lines)
@@ -0,0 +1,50 @@
+from typing import Union
+
+from charset_normalizer import from_bytes
+from charset_normalizer.constant import TOO_SMALL_SEQUENCE
+
+UTF8 = 'utf-8'
+
+ContentBytes = Union[bytearray, bytes]
+
+
+def detect_encoding(content: ContentBytes) -> str:
+    """
+    We default to UTF-8 if text too short, because the detection
+    can return a random encoding leading to confusing results
+    given the `charset_normalizer` version (< 2.0.5).
+
+    >>> too_short = ']"foo"'
+    >>> detected = from_bytes(too_short.encode()).best().encoding
+    >>> detected
+    'ascii'
+    >>> too_short.encode().decode(detected)
+    ']"foo"'
+    """
+    encoding = UTF8
+    if len(content) > TOO_SMALL_SEQUENCE:
+        match = from_bytes(bytes(content)).best()
+        if match:
+            encoding = match.encoding
+    return encoding
+
+
+def smart_decode(content: ContentBytes, encoding: str) -> str:
+    """Decode `content` using the given `encoding`.
+    If no `encoding` is provided, the best effort is to guess it from `content`.
+
+    Unicode errors are replaced.
+
+    """
+    if not encoding:
+        encoding = detect_encoding(content)
+    return content.decode(encoding, 'replace')
+
+
+def smart_encode(content: str, encoding: str) -> bytes:
+    """Encode `content` using the given `encoding`.
+
+    Unicode errors are replaced.
+
+    """
+    return content.encode(encoding, 'replace')
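A quick sanity check of the new helpers (not part of the diff; requires `charset_normalizer`, reuses the sample text from the tests added further below, and treats the detected encoding as a best-effort guess rather than a guarantee):

```python
from httpie.encoding import UTF8, detect_encoding, smart_decode, smart_encode

# Long enough to exceed TOO_SMALL_SEQUENCE, so detection actually runs.
data = ('Všichni lidé jsou si rovni. ' * 2).encode('windows-1250')

detect_encoding(b'{"a": 1}')    # too short -> falls back to 'utf-8'
text = smart_decode(data, '')   # no declared charset -> detect, then decode with errors replaced
raw = smart_encode(text, UTF8)  # re-encode, replacing characters that cannot be represented
```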
@@ -1,34 +1,33 @@
-from abc import ABCMeta, abstractmethod
-from typing import Iterable, Optional
+from typing import Iterable
 from urllib.parse import urlsplit

-from .constants import UTF8
-from .utils import split_cookies
+from .utils import split_cookies, parse_content_type_header
+from .compat import cached_property


-class HTTPMessage(metaclass=ABCMeta):
+class HTTPMessage:
     """Abstract class for HTTP messages."""

     def __init__(self, orig):
         self._orig = orig

-    @abstractmethod
     def iter_body(self, chunk_size: int) -> Iterable[bytes]:
         """Return an iterator over the body."""
+        raise NotImplementedError

-    @abstractmethod
     def iter_lines(self, chunk_size: int) -> Iterable[bytes]:
         """Return an iterator over the body yielding (`line`, `line_feed`)."""
+        raise NotImplementedError

     @property
-    @abstractmethod
     def headers(self) -> str:
         """Return a `str` with the message's headers."""
+        raise NotImplementedError

-    @property
-    @abstractmethod
-    def encoding(self) -> Optional[str]:
-        """Return a `str` with the message's encoding, if known."""
+    @cached_property
+    def encoding(self) -> str:
+        ct, params = parse_content_type_header(self.content_type)
+        return params.get('charset', '')

     @property
     def content_type(self) -> str:
@@ -77,10 +76,6 @@ class HTTPResponse(HTTPMessage):
         )
         return '\r\n'.join(headers)

-    @property
-    def encoding(self):
-        return self._orig.encoding or UTF8
-

 class HTTPRequest(HTTPMessage):
     """A :class:`requests.models.Request` wrapper."""
@@ -114,10 +109,6 @@ class HTTPRequest(HTTPMessage):
         headers = '\r\n'.join(headers).strip()
         return headers

-    @property
-    def encoding(self):
-        return UTF8
-
     @property
     def body(self):
         body = self._orig.body
@@ -1,7 +1,7 @@
 import sys
 from typing import TYPE_CHECKING, Optional

-from ...constants import UTF8
+from ...encoding import UTF8
 from ...plugins import FormatterPlugin

 if TYPE_CHECKING:
@@ -33,7 +33,6 @@ class Formatting:
         :param kwargs: additional keyword arguments for processors

         """
-        self.options = kwargs['format_options']
         available_plugins = plugin_manager.get_formatters_grouped()
         self.enabled_plugins = []
         for group in groups:
@@ -1,14 +1,12 @@
 from abc import ABCMeta, abstractmethod
 from itertools import chain
-from typing import Any, Callable, Dict, Iterable, Tuple, Union
+from typing import Callable, Iterable, Union

-from .. import codec
-from ..cli.constants import EMPTY_FORMAT_OPTION
-from ..context import Environment
-from ..constants import UTF8
-from ..models import HTTPMessage, HTTPResponse
 from .processing import Conversion, Formatting
-from .utils import parse_header_content_type
+from ..context import Environment
+from ..encoding import smart_decode, smart_encode, UTF8
+from ..models import HTTPMessage


 BINARY_SUPPRESSED_NOTICE = (
     b'\n'
@@ -100,8 +98,16 @@ class EncodedStream(BaseStream):
     """
     CHUNK_SIZE = 1

-    def __init__(self, env=Environment(), **kwargs):
+    def __init__(
+        self,
+        env=Environment(),
+        mime_overwrite: str = None,
+        encoding_overwrite: str = None,
+        **kwargs
+    ):
         super().__init__(**kwargs)
+        self.mime = mime_overwrite or self.msg.content_type
+        self.encoding = encoding_overwrite or self.msg.encoding
         if env.stdout_isatty:
             # Use the encoding supported by the terminal.
             output_encoding = env.stdout_encoding
@@ -115,8 +121,8 @@ class EncodedStream(BaseStream):
         for line, lf in self.msg.iter_lines(self.CHUNK_SIZE):
             if b'\0' in line:
                 raise BinarySuppressedError()
-            line = codec.decode(line, self.msg.encoding)
-            yield codec.encode(line, self.output_encoding) + lf
+            line = smart_decode(line, self.encoding)
+            yield smart_encode(line, self.output_encoding) + lf


 class PrettyStream(EncodedStream):
@@ -138,23 +144,6 @@ class PrettyStream(EncodedStream):
         super().__init__(**kwargs)
         self.formatting = formatting
         self.conversion = conversion
-        self.mime, mime_options = self._get_mime_and_options()
-        self.encoding = mime_options.get('charset') or ''
-
-    def _get_mime_and_options(self) -> Tuple[str, Dict[str, Any]]:
-        # Defaults from the `Content-Type` header.
-        mime, options = parse_header_content_type(self.msg.content_type)
-
-        if not isinstance(self.msg, HTTPResponse):
-            return mime, options
-
-        # Override from the `--response-as` option.
-        forced_content_type = self.formatting.options['response']['as']
-        if forced_content_type == EMPTY_FORMAT_OPTION:
-            return mime, options
-
-        forced_mime, forced_options = parse_header_content_type(forced_content_type)
-        return (forced_mime or mime, forced_options or options)

     def get_headers(self) -> bytes:
         return self.formatting.format_headers(
@@ -185,9 +174,9 @@ class PrettyStream(EncodedStream):
         if not isinstance(chunk, str):
             # Text when a converter has been used,
             # otherwise it will always be bytes.
-            chunk = codec.decode(chunk, self.encoding)
+            chunk = smart_decode(chunk, self.encoding)
         chunk = self.formatting.format_body(content=chunk, mime=self.mime)
-        return codec.encode(chunk, self.output_encoding)
+        return smart_encode(chunk, self.output_encoding)


 class BufferedPrettyStream(PrettyStream):
@@ -35,57 +35,3 @@ def parse_prefixed_json(data: str) -> Tuple[str, str]:
     data_prefix = matches[0] if matches else ''
     body = data[len(data_prefix):]
     return data_prefix, body
-
-
-def parse_header_content_type(line):
-    """Parse a Content-Type like header.
-    Return the main Content-Type and a dictionary of options.
-    >>> parse_header_content_type('application/xml; charset=utf-8')
-    ('application/xml', {'charset': 'utf-8'})
-    >>> parse_header_content_type('application/xml; charset = utf-8')
-    ('application/xml', {'charset': 'utf-8'})
-    >>> parse_header_content_type('application/html+xml;ChArSeT="UTF-8"')
-    ('application/html+xml', {'charset': 'UTF-8'})
-    >>> parse_header_content_type('application/xml')
-    ('application/xml', {})
-    >>> parse_header_content_type(';charset=utf-8')
-    ('', {'charset': 'utf-8'})
-    >>> parse_header_content_type('charset=utf-8')
-    ('', {'charset': 'utf-8'})
-    >>> parse_header_content_type('multipart/mixed; boundary="gc0pJq0M:08jU534c0p"')
-    ('multipart/mixed', {'boundary': 'gc0pJq0M:08jU534c0p'})
-    >>> parse_header_content_type('Message/Partial; number=3; total=3; id="oc=jpbe0M2Yt4s@foo.com"')
-    ('Message/Partial', {'number': '3', 'total': '3', 'id': 'oc=jpbe0M2Yt4s@foo.com'})
-    """
-    # Source: https://github.com/python/cpython/blob/bb3e0c2/Lib/cgi.py#L230
-
-    def _parseparam(s: str):
-        # Source: https://github.com/python/cpython/blob/bb3e0c2/Lib/cgi.py#L218
-        while s[:1] == ';':
-            s = s[1:]
-            end = s.find(';')
-            while end > 0 and (s.count('"', 0, end) - s.count('\\"', 0, end)) % 2:
-                end = s.find(';', end + 1)
-            if end < 0:
-                end = len(s)
-            f = s[:end]
-            yield f.strip()
-            s = s[end:]
-
-    # Special case: 'key=value' only (without starting with ';').
-    if ';' not in line and '=' in line:
-        line = ';' + line
-
-    parts = _parseparam(';' + line)
-    key = parts.__next__()
-    pdict = {}
-    for p in parts:
-        i = p.find('=')
-        if i >= 0:
-            name = p[:i].strip().lower()
-            value = p[i + 1:].strip()
-            if len(value) >= 2 and value[0] == value[-1] == '"':
-                value = value[1:-1]
-                value = value.replace('\\\\', '\\').replace('\\"', '"')
-            pdict[name] = value
-    return key, pdict
@@ -5,7 +5,7 @@ from typing import IO, TextIO, Tuple, Type, Union
 import requests

 from ..context import Environment
-from ..models import HTTPRequest, HTTPResponse
+from ..models import HTTPRequest, HTTPResponse, HTTPMessage
 from .processing import Conversion, Formatting
 from .streams import (
     BaseStream, BufferedPrettyStream, EncodedStream, PrettyStream, RawStream,
@@ -97,16 +97,17 @@ def build_output_stream_for_message(
     with_headers: bool,
     with_body: bool,
 ):
-    stream_class, stream_kwargs = get_stream_type_and_kwargs(
-        env=env,
-        args=args,
-    )
-    message_class = {
+    message_type = {
         requests.PreparedRequest: HTTPRequest,
         requests.Response: HTTPResponse,
     }[type(requests_message)]
+    stream_class, stream_kwargs = get_stream_type_and_kwargs(
+        env=env,
+        args=args,
+        message_type=message_type,
+    )
     yield from stream_class(
-        msg=message_class(requests_message),
+        msg=message_type(requests_message),
         with_headers=with_headers,
         with_body=with_body,
         **stream_kwargs,
@@ -120,7 +121,8 @@ def build_output_stream_for_message(

 def get_stream_type_and_kwargs(
     env: Environment,
-    args: argparse.Namespace
+    args: argparse.Namespace,
+    message_type: Type[HTTPMessage],
 ) -> Tuple[Type['BaseStream'], dict]:
     """Pick the right stream type and kwargs for it based on `env` and `args`.

@@ -134,10 +136,19 @@ def get_stream_type_and_kwargs(
                 else RawStream.CHUNK_SIZE
             )
         }
-    elif args.prettify:
-        stream_class = PrettyStream if args.stream else BufferedPrettyStream
+    else:
+        stream_class = EncodedStream
         stream_kwargs = {
             'env': env,
+        }
+        if message_type is HTTPResponse:
+            stream_kwargs.update({
+                'mime_overwrite': args.response_mime,
+                'encoding_overwrite': args.response_charset,
+            })
+        if args.prettify:
+            stream_class = PrettyStream if args.stream else BufferedPrettyStream
+            stream_kwargs.update({
                 'conversion': Conversion(),
                 'formatting': Formatting(
                     env=env,
@@ -146,11 +157,6 @@ def get_stream_type_and_kwargs(
                     explicit_json=args.json,
                     format_options=args.format_options,
                 )
-        }
-    else:
-        stream_class = EncodedStream
-        stream_kwargs = {
-            'env': env
-        }
+            })

     return stream_class, stream_kwargs
@@ -189,3 +189,21 @@ def _max_age_to_expires(cookies, now):
         max_age = cookie.get('max-age')
         if max_age and max_age.isdigit():
             cookie['expires'] = now + float(max_age)
+
+
+def parse_content_type_header(header):
+    """Borrowed from requests."""
+    tokens = header.split(';')
+    content_type, params = tokens[0].strip(), tokens[1:]
+    params_dict = {}
+    items_to_strip = "\"' "
+    for param in params:
+        param = param.strip()
+        if param:
+            key, value = param, True
+            index_of_equals = param.find("=")
+            if index_of_equals != -1:
+                key = param[:index_of_equals].strip(items_to_strip)
+                value = param[index_of_equals + 1:].strip(items_to_strip)
+            params_dict[key.lower()] = value
+    return content_type, params_dict
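For reference, a quick sketch (not part of the diff) of what the `parse_content_type_header` helper added above returns — parameter names are lower-cased, values keep their case, and surrounding quotes/whitespace are stripped:

```python
from httpie.utils import parse_content_type_header

parse_content_type_header('text/plain')
# -> ('text/plain', {})
parse_content_type_header('text/html; charset="UTF-8"')
# -> ('text/html', {'charset': 'UTF-8'})
parse_content_type_header('application/xml; CHARSET=utf-8')
# -> ('application/xml', {'charset': 'utf-8'})
```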
tests/fixtures/__init__.py
@@ -1,7 +1,8 @@
 """Test data"""
 from pathlib import Path

-from httpie.constants import UTF8
+from httpie.encoding import UTF8
+from httpie.output.formatters.xml import pretty_xml, parse_xml


 def patharg(path):
@@ -35,3 +36,5 @@ FILE_CONTENT = FILE_PATH.read_text(encoding=UTF8).strip()
 JSON_FILE_CONTENT = JSON_FILE_PATH.read_text(encoding=UTF8)
 BIN_FILE_CONTENT = BIN_FILE_PATH.read_bytes()
 UNICODE = FILE_CONTENT
+XML_DATA_RAW = '<?xml version="1.0" encoding="utf-8"?><root><e>text</e></root>'
+XML_DATA_FORMATTED = pretty_xml(parse_xml(XML_DATA_RAW))
@@ -119,11 +119,11 @@ def test_ignore_netrc_with_auth_type_resulting_in_missing_auth(httpbin):


 @pytest.mark.parametrize(
-    argnames=['auth_type', 'endpoint'],
-    argvalues=[
+    'auth_type, endpoint',
+    [
         ('basic', '/basic-auth/httpie/password'),
         ('digest', '/digest-auth/auth/httpie/password'),
-    ],
+    ]
 )
 def test_auth_plugin_netrc_parse(auth_type, endpoint, httpbin):
     # Test
@@ -51,7 +51,7 @@ class TestItemParsing:
         }
         assert 'bar@baz' in items.files

-    @pytest.mark.parametrize(('string', 'key', 'sep', 'value'), [
+    @pytest.mark.parametrize('string, key, sep, value', [
         ('path=c:\\windows', 'path', '=', 'c:\\windows'),
         ('path=c:\\windows\\', 'path', '=', 'c:\\windows\\'),
         ('path\\==c:\\windows', 'path=', '=', 'c:\\windows'),
@@ -4,7 +4,7 @@ import pytest
 from _pytest.monkeypatch import MonkeyPatch

 from httpie.compat import is_windows
-from httpie.constants import UTF8
+from httpie.encoding import UTF8
 from httpie.config import (
     Config, DEFAULT_CONFIG_DIRNAME, DEFAULT_RELATIVE_LEGACY_CONFIG_DIR,
     DEFAULT_RELATIVE_XDG_CONFIG_HOME, DEFAULT_WINDOWS_CONFIG_DIR,
tests/test_encoding.py (new file, 222 lines)
@@ -0,0 +1,222 @@
+"""
+Various encoding handling related tests.
+
+"""
+import pytest
+import responses
+from charset_normalizer.constant import TOO_SMALL_SEQUENCE
+
+from httpie.cli.constants import PRETTY_MAP
+from httpie.encoding import UTF8
+
+from .utils import http, HTTP_OK, DUMMY_URL, MockEnvironment
+from .fixtures import UNICODE
+
+
+CHARSET_TEXT_PAIRS = [
+    ('big5', '卷首卷首卷首卷首卷卷首卷首卷首卷首卷首卷首卷首卷首卷首卷首卷首卷首卷首'),
+    ('windows-1250', 'Všichni lidé jsou si rovni. Všichni lidé jsou si rovni.'),
+    (UTF8, 'Všichni lidé jsou si rovni. Všichni lidé jsou si rovni.'),
+]
+
+
+def test_charset_text_pairs():
+    # Verify our test data is legit.
+    for charset, text in CHARSET_TEXT_PAIRS:
+        assert len(text) > TOO_SMALL_SEQUENCE
+        if charset != UTF8:
+            with pytest.raises(UnicodeDecodeError):
+                assert text != text.encode(charset).decode(UTF8)
+
+
+def test_unicode_headers(httpbin):
+    # httpbin doesn't interpret UFT-8 headers
+    r = http(httpbin.url + '/headers', f'Test:{UNICODE}')
+    assert HTTP_OK in r
+
+
+def test_unicode_headers_verbose(httpbin):
+    # httpbin doesn't interpret UTF-8 headers
+    r = http('--verbose', httpbin.url + '/headers', f'Test:{UNICODE}')
+    assert HTTP_OK in r
+    assert UNICODE in r
+
+
+def test_unicode_raw(httpbin):
+    r = http('--raw', f'test {UNICODE}', 'POST', httpbin.url + '/post')
+    assert HTTP_OK in r
+    assert r.json['data'] == f'test {UNICODE}'
+
+
+def test_unicode_raw_verbose(httpbin):
+    r = http('--verbose', '--raw', f'test {UNICODE}',
+             'POST', httpbin.url + '/post')
+    assert HTTP_OK in r
+    assert UNICODE in r
+
+
+def test_unicode_form_item(httpbin):
+    r = http('--form', 'POST', httpbin.url + '/post', f'test={UNICODE}')
+    assert HTTP_OK in r
+    assert r.json['form'] == {'test': UNICODE}
+
+
+def test_unicode_form_item_verbose(httpbin):
+    r = http('--verbose', '--form',
+             'POST', httpbin.url + '/post', f'test={UNICODE}')
+    assert HTTP_OK in r
+    assert UNICODE in r
+
+
+def test_unicode_json_item(httpbin):
+    r = http('--json', 'POST', httpbin.url + '/post', f'test={UNICODE}')
+    assert HTTP_OK in r
+    assert r.json['json'] == {'test': UNICODE}
+
+
+def test_unicode_json_item_verbose(httpbin):
+    r = http('--verbose', '--json',
+             'POST', httpbin.url + '/post', f'test={UNICODE}')
+    assert HTTP_OK in r
+    assert UNICODE in r
+
+
+def test_unicode_raw_json_item(httpbin):
+    r = http('--json', 'POST', httpbin.url + '/post',
+             f'test:={{ "{UNICODE}" : [ "{UNICODE}" ] }}')
+    assert HTTP_OK in r
+    assert r.json['json'] == {'test': {UNICODE: [UNICODE]}}
+
+
+def test_unicode_raw_json_item_verbose(httpbin):
+    r = http('--json', 'POST', httpbin.url + '/post',
+             f'test:={{ "{UNICODE}" : [ "{UNICODE}" ] }}')
+    assert HTTP_OK in r
+    assert r.json['json'] == {'test': {UNICODE: [UNICODE]}}
+
+
+def test_unicode_url_query_arg_item(httpbin):
+    r = http(httpbin.url + '/get', f'test=={UNICODE}')
+    assert HTTP_OK in r
+    assert r.json['args'] == {'test': UNICODE}, r
+
+
+def test_unicode_url_query_arg_item_verbose(httpbin):
+    r = http('--verbose', httpbin.url + '/get', f'test=={UNICODE}')
+    assert HTTP_OK in r
+    assert UNICODE in r
+
+
+def test_unicode_url(httpbin):
+    r = http(f'{httpbin.url}/get?test={UNICODE}')
+    assert HTTP_OK in r
+    assert r.json['args'] == {'test': UNICODE}
+
+
+def test_unicode_url_verbose(httpbin):
+    r = http('--verbose', f'{httpbin.url}/get?test={UNICODE}')
+    assert HTTP_OK in r
+    assert r.json['args'] == {'test': UNICODE}
+
+
+def test_unicode_basic_auth(httpbin):
+    # it doesn't really authenticate us because httpbin
+    # doesn't interpret the UTF-8-encoded auth
+    http('--verbose', '--auth', f'test:{UNICODE}',
+         f'{httpbin.url}/basic-auth/test/{UNICODE}')
+
+
+def test_unicode_digest_auth(httpbin):
+    # it doesn't really authenticate us because httpbin
+    # doesn't interpret the UTF-8-encoded auth
+    http('--auth-type=digest',
+         '--auth', f'test:{UNICODE}',
+         f'{httpbin.url}/digest-auth/auth/test/{UNICODE}')
+
+
+@pytest.mark.parametrize('charset, text', CHARSET_TEXT_PAIRS)
+@responses.activate
+def test_terminal_output_response_charset_detection(text, charset):
+    responses.add(
+        method=responses.POST,
+        url=DUMMY_URL,
+        body=text.encode(charset),
+        content_type='text/plain',
+    )
+    r = http('--form', 'POST', DUMMY_URL)
+    assert text in r
+
+
+@pytest.mark.parametrize('charset, text', CHARSET_TEXT_PAIRS)
+@responses.activate
+def test_terminal_output_response_content_type_charset(charset, text):
+    responses.add(
+        method=responses.POST,
+        url=DUMMY_URL,
+        body=text.encode(charset),
+        content_type=f'text/plain; charset={charset}',
+    )
+    r = http('--form', 'POST', DUMMY_URL)
+    assert text in r
+
+
+@pytest.mark.parametrize('charset, text', CHARSET_TEXT_PAIRS)
+@pytest.mark.parametrize('pretty', PRETTY_MAP.keys())
+@responses.activate
+def test_terminal_output_response_content_type_charset_with_stream(charset, text, pretty):
+    responses.add(
+        method=responses.GET,
+        url=DUMMY_URL,
+        body=f'<?xml version="1.0"?>\n<c>{text}</c>'.encode(charset),
+        stream=True,
+        content_type=f'text/xml; charset={charset.upper()}',
+    )
+    r = http('--pretty', pretty, '--stream', DUMMY_URL)
+    assert text in r
+
+
+@pytest.mark.parametrize('charset, text', CHARSET_TEXT_PAIRS)
+@pytest.mark.parametrize('pretty', PRETTY_MAP.keys())
+@responses.activate
+def test_terminal_output_response_charset_override(charset, text, pretty):
+    responses.add(
+        responses.GET,
+        DUMMY_URL,
+        body=text.encode(charset),
+        content_type='text/plain; charset=utf-8',
+    )
+    args = ['--pretty', pretty, DUMMY_URL]
+    if charset != UTF8:
+        # Content-Type charset wrong -> garbled text expected.
+        r = http(*args)
+        assert text not in r
+    r = http('--response-charset', charset, *args)
+    assert text in r
+
+
+@pytest.mark.parametrize('charset, text', CHARSET_TEXT_PAIRS)
+def test_terminal_output_request_content_type_charset(charset, text):
+    r = http(
+        '--offline',
+        DUMMY_URL,
+        f'Content-Type: text/plain; charset={charset.upper()}',
+        env=MockEnvironment(
+            stdin=text.encode(charset),
+            stdin_isatty=False,
+        ),
+    )
+    assert text in r
+
+
+@pytest.mark.parametrize('charset, text', CHARSET_TEXT_PAIRS)
+def test_terminal_output_request_charset_detection(charset, text):
+    r = http(
+        '--offline',
+        DUMMY_URL,
+        'Content-Type: text/plain',
+        env=MockEnvironment(
+            stdin=text.encode(charset),
+            stdin_isatty=False,
+        ),
+    )
+    assert text in r
@@ -41,8 +41,19 @@ def test_max_headers_no_limit(httpbin_both):
     assert HTTP_OK in http('--max-headers=0', httpbin_both + '/get')


-def test_charset_argument_unknown_encoding(httpbin_both):
-    with raises(LookupError) as e:
-        http('--response-as', 'charset=foobar',
-             'GET', httpbin_both + '/get')
-    assert 'unknown encoding: foobar' in str(e.value)
+def test_response_charset_option_unknown_encoding(httpbin_both):
+    r = http(
+        '--response-charset=foobar',
+        httpbin_both + '/get',
+        tolerate_error_exit_status=True,
+    )
+    assert "'foobar' is not a supported encoding" in r.stderr
+
+
+def test_response_mime_option_invalid_mime_type(httpbin_both):
+    r = http(
+        '--response-mime=foobar',
+        httpbin_both + '/get',
+        tolerate_error_exit_status=True,
+    )
+    assert "'foobar' doesn’t look like a mime type" in r.stderr
@@ -9,7 +9,7 @@ import httpie.__main__
 from .fixtures import FILE_CONTENT, FILE_PATH
 from httpie.cli.exceptions import ParseError
 from httpie.context import Environment
-from httpie.constants import UTF8
+from httpie.encoding import UTF8
 from httpie.status import ExitStatus
 from .utils import HTTP_OK, MockEnvironment, StdinBytesIO, http
@@ -9,10 +9,28 @@ from httpie.output.formatters.colors import ColorFormatter
 from httpie.utils import JsonDictPreservingDuplicateKeys

 from .fixtures import JSON_WITH_DUPE_KEYS_FILE_PATH
-from .utils import MockEnvironment, http, URL_EXAMPLE
+from .utils import MockEnvironment, http, DUMMY_URL

-TEST_JSON_XXSI_PREFIXES = (r")]}',\n", ")]}',", 'while(1);', 'for(;;)', ')', ']', '}')
-TEST_JSON_VALUES = ({}, {'a': 0, 'b': 0}, [], ['a', 'b'], 'foo', True, False, None)  # FIX: missing int & float
+TEST_JSON_XXSI_PREFIXES = [
+    r")]}',\n",
+    ")]}',",
+    'while(1);',
+    'for(;;)',
+    ')',
+    ']',
+    '}',
+]
+TEST_JSON_VALUES = [
+    # FIXME: missing int & float
+    {},
+    {'a': 0, 'b': 0},
+    [],
+    ['a', 'b'],
+    'foo',
+    True,
+    False,
+    None,
+]
 TEST_PREFIX_TOKEN_COLOR = '\x1b[38;5;15m' if is_windows else '\x1b[04m\x1b[91m'

 JSON_WITH_DUPES_RAW = '{"key": 15, "key": 15, "key": 3, "key": 7}'
|
|||||||
def test_json_formatter_with_body_preceded_by_non_json_data(data_prefix, json_data, pretty):
|
def test_json_formatter_with_body_preceded_by_non_json_data(data_prefix, json_data, pretty):
|
||||||
"""Test JSON bodies preceded by non-JSON data."""
|
"""Test JSON bodies preceded by non-JSON data."""
|
||||||
body = data_prefix + json.dumps(json_data)
|
body = data_prefix + json.dumps(json_data)
|
||||||
content_type = 'application/json'
|
content_type = 'application/json;charset=utf8'
|
||||||
responses.add(responses.GET, URL_EXAMPLE, body=body,
|
responses.add(
|
||||||
content_type=content_type)
|
responses.GET,
|
||||||
|
DUMMY_URL,
|
||||||
|
body=body,
|
||||||
|
content_type=content_type,
|
||||||
|
)
|
||||||
|
|
||||||
colored_output = pretty in ('all', 'colors')
|
colored_output = pretty in {'all', 'colors'}
|
||||||
env = MockEnvironment(colors=256) if colored_output else None
|
env = MockEnvironment(colors=256) if colored_output else None
|
||||||
r = http('--pretty=' + pretty, URL_EXAMPLE, env=env)
|
r = http('--pretty', pretty, DUMMY_URL, env=env)
|
||||||
|
|
||||||
indent = None if pretty in ('none', 'colors') else 4
|
indent = None if pretty in {'none', 'colors'} else 4
|
||||||
expected_body = data_prefix + json.dumps(json_data, indent=indent)
|
expected_body = data_prefix + json.dumps(json_data, indent=indent)
|
||||||
if colored_output:
|
if colored_output:
|
||||||
fmt = ColorFormatter(env, format_options={'json': {'format': True, 'indent': 4}})
|
fmt = ColorFormatter(env, format_options={'json': {'format': True, 'indent': 4}})
|
||||||
@@ -59,9 +81,13 @@ def test_json_formatter_with_body_preceded_by_non_json_data(data_prefix, json_da
 @responses.activate
 def test_duplicate_keys_support_from_response():
     """JSON with duplicate keys should be handled correctly."""
-    responses.add(responses.GET, URL_EXAMPLE, body=JSON_WITH_DUPES_RAW,
-                  content_type='application/json')
-    args = ('--pretty', 'format', URL_EXAMPLE)
+    responses.add(
+        responses.GET,
+        DUMMY_URL,
+        body=JSON_WITH_DUPES_RAW,
+        content_type='application/json',
+    )
+    args = ('--pretty', 'format', DUMMY_URL)

     # Check implicit --sorted
     if JsonDictPreservingDuplicateKeys.SUPPORTS_SORTING:
@@ -75,8 +101,12 @@ def test_duplicate_keys_support_from_response():

 def test_duplicate_keys_support_from_input_file():
     """JSON file with duplicate keys should be handled correctly."""
-    args = ('--verbose', '--offline', URL_EXAMPLE,
-            f'@{JSON_WITH_DUPE_KEYS_FILE_PATH}')
+    args = (
+        '--verbose',
+        '--offline',
+        DUMMY_URL,
+        f'@{JSON_WITH_DUPE_KEYS_FILE_PATH}',
+    )

     # Check implicit --sorted
     if JsonDictPreservingDuplicateKeys.SUPPORTS_SORTING:
@@ -9,16 +9,18 @@ from urllib.request import urlopen

 import pytest
 import requests
+import responses

 from httpie.cli.argtypes import (
     PARSED_DEFAULT_FORMAT_OPTIONS,
     parse_format_options,
 )
 from httpie.cli.definition import parser
-from httpie.constants import UTF8
+from httpie.encoding import UTF8
 from httpie.output.formatters.colors import get_lexer
 from httpie.status import ExitStatus
-from .utils import COLOR, CRLF, HTTP_OK, MockEnvironment, http
+from .fixtures import XML_DATA_RAW, XML_DATA_FORMATTED
+from .utils import COLOR, CRLF, HTTP_OK, MockEnvironment, http, DUMMY_URL


 @pytest.mark.parametrize('stdout_isatty', [True, False])
@@ -168,8 +170,8 @@ class TestVerboseFlag:
 class TestColors:

     @pytest.mark.parametrize(
-        argnames=['mime', 'explicit_json', 'body', 'expected_lexer_name'],
-        argvalues=[
+        'mime, explicit_json, body, expected_lexer_name',
+        [
             ('application/json', False, None, 'JSON'),
             ('application/json+foo', False, None, 'JSON'),
             ('application/foo+json', False, None, 'JSON'),
@@ -302,8 +304,8 @@ class TestFormatOptions:
         assert f'ZZZ: foo{CRLF}XXX: foo' in r_unsorted

     @pytest.mark.parametrize(
-        argnames=['options', 'expected_json'],
-        argvalues=[
+        'options, expected_json',
+        [
             # @formatter:off
             (
                 'json.sort_keys:true,json.indent:4',
@@ -329,8 +331,8 @@ class TestFormatOptions:
         assert expected_json in r

     @pytest.mark.parametrize(
-        argnames=['defaults', 'options_string', 'expected'],
-        argvalues=[
+        'defaults, options_string, expected',
+        [
             # @formatter:off
             ({'foo': {'bar': 1}}, 'foo.bar:2', {'foo': {'bar': 2}}),
             ({'foo': {'bar': True}}, 'foo.bar:false', {'foo': {'bar': False}}),
@@ -343,8 +345,8 @@ class TestFormatOptions:
         assert expected == actual

     @pytest.mark.parametrize(
-        argnames=['options_string', 'expected_error'],
-        argvalues=[
+        'options_string, expected_error',
+        [
             ('foo:2', 'invalid option'),
             ('foo.baz:2', 'invalid key'),
             ('foo.bar:false', 'expected int got bool'),
@@ -360,8 +362,8 @@ class TestFormatOptions:
         parse_format_options(s=options_string, defaults=defaults)

     @pytest.mark.parametrize(
-        argnames=['args', 'expected_format_options'],
-        argvalues=[
+        'args, expected_format_options',
+        [
             (
                 [
                     '--format-options',
@@ -377,9 +379,6 @@ class TestFormatOptions:
                         'indent': 10,
                         'format': True
                     },
-                    'response': {
-                        'as': "''",
-                    },
                     'xml': {
                         'format': True,
                         'indent': 2,
@@ -399,9 +398,6 @@ class TestFormatOptions:
                         'indent': 4,
                         'format': True
                     },
-                    'response': {
-                        'as': "''",
-                    },
                     'xml': {
                         'format': True,
                         'indent': 2,
@@ -423,9 +419,6 @@ class TestFormatOptions:
                         'indent': 4,
                         'format': True
                     },
-                    'response': {
-                        'as': "''",
-                    },
                     'xml': {
                         'format': True,
                         'indent': 2,
@@ -444,7 +437,6 @@ class TestFormatOptions:
             (
                 [
                     '--format-options=json.indent:2',
-                    '--format-options=response.as:application/xml; charset=utf-8',
                     '--format-options=xml.format:false',
                     '--format-options=xml.indent:4',
                     '--unsorted',
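Note on the removal above: per the commit message, the `response.as` format option is dropped and its job is split across the dedicated `--response-mime` and `--response-charset` flags. A minimal sketch of how the removed `response.as:application/xml; charset=utf-8` case could be expressed with the new flags, using the same test helpers that appear elsewhere in this diff (`responses`, `http`, `DUMMY_URL`, the XML fixtures); the combined behaviour asserted here is an assumption, not something verified by this commit:

```python
import responses

from .fixtures import XML_DATA_RAW, XML_DATA_FORMATTED
from .utils import http, DUMMY_URL


@responses.activate
def test_response_mime_and_charset_override_sketch():
    # XML served with a misleading Content-Type and no usable charset.
    responses.add(
        method=responses.GET,
        url=DUMMY_URL,
        body=XML_DATA_RAW.encode('latin-1'),
        content_type='text/plain',
    )
    # Hypothetical combination of the two new flags (names from the commit message).
    r = http(
        '--response-mime=application/xml',
        '--response-charset=latin-1',
        DUMMY_URL,
    )
    assert XML_DATA_FORMATTED in r  # assumed: formatted as XML, decoded as latin-1
```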
@@ -459,9 +451,6 @@ class TestFormatOptions:
                         'indent': 2,
                         'format': True
                     },
-                    'response': {
-                        'as': 'application/xml; charset=utf-8',
-                    },
                     'xml': {
                         'format': False,
                         'indent': 4,
@@ -483,9 +472,6 @@ class TestFormatOptions:
                         'indent': 2,
                         'format': True
                     },
-                    'response': {
-                        'as': "''",
-                    },
                     'xml': {
                         'format': True,
                         'indent': 2,
@@ -508,9 +494,6 @@ class TestFormatOptions:
                         'indent': 2,
                         'format': True
                     },
-                    'response': {
-                        'as': "''",
-                    },
                     'xml': {
                         'format': True,
                         'indent': 2,
@@ -525,3 +508,35 @@ class TestFormatOptions:
             env=MockEnvironment(),
         )
         assert parsed_args.format_options == expected_format_options
+
+
+@responses.activate
+def test_response_mime_overwrite():
+    responses.add(
+        method=responses.GET,
+        url=DUMMY_URL,
+        body=XML_DATA_RAW,
+        content_type='text/plain',
+    )
+    r = http(
+        '--offline',
+        '--raw', XML_DATA_RAW,
+        '--response-mime=application/xml', DUMMY_URL
+    )
+    assert XML_DATA_RAW in r  # not affecting request bodies
+
+    r = http('--response-mime=application/xml', DUMMY_URL)
+    assert XML_DATA_FORMATTED in r
+
+
+@responses.activate
+def test_response_mime_overwrite_incorrect():
+    responses.add(
+        method=responses.GET,
+        url=DUMMY_URL,
+        body=XML_DATA_RAW,
+        content_type='text/xml',
+    )
+    # The provided Content-Type is simply ignored, and so no formatting is done.
+    r = http('--response-mime=incorrect/type', DUMMY_URL)
+    assert XML_DATA_RAW in r
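As a companion to the `--response-mime` tests added above, the commit message also introduces `--response-charset`. A sketch of an analogous test (same module, same helpers); the flag name comes from the commit message, while the exact decoding behaviour asserted is an assumption:

```python
@responses.activate
def test_response_charset_overwrite_sketch():
    # The server declares UTF-8, but the body is actually windows-1250.
    responses.add(
        method=responses.GET,
        url=DUMMY_URL,
        body='Všichni lidé jsou si rovni.'.encode('windows-1250'),
        content_type='text/plain; charset=utf-8',
    )
    # Hypothetical: force the charset used for terminal display.
    r = http('--response-charset=windows-1250', DUMMY_URL)
    assert 'Všichni lidé jsou si rovni.' in r
```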
@@ -7,7 +7,7 @@ from unittest import mock
 import pytest
 
 from .fixtures import FILE_PATH_ARG, UNICODE
-from httpie.constants import UTF8
+from httpie.encoding import UTF8
 from httpie.plugins import AuthPlugin
 from httpie.plugins.builtin import HTTPBasicAuth
 from httpie.plugins.registry import plugin_manager
@@ -239,8 +239,8 @@ class TestSession(SessionTestBase):
         os.chdir(cwd)
 
     @pytest.mark.parametrize(
-        argnames=['auth_require_param', 'auth_parse_param'],
-        argvalues=[
+        'auth_require_param, auth_parse_param',
+        [
            (False, False),
            (False, True),
            (True, False)
@@ -337,8 +337,8 @@ class TestSession(SessionTestBase):
 class TestExpiredCookies(CookieTestBase):
 
     @pytest.mark.parametrize(
-        argnames=['initial_cookie', 'expired_cookie'],
-        argvalues=[
+        'initial_cookie, expired_cookie',
+        [
             ({'id': {'value': 123}}, 'id'),
             ({'id': {'value': 123}}, 'token')
         ]
@@ -369,8 +369,8 @@ class TestExpiredCookies(CookieTestBase):
         assert get_expired_cookies(cookies, now=None) == expected_expired
 
     @pytest.mark.parametrize(
-        argnames=['cookies', 'now', 'expected_expired'],
-        argvalues=[
+        'cookies, now, expected_expired',
+        [
             (
                 'hello=world; Path=/; Expires=Thu, 01-Jan-1970 00:00:00 GMT; HttpOnly',
                 None,
@@ -413,8 +413,8 @@ class TestExpiredCookies(CookieTestBase):
 class TestCookieStorage(CookieTestBase):
 
     @pytest.mark.parametrize(
-        argnames=['new_cookies', 'new_cookies_dict', 'expected'],
-        argvalues=[(
+        'new_cookies, new_cookies_dict, expected',
+        [(
             'new=bar',
             {'new': 'bar'},
             'cookie1=foo; cookie2=foo; new=bar'
@@ -457,8 +457,8 @@ class TestCookieStorage(CookieTestBase):
         assert 'Cookie' not in updated_session['headers']
 
     @pytest.mark.parametrize(
-        argnames=['cli_cookie', 'set_cookie', 'expected'],
-        argvalues=[(
+        'cli_cookie, set_cookie, expected',
+        [(
             '',
             '/cookies/set/cookie1/bar',
             'bar'
@@ -9,7 +9,7 @@ from httpie.output.streams import BINARY_SUPPRESSED_NOTICE
 from httpie.plugins import ConverterPlugin
 from httpie.plugins.registry import plugin_manager
 
-from .utils import StdinBytesIO, http, MockEnvironment, URL_EXAMPLE
+from .utils import StdinBytesIO, http, MockEnvironment, DUMMY_URL
 from .fixtures import BIN_FILE_CONTENT, BIN_FILE_PATH
 
 PRETTY_OPTIONS = list(PRETTY_MAP.keys())
@@ -65,10 +65,10 @@ def test_pretty_options_with_and_without_stream_with_converter(pretty, stream):
     assert 'SortJSONConverterPlugin' in str(plugin_manager)
 
     body = b'\x00{"foo":42,\n"bar":"baz"}'
-    responses.add(responses.GET, URL_EXAMPLE, body=body,
+    responses.add(responses.GET, DUMMY_URL, body=body,
                   stream=True, content_type='json/bytes')
 
-    args = ['--pretty=' + pretty, 'GET', URL_EXAMPLE]
+    args = ['--pretty=' + pretty, 'GET', DUMMY_URL]
     if stream:
         args.insert(0, '--stream')
     r = http(*args)
@@ -1,211 +0,0 @@
-"""
-Various unicode handling related tests.
-
-"""
-import pytest
-import responses
-
-from httpie.cli.constants import PRETTY_MAP
-from httpie.constants import UTF8
-
-from .utils import http, HTTP_OK, URL_EXAMPLE
-from .fixtures import UNICODE
-
-ENCODINGS = [UTF8, 'windows-1250']
-
-
-def test_unicode_headers(httpbin):
-    # httpbin doesn't interpret UFT-8 headers
-    r = http(httpbin.url + '/headers', f'Test:{UNICODE}')
-    assert HTTP_OK in r
-
-
-def test_unicode_headers_verbose(httpbin):
-    # httpbin doesn't interpret UTF-8 headers
-    r = http('--verbose', httpbin.url + '/headers', f'Test:{UNICODE}')
-    assert HTTP_OK in r
-    assert UNICODE in r
-
-
-def test_unicode_raw(httpbin):
-    r = http('--raw', f'test {UNICODE}', 'POST', httpbin.url + '/post')
-    assert HTTP_OK in r
-    assert r.json['data'] == f'test {UNICODE}'
-
-
-def test_unicode_raw_verbose(httpbin):
-    r = http('--verbose', '--raw', f'test {UNICODE}',
-             'POST', httpbin.url + '/post')
-    assert HTTP_OK in r
-    assert UNICODE in r
-
-
-def test_unicode_form_item(httpbin):
-    r = http('--form', 'POST', httpbin.url + '/post', f'test={UNICODE}')
-    assert HTTP_OK in r
-    assert r.json['form'] == {'test': UNICODE}
-
-
-def test_unicode_form_item_verbose(httpbin):
-    r = http('--verbose', '--form',
-             'POST', httpbin.url + '/post', f'test={UNICODE}')
-    assert HTTP_OK in r
-    assert UNICODE in r
-
-
-def test_unicode_json_item(httpbin):
-    r = http('--json', 'POST', httpbin.url + '/post', f'test={UNICODE}')
-    assert HTTP_OK in r
-    assert r.json['json'] == {'test': UNICODE}
-
-
-def test_unicode_json_item_verbose(httpbin):
-    r = http('--verbose', '--json',
-             'POST', httpbin.url + '/post', f'test={UNICODE}')
-    assert HTTP_OK in r
-    assert UNICODE in r
-
-
-def test_unicode_raw_json_item(httpbin):
-    r = http('--json', 'POST', httpbin.url + '/post',
-             f'test:={{ "{UNICODE}" : [ "{UNICODE}" ] }}')
-    assert HTTP_OK in r
-    assert r.json['json'] == {'test': {UNICODE: [UNICODE]}}
-
-
-def test_unicode_raw_json_item_verbose(httpbin):
-    r = http('--json', 'POST', httpbin.url + '/post',
-             f'test:={{ "{UNICODE}" : [ "{UNICODE}" ] }}')
-    assert HTTP_OK in r
-    assert r.json['json'] == {'test': {UNICODE: [UNICODE]}}
-
-
-def test_unicode_url_query_arg_item(httpbin):
-    r = http(httpbin.url + '/get', f'test=={UNICODE}')
-    assert HTTP_OK in r
-    assert r.json['args'] == {'test': UNICODE}, r
-
-
-def test_unicode_url_query_arg_item_verbose(httpbin):
-    r = http('--verbose', httpbin.url + '/get', f'test=={UNICODE}')
-    assert HTTP_OK in r
-    assert UNICODE in r
-
-
-def test_unicode_url(httpbin):
-    r = http(f'{httpbin.url}/get?test={UNICODE}')
-    assert HTTP_OK in r
-    assert r.json['args'] == {'test': UNICODE}
-
-
-def test_unicode_url_verbose(httpbin):
-    r = http('--verbose', f'{httpbin.url}/get?test={UNICODE}')
-    assert HTTP_OK in r
-    assert r.json['args'] == {'test': UNICODE}
-
-
-def test_unicode_basic_auth(httpbin):
-    # it doesn't really authenticate us because httpbin
-    # doesn't interpret the UTF-8-encoded auth
-    http('--verbose', '--auth', f'test:{UNICODE}',
-         f'{httpbin.url}/basic-auth/test/{UNICODE}')
-
-
-def test_unicode_digest_auth(httpbin):
-    # it doesn't really authenticate us because httpbin
-    # doesn't interpret the UTF-8-encoded auth
-    http('--auth-type=digest',
-         '--auth', f'test:{UNICODE}',
-         f'{httpbin.url}/digest-auth/auth/test/{UNICODE}')
-
-
-@pytest.mark.parametrize('encoding', ENCODINGS)
-@responses.activate
-def test_GET_encoding_detection_from_content_type_header(encoding):
-    responses.add(responses.GET,
-                  URL_EXAMPLE,
-                  body='<?xml version="1.0"?>\n<c>Financiën</c>'.encode(encoding),
-                  content_type=f'text/xml; charset={encoding.upper()}')
-    r = http('GET', URL_EXAMPLE)
-    assert 'Financiën' in r
-
-
-@pytest.mark.parametrize('encoding', ENCODINGS)
-@responses.activate
-def test_GET_encoding_detection_from_content(encoding):
-    body = f'<?xml version="1.0" encoding="{encoding.upper()}"?>\n<c>Financiën</c>'
-    responses.add(responses.GET,
-                  URL_EXAMPLE,
-                  body=body.encode(encoding),
-                  content_type='text/xml')
-    r = http('GET', URL_EXAMPLE)
-    assert 'Financiën' in r
-
-
-@responses.activate
-def test_GET_encoding_provided_by_format_options():
-    responses.add(responses.GET,
-                  URL_EXAMPLE,
-                  body='▒▒▒'.encode('johab'),
-                  content_type='text/plain')
-    r = http('--format-options', 'response.as:text/plain; charset=johab',
-             'GET', URL_EXAMPLE)
-    assert '▒▒▒' in r
-
-
-@responses.activate
-def test_GET_encoding_provided_by_shortcut_option():
-    responses.add(responses.GET,
-                  URL_EXAMPLE,
-                  body='▒▒▒'.encode('johab'),
-                  content_type='text/plain')
-    r = http('--response-as', 'text/plain; charset=johab',
-             'GET', URL_EXAMPLE)
-    assert '▒▒▒' in r
-
-
-@pytest.mark.parametrize('encoding', ENCODINGS)
-@responses.activate
-def test_GET_encoding_provided_by_empty_shortcut_option_should_use_content_detection(encoding):
-    body = f'<?xml version="1.0" encoding="{encoding.upper()}"?>\n<c>Financiën</c>'
-    responses.add(responses.GET,
-                  URL_EXAMPLE,
-                  body=body.encode(encoding),
-                  content_type='text/xml')
-    r = http('--response-as', '', 'GET', URL_EXAMPLE)
-    assert 'Financiën' in r
-
-
-@pytest.mark.parametrize('encoding', ENCODINGS)
-@responses.activate
-def test_POST_encoding_detection_from_content_type_header(encoding):
-    responses.add(responses.POST,
-                  URL_EXAMPLE,
-                  body='Všichni lidé jsou si rovni.'.encode(encoding),
-                  content_type=f'text/plain; charset={encoding.upper()}')
-    r = http('--form', 'POST', URL_EXAMPLE)
-    assert 'Všichni lidé jsou si rovni.' in r
-
-
-@pytest.mark.parametrize('encoding', ENCODINGS)
-@responses.activate
-def test_POST_encoding_detection_from_content(encoding):
-    responses.add(responses.POST,
-                  URL_EXAMPLE,
-                  body='Všichni lidé jsou si rovni.'.encode(encoding),
-                  content_type='text/plain')
-    r = http('--form', 'POST', URL_EXAMPLE)
-    assert 'Všichni lidé jsou si rovni.' in r
-
-
-@pytest.mark.parametrize('encoding', ENCODINGS)
-@pytest.mark.parametrize('pretty', PRETTY_MAP.keys())
-@responses.activate
-def test_stream_encoding_detection_from_content_type_header(encoding, pretty):
-    responses.add(responses.GET,
-                  URL_EXAMPLE,
-                  body='<?xml version="1.0"?>\n<c>Financiën</c>'.encode(encoding),
-                  stream=True,
-                  content_type=f'text/xml; charset={encoding.upper()}')
-    r = http('--pretty=' + pretty, '--stream', 'GET', URL_EXAMPLE)
-    assert 'Financiën' in r
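According to the commit message, the deleted `test_unicode.py` above is reorganized into `test_encoding.py`, and charset detection for bodies without a usable `Content-Type` charset relies on the `charset_normalizer` requirement. A rough sketch of content-based detection with that library; the helper name echoes the `detect_encoding()` mentioned in the commit message, but its signature and fallback policy here are assumptions, not HTTPie's actual implementation:

```python
from charset_normalizer import from_bytes  # requirement referenced in the commit message

UTF8 = 'utf-8'


def detect_encoding_sketch(content: bytes, fallback: str = UTF8) -> str:
    """Guess the charset of raw bytes, falling back to UTF-8 when unsure."""
    match = from_bytes(content).best()
    return match.encoding if match is not None else fallback


# Example: a windows-1250 body with no declared charset.
body = 'Všichni lidé jsou si rovni.'.encode('windows-1250')
print(detect_encoding_sketch(body))  # e.g. 'cp1250' (normalized codec name may vary)
```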
@@ -3,14 +3,11 @@ import sys
 import pytest
 import responses
 
-from httpie.constants import UTF8
+from httpie.encoding import UTF8
 from httpie.output.formatters.xml import parse_xml, pretty_xml
 
-from .fixtures import XML_FILES_PATH, XML_FILES_VALID, XML_FILES_INVALID
-from .utils import http, URL_EXAMPLE
+from .fixtures import XML_FILES_PATH, XML_FILES_VALID, XML_FILES_INVALID, XML_DATA_RAW, XML_DATA_FORMATTED
+from .utils import http, DUMMY_URL
 
-XML_DATA_RAW = '<?xml version="1.0" encoding="utf-8"?><root><e>text</e></root>'
-XML_DATA_FORMATTED = pretty_xml(parse_xml(XML_DATA_RAW))
-
 
 @pytest.mark.parametrize(
@@ -23,10 +20,14 @@ XML_DATA_FORMATTED = pretty_xml(parse_xml(XML_DATA_RAW))
 )
 @responses.activate
 def test_xml_format_options(options, expected_xml):
-    responses.add(responses.GET, URL_EXAMPLE, body=XML_DATA_RAW,
-                  content_type='application/xml')
+    responses.add(
+        responses.GET,
+        DUMMY_URL,
+        body=XML_DATA_RAW,
+        content_type='application/xml',
+    )
 
-    r = http('--format-options', options, URL_EXAMPLE)
+    r = http('--format-options', options, DUMMY_URL)
     assert expected_xml in r
 
 
@@ -42,10 +43,14 @@ def test_valid_xml(file):
     xml_data = file.read_text(encoding=UTF8)
     expected_xml_file = file.with_name(file.name.replace('_raw', '_formatted'))
     expected_xml_output = expected_xml_file.read_text(encoding=UTF8)
-    responses.add(responses.GET, URL_EXAMPLE, body=xml_data,
-                  content_type='application/xml')
+    responses.add(
+        responses.GET,
+        DUMMY_URL,
+        body=xml_data,
+        content_type='application/xml',
+    )
 
-    r = http(URL_EXAMPLE)
+    r = http(DUMMY_URL)
     assert expected_xml_output in r
 
 
@@ -64,10 +69,14 @@ def test_xml_xhtml():
     )
     expected_xml_file = file.with_name(expected_file_name)
     expected_xml_output = expected_xml_file.read_text(encoding=UTF8)
-    responses.add(responses.GET, URL_EXAMPLE, body=xml_data,
-                  content_type='application/xhtml+xml')
+    responses.add(
+        responses.GET,
+        DUMMY_URL,
+        body=xml_data,
+        content_type='application/xhtml+xml',
+    )
 
-    r = http(URL_EXAMPLE)
+    r = http(DUMMY_URL)
     assert expected_xml_output in r
 
 
@@ -78,61 +87,13 @@ def test_invalid_xml(file):
     and none should make HTTPie to crash.
     """
     xml_data = file.read_text(encoding=UTF8)
-    responses.add(responses.GET, URL_EXAMPLE, body=xml_data,
-                  content_type='application/xml')
+    responses.add(
+        responses.GET,
+        DUMMY_URL,
+        body=xml_data,
+        content_type='application/xml',
+    )
 
-    # No formatting done, data is simply printed as-is
-    r = http(URL_EXAMPLE)
+    # No formatting done, data is simply printed as-is.
+    r = http(DUMMY_URL)
     assert xml_data in r
-
-
-@responses.activate
-def test_content_type_from_format_options_argument():
-    """Test XML response with a incorrect Content-Type header.
-    Using the --format-options to force the good one.
-    """
-    responses.add(responses.GET, URL_EXAMPLE, body=XML_DATA_RAW,
-                  content_type='plain/text')
-    args = ('--format-options', 'response.as:application/xml',
-            URL_EXAMPLE)
-
-    # Ensure the option is taken into account only for responses.
-    # Request
-    r = http('--offline', '--raw', XML_DATA_RAW, *args)
-    assert XML_DATA_RAW in r
-
-    # Response
-    r = http(*args)
-    assert XML_DATA_FORMATTED in r
-
-
-@responses.activate
-def test_content_type_from_shortcut_argument():
-    """Test XML response with a incorrect Content-Type header.
-    Using the --format-options shortcut to force the good one.
-    """
-    responses.add(responses.GET, URL_EXAMPLE, body=XML_DATA_RAW,
-                  content_type='text/plain')
-    args = ('--response-as', 'application/xml', URL_EXAMPLE)
-
-    # Ensure the option is taken into account only for responses.
-    # Request
-    r = http('--offline', '--raw', XML_DATA_RAW, *args)
-    assert XML_DATA_RAW in r
-
-    # Response
-    r = http(*args)
-    assert XML_DATA_FORMATTED in r
-
-
-@responses.activate
-def test_content_type_from_incomplete_format_options_argument():
-    """Test XML response with a incorrect Content-Type header.
-    Using the --format-options to use a partial Content-Type without mime type.
-    """
-    responses.add(responses.GET, URL_EXAMPLE, body=XML_DATA_RAW,
-                  content_type='text/plain')
-
-    # The provided Content-Type is simply ignored, and so no formatting is done.
-    r = http('--response-as', 'charset=utf-8', URL_EXAMPLE)
-    assert XML_DATA_RAW in r
@@ -33,7 +33,7 @@ HTTP_OK_COLOR = (
     '\x1b[39m\x1b[38;5;245m \x1b[39m\x1b[38;5;136mOK'
 )
 
-URL_EXAMPLE = 'http://example.org'  # Note: URL never fetched
+DUMMY_URL = 'http://this-should.never-resolve'  # Note: URL never fetched
 
 
 def mk_config_dir() -> Path: