At some point, the big_data.json fixture became out of sync with the
TestModel type, so decoding was failing and these tests were useless.
I've generated a new big_data.json fixture that matches the expected
TestModel properties, and increased its size so that performance issues
can actually be measured. I've also added a test asserting that
decoding completes successfully so that we don't run into this issue in
the future.
JavaScript can't represent large integers precisely. Instead, any
number over 53 bits is supposed to be serialized as a string. In order
to support this, Argo should really add the ability to decode `Int64`
instances from `String`s, in addition to `Number`s.
Turns out, we weren't testing the decoding of dictionaries anywhere.
That seems wrong, so this adds those tests and (thankfully) proves that
our current implementation works as expected.
Also, we ran into a Swift bug that I hit the other day as well and
still need to file a radar on. The title will probably end up being:
> flatMapping custom monadic type with static function defined in a
> protocol extension results in compiler crash
That's why we're duplicating the meat of `flatMap` for the purpose of
the test.
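The shape of the workaround looks roughly like this. `Decoded` below is a hypothetical stand-in that only mirrors the shape of a custom monadic type; the point is that the test defines `flatMap` as a free function rather than calling a static one from a protocol extension:

```swift
// Hypothetical stand-in for a custom monadic type.
enum Decoded<T> {
    case success(T)
    case failure(String)
}

// Free function duplicating the meat of `flatMap`, instead of
// invoking a static function defined in a protocol extension
// (which crashed the compiler at the time).
func flatMap<T, U>(_ x: Decoded<T>, _ f: (T) -> Decoded<U>) -> Decoded<U> {
    switch x {
    case let .success(value): return f(value)
    case let .failure(message): return .failure(message)
    }
}
```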
This also breaks up the main README.md into a Documentations directory
with separate documents for different topics. This will be expanded upon
with new PRs going forward.
Somehow this compiled. I'm legitimately not sure how. Maybe I have an
old version of the framework somewhere that Xcode can find. No clue. But
we don't need these references to Runes sitting around now that we
aren't depending on it.
This operator chooses between either the left hand side or the right
hand side, whichever contains a successful value. If both sides are
empty, the right hand side will be returned.
Usage is as follows:
`<*> j <| "possible-key" <|> j <| "possible-key2"`
The precedence is set a bit below the precedence of the other operators
(i.e. `<|`) so that parentheses are not necessary when constructing
expressions.
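A sketch of the operator over `Optional`, for illustration. The precedence group name is an assumption; the real definition just needs to bind looser than `<|` so the parentheses stay optional:

```swift
// Hypothetical precedence group; it only needs to sit below the
// precedence of the decoding operators like `<|`.
precedencegroup AlternativePrecedence {
    associativity: left
    lowerThan: NilCoalescingPrecedence
}

infix operator <|> : AlternativePrecedence

// Choose the left-hand side if it holds a value; otherwise return
// the right-hand side, even if it is also empty.
func <|> <T>(lhs: T?, rhs: T?) -> T? {
    return lhs ?? rhs
}
```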
On 32-bit platforms (namely the iPhone 4S and iPhone 5 on iOS 8+) the
Swift `Int` type only goes up to 2^31 - 1. Even though there are
technically no limits placed upon JSON numeric values, they are often
designed around JavaScript clients, which have a maximum safe integer
value of 2^53. I've added explicit support for the `Int64` type to
support this limit.
It's probably important to mention that if a numeric value greater
than 2^31 is present in the data and you are on a 32-bit device using a
plain `Int` type, you get a silent overflow on decoding.
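A small sketch of the overflow in question, using `NSNumber` accessors as stand-ins for what decoding into the two types amounts to (on a 64-bit machine a plain `Int` is safe, so the 32-bit case is simulated via `int32Value`):

```swift
import Foundation

// 2^31: one past Int32.max, the first value a 32-bit Int can't hold.
let raw = NSNumber(value: 2_147_483_648 as Int64)

let asInt64 = raw.int64Value // preserves the value on any platform
let asInt32 = raw.int32Value // what a 32-bit `Int` amounts to: silently overflowed
```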
Using the update/append functionality of `Dictionary` instead of
implementing a custom merge operator `+`, we see significant speed
increases for large data sets. Given an ~85 KB JSON file, parsing took
~265ms without this optimization and ~75ms with it.
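The two approaches look roughly like this; the function names are illustrative, not Argo's API. The custom `+` builds and returns a fresh dictionary per merge, while updating in place mutates one accumulator:

```swift
// Slower approach: a custom merge operator that copies the left-hand
// dictionary on every merge.
func + <K, V>(lhs: [K: V], rhs: [K: V]) -> [K: V] {
    var result = lhs
    for (key, value) in rhs {
        result[key] = value
    }
    return result
}

// Faster approach: update/append into a single accumulator, avoiding
// repeated copies across a large data set.
func mergeInPlace<K, V>(_ target: inout [K: V], _ source: [K: V]) {
    for (key, value) in source {
        target[key] = value
    }
}
```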
`JSONValue.parse` returns a non-optional `JSONValue`. Unfortunately,
because of Swift Magic, Xcode will happily compile when using `flatMap`
with a function that returns a non-optional. This can (but doesn't
always) result in an EXC_BAD_ACCESS error at runtime. Using `map` here
is actually the correct behavior, and should also fix the crash that
people were seeing.
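The distinction in miniature, with a stand-in for `JSONValue.parse` (the real function takes and returns different types; what matters is the non-optional return):

```swift
// Stand-in for JSONValue.parse: takes a value, returns a non-optional.
func parse(_ s: String) -> Int {
    return s.count
}

let input: String? = "hello"

// `map` is the correct shape for a (Wrapped) -> U function that
// returns a non-optional; `flatMap` expects (Wrapped) -> U? and only
// compiles here via implicit optional promotion.
let result = input.map(parse)
```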
Introducing this typealias helps when trying to conform a non-final
class to `JSONDecodable`.
If we just use `Self`, we run into an issue where the compiler can't
ensure that return type is correct, because the class might be
subclassed. By letting classes explicitly define their decoded type, we
can make the compiler happy while maintaining the ability to subclass if
that's what we want.
As an added benefit, we can default `DecodedType` to `Self`, which means
that `struct`s and `final` classes don't need to change anything from
the way they are doing it now.
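A sketch of the shape this describes. The protocol below is modeled on the description, not Argo's exact definition, and the `decode` signature is simplified to take a `String` for illustration:

```swift
// Illustrative protocol: DecodedType defaults to Self, so structs and
// final classes keep working unchanged.
protocol JSONDecodable {
    associatedtype DecodedType = Self
    static func decode(_ value: String) -> DecodedType?
}

// A non-final class names its DecodedType explicitly, keeping the
// compiler happy while leaving subclassing possible.
class User: JSONDecodable {
    typealias DecodedType = User
    let name: String
    init(name: String) { self.name = name }
    static func decode(_ value: String) -> User? {
        return User(name: value)
    }
}

// A struct changes nothing: DecodedType falls back to Self.
struct Point: JSONDecodable {
    let x: Int
    static func decode(_ value: String) -> Point? {
        return Int(value).map(Point.init)
    }
}
```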
The rest of the system treats JSON decoding errors as fatal, so we
should do the same for Arrays. Instead of ignoring decoding errors in
the nested JSON, an error should result in `.None` for the entire array.
This value helps reduce the complexity of type inference.
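The all-or-nothing behavior can be sketched like this; `decodeArray` is an illustrative helper, not Argo's API:

```swift
// If any element fails to decode, the whole array decodes to nil
// instead of silently dropping the bad element.
func decodeArray<T>(_ values: [String], _ decode: (String) -> T?) -> [T]? {
    var result: [T] = []
    for value in values {
        guard let decoded = decode(value) else {
            return nil // decoding errors are fatal for the entire array
        }
        result.append(decoded)
    }
    return result
}
```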
This commit also takes a first pass at changing the JSON decoding
operators from infix to prefix, and at using function composition
instead of function application.