| s-/u-/int32<br />s-/fixed32 | `number` (32 bit integer) | `value | 0` if signed<br />`value >>> 0` if unsigned
| s-/u-/int64<br />s-/fixed64 | `Long`-like (optimal)<br />`number` (53 bit integer) | `Long.fromValue(value)` with long.js<br />`parseInt(value, 10)` otherwise
| float<br />double | `number` | `Number(value)`
| bool | `boolean` | `Boolean(value)`
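The numeric and boolean conversions in the table can be sketched in plain JavaScript. The helper names below are illustrative only, not protobuf.js internals; they just apply the listed expressions:

```javascript
// Illustrative sketch of the conversion rules in the table above
// (helper names are assumptions, not the protobuf.js API).
function toInt32(value)  { return value | 0; }            // wraps to signed 32 bit
function toUint32(value) { return value >>> 0; }          // wraps to unsigned 32 bit
function toInt64(value)  { return parseInt(value, 10); }  // 53 bit precision without long.js
function toFloat(value)  { return Number(value); }
function toBool(value)   { return Boolean(value); }

console.log(toInt32(4294967295)); // -1 (wraps around)
console.log(toUint32(-1));        // 4294967295
console.log(toBool(0));           // false
```

Note how `| 0` and `>>> 0` coerce out-of-range values by wrapping, which is why the expected input type is a 32 bit integer in the first place.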
Performance
-----------

The package includes a benchmark that compares protobuf.js performance to native JSON (as far as this is possible) and [Google's JS implementation](https://github.com/google/protobuf/tree/master/js). On an i7-2600K running node 6.9.1 it yields:
```
benchmarking encoding performance ...

protobuf.js (reflect) x 547,366 ops/sec ±1.29% (90 runs sampled)
protobuf.js (static) x 525,722 ops/sec ±1.17% (91 runs sampled)
JSON (string) x 311,180 ops/sec ±0.67% (93 runs sampled)
JSON (buffer) x 183,724 ops/sec ±0.69% (92 runs sampled)
google-protobuf x 76,337 ops/sec ±0.73% (91 runs sampled)

protobuf.js (reflect) was fastest
protobuf.js (static) was 3.8% slower
JSON (string) was 42.8% slower
JSON (buffer) was 66.2% slower
google-protobuf was 86.0% slower

benchmarking decoding performance ...

protobuf.js (reflect) x 1,401,958 ops/sec ±0.78% (93 runs sampled)
protobuf.js (static) x 1,391,017 ops/sec ±0.78% (90 runs sampled)
JSON (string) x 301,749 ops/sec ±0.88% (93 runs sampled)
JSON (buffer) x 268,792 ops/sec ±0.84% (90 runs sampled)
google-protobuf x 186,727 ops/sec ±0.81% (90 runs sampled)

protobuf.js (reflect) was fastest
protobuf.js (static) was 0.8% slower
JSON (string) was 78.5% slower
JSON (buffer) was 80.8% slower
google-protobuf was 86.7% slower

benchmarking combined performance ...

protobuf.js (reflect) x 274,685 ops/sec ±0.99% (89 runs sampled)
protobuf.js (static) x 278,352 ops/sec ±1.00% (90 runs sampled)
JSON (string) x 129,638 ops/sec ±0.83% (91 runs sampled)
JSON (buffer) x 90,904 ops/sec ±0.93% (87 runs sampled)
google-protobuf x 43,327 ops/sec ±0.89% (90 runs sampled)

protobuf.js (static) was fastest
protobuf.js (reflect) was 1.3% slower
JSON (string) was 53.3% slower
JSON (buffer) was 67.3% slower
google-protobuf was 84.4% slower
```

Note that JSON is a native binding nowadays and as such is about as fast as it possibly can get. So, how can protobuf.js be faster?

* The benchmark is [somewhat flawed](https://github.com/dcodeIO/protobuf.js/blob/master/bench/index.js).
* Reader and writer interfaces configure themselves according to the environment to eliminate redundant conditionals.
* Node-specific reader and writer subclasses benefit from node's buffer binding.
* Reflection has built-in code generation that builds type-specific encoders, decoders and verifiers at runtime.
* Encoders and decoders do not implicitly call `verify` on messages, avoiding unnecessary overhead where messages are already known to be valid. It's up to the user to call `verify` where necessary.
* Quite a bit of V8-specific profiling accounts for everything else.
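The point about `verify` being an explicit, separate step can be shown with a dependency-free sketch. The function names and message shape here are assumptions for illustration; only the return contract (`null` when valid, an error string otherwise) mirrors protobuf.js:

```javascript
// Sketch of the "no implicit verify" design: validation is a separate,
// optional step the caller invokes only for untrusted input.
function verify(message) {
    if (typeof message.id !== "number") return "id: number expected";
    return null; // null means valid, mirroring Type.verify's contract
}

function encode(message) {
    // Encoders assume a valid message and skip all checks for speed.
    return JSON.stringify(message); // stand-in for the real wire format
}

const msg = { id: 42 };
const err = verify(msg); // call verify only where necessary
if (err) throw Error(err);
const wire = encode(msg);
console.log(wire); // {"id":42}
```

Skipping validation on the hot path is a deliberate trade-off: encoding trusted, already-validated messages pays no per-field checking cost.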
You can also run [the benchmark](https://github.com/dcodeIO/protobuf.js/blob/master/bench/index.js) ...