I have a high-throughput TCP server application that uses JSON as its message serialization format, via the Json.NET library (latest, 5.0.6).
With a recent increase in clients, I've started seeing heavy memory fragmentation from byte[] allocations. After some testing, it appears that the deserialization itself is a major contributor to the fragmentation.
To deserialize a message, I'm using:

    using (var stream = new MemoryStream(value, 0, length.Value))
    using (var reader = new JsonTextReader(new StreamReader(stream, encoding)))
    {
        var serializer = JsonSerializer.Create(JSON_SERIALIZER_SETTINGS);
        return serializer.Deserialize<TResult>(reader);
    }

where 'value' is a byte[] holding the raw message.
Is there any way to optimize this? Is it possible to tell the JsonSerializer/JsonTextReader to rent any internal byte/char buffers it needs from a pool I control, instead of allocating new ones per message?
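To make the intent concrete, this is roughly the kind of hook I'm hoping exists (the `BufferPool` property and `myBufferPool` below are hypothetical, just to illustrate what I'm after; they are not an actual Json.NET 5.0.6 API):

    // Hypothetical -- illustrating the kind of API I'm looking for:
    // let the reader rent/return its internal buffers from a pool I control,
    // so each deserialized message doesn't allocate fresh buffers.
    var reader = new JsonTextReader(new StreamReader(stream, encoding))
    {
        BufferPool = myBufferPool // does not exist in Json.NET 5.0.6
    };

If nothing like this exists, I'd also be interested in caller-side workarounds that reduce the per-message allocations.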