I'm trying to serialize a C# DataTable using Newtonsoft Json.NET's JsonConvert.

Code:

JsonConvert.SerializeObject(dt); //dt is DataTable

The result I got is:

[
  {
    "Name": "Tiger Nixon",
    "Position": "System Architect",
    "Address": "Edinburgh",
    "No": "5421"
  },
  {
    "Name": "Garrett Winters",
    "Position": "Accountant",
    "Address": "Tokyo",
    "No": "8422"
  }
]

The result I want is:

{
  "data": [
    [
      "Tiger Nixon",
      "System Architect",
      "Edinburgh",
      "5421"
    ],
    [
      "Garrett Winters",
      "Accountant",
      "Tokyo",
      "8422"
    ]
  ]
}

Is it possible to customize the output using Newtonsoft? I tried writing my own code to serialize the DataTable with a foreach loop, but the performance is night and day compared to Newtonsoft's.

Any help will be appreciated

  • Does this link help you out at all? Here is the direct documentation from Newtonsoft: Custom JsonConverter. Commented Apr 30, 2015 at 2:48
  • Do you only need to serialize, or also deserialize? Without the column names, deserializing to a DataTable looks problematic. Commented Apr 30, 2015 at 3:43
  • @JasonWilczak: works like a charm! But do you know why the performance difference is so noticeable compared to my own code? My code is slow because I loop through a large DataTable, yet the custom JsonConverter does the same thing and is very fast. Commented Apr 30, 2015 at 3:53

3 Answers

You could do it with the following JsonConverter:

public class DataTableTo2dArrayConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return typeof(DataTable).IsAssignableFrom(objectType);
    }

    // Write-only converter: the positional arrays don't carry column names,
    // so reading them back into a DataTable isn't supported.
    public override bool CanRead { get { return false; } }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        var table = (DataTable)value;
        // Project each row into a sequence of its column values, lazily.
        var array2d = table.AsEnumerable().Select(row => table.Columns.Cast<DataColumn>().Select(col => row[col]));
        serializer.Serialize(writer, new { data = array2d });
    }
}

And then use it like:

var settings = new JsonSerializerSettings();
settings.Converters.Add(new DataTableTo2dArrayConverter());
var json = JsonConvert.SerializeObject(dt, Formatting.Indented, settings);

Note that my use of System.Data.DataTableExtensions.AsEnumerable() requires a reference to System.Data.DataSetExtensions.dll.
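If you want the converter applied everywhere without passing settings each time, Json.NET (5.0+) also lets you register it globally. A minimal sketch, assuming the DataTableTo2dArrayConverter class above and a populated DataTable dt:

```csharp
using System.Data;
using Newtonsoft.Json;

// Register the converter application-wide; every subsequent
// JsonConvert call will pick it up automatically.
JsonConvert.DefaultSettings = () => new JsonSerializerSettings
{
    Converters = { new DataTableTo2dArrayConverter() }
};

var json = JsonConvert.SerializeObject(dt); // no settings argument needed
```

Note that DefaultSettings affects every serialization in the process, so prefer the explicit settings object if only some call sites need the 2D-array shape.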


2 Comments

I did some performance testing and my code is slightly better in performance (milliseconds, so it doesn't matter), but yours is much simpler, so marked as answer. Thanks!
Thanks. LINQ does sometimes add a small performance hit compared to explicit loops; there might also have been a performance hit when Json.NET reflected over the intermediate representation.
Here's how I did it after I read the link provided by JasonWilczak:

public class JqueryDatatablesConverter : JsonConverter
{

    public override bool CanConvert(Type objectType)
    {
        return typeof(DataTable).IsAssignableFrom(objectType);
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        var dt = (DataTable)value;       // avoid 'dynamic': late binding on every access slows the converter down
        int count = dt.Columns.Count - 1;

        writer.WriteStartObject();
        writer.WritePropertyName("data");
        writer.WriteStartArray();

        foreach (DataRow dr in dt.Rows) {
            writer.WriteStartArray();
            for (int x = 0; x <= count; x++) {
                serializer.Serialize(writer, dr[x]);
            }
            writer.WriteEndArray();
        }

        writer.WriteEndArray();
        writer.WriteEndObject();

    }
}
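This converter is registered the same way as the one in the accepted answer. A short sketch, assuming the JqueryDatatablesConverter class above and a populated DataTable dt:

```csharp
using System.Data;
using Newtonsoft.Json;

var settings = new JsonSerializerSettings();
settings.Converters.Add(new JqueryDatatablesConverter());

// Formatting.None keeps the payload compact, which suits a
// jQuery DataTables ajax data source.
var json = JsonConvert.SerializeObject(dt, Formatting.None, settings);
```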

Here is a working fiddle that demonstrates it. For more information, see the Newtonsoft documentation on Custom JsonConverter.

DataTableJsonConverter

Convert a DataTable into a custom JSON string.

public class DataTableJsonConverter : JsonConverter
{
    public override void WriteJson(JsonWriter w, object v, JsonSerializer s)
    {
        w.WriteStartObject();
        w.WritePropertyName("data");
        w.WriteStartArray();
        foreach(DataRow r in (v as DataTable).Rows)
        {
            w.WriteStartArray();
            foreach(var c in r.ItemArray)
            {
                w.WriteValue(c);
            }
            w.WriteEndArray();
        }
        w.WriteEndArray();
        w.WriteEndObject();
    }

    public override object ReadJson(JsonReader r, Type t, object v, JsonSerializer s)
    {
        throw new NotImplementedException("Unnecessary: CanRead is false.");
    }

    public override bool CanRead { get { return false; } }

    public override bool CanConvert(Type objectType)
    {
        return objectType == typeof(DataTable);
    }
}

Here's How to Use It

public class Program
{
    public static void Main()
    {
        var dt = SeedData();

        var json = JsonConvert.SerializeObject(
                dt, Newtonsoft.Json.Formatting.Indented,
                new DataTableJsonConverter());

        Console.WriteLine(json);
    }

    public static DataTable SeedData()
    {
        var dt = new DataTable();
        dt.Columns.Add("Name");
        dt.Columns.Add("Position");
        for (var i = 0; i < 2; ++i)
        {
            dt.Rows.Add(new object[] { "Shaun", "Developer" });
        }
        return dt;
    }
}

Here's Its Output

{
  "data": [
    [
      "Shaun",
      "Developer"
    ],
    [
      "Shaun",
      "Developer"
    ]
  ]
}

Performance

For those who are interested, here is a fork of the fiddle that compares the performance of the three converters from me, dbc, and warheat1990 over 1500 data rows, two runs each. They are all very close, and for reasons unknown to me, the second run is always faster.

DataTableJsonConverter:6 ms
DataTableJsonConverter:2 ms
DataTableTo2dArrayConverter:251 ms
DataTableTo2dArrayConverter:11 ms
JqueryDatatablesConverter:1580 ms
JqueryDatatablesConverter:16 ms
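Roughly how each timing can be taken: a Stopwatch sketch, not the exact fiddle code, assuming the DataTableJsonConverter class above and a seeded DataTable dt:

```csharp
using System;
using System.Data;
using Newtonsoft.Json;

var sw = System.Diagnostics.Stopwatch.StartNew();
var json = JsonConvert.SerializeObject(dt, new DataTableJsonConverter());
sw.Stop();

// The first run includes JIT compilation of the converter code,
// so time at least two runs before comparing numbers.
Console.WriteLine("DataTableJsonConverter:" + sw.ElapsedMilliseconds + " ms");
```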

1 Comment

A few years late, but to answer the question of why the second run is always faster: in .NET the first run also pays for JIT-compiling the converter code, and at the hardware level it's usually due to speculative execution and/or branch prediction in the processor. On the first run the processor makes uneducated guesses and learns a little more about the workload; the second time around, it knows a lot more and can make educated guesses about how best to speed up the task. It's also why running a task on ordered data is faster than running it on unordered data.
