When requesting data from the API, either historic or live, data is returned as an array of objects, where each object represents a single item. Each object is formatted as follows:
{ "datetime":"12/1/2015 12:00:20 AM", "datetime_raw":42339.2085654630, "value":"1,308,422 KByte", "value_raw":1339823865.0000, "value":"178,703 kbit/s", "value_raw":22337843.6979, "value":"1,271,663 KByte", "value_raw":1302183233.0000, "value":"173,682 kbit/s", "value_raw":21710290.6469, "value":"36,758 KByte", "value_raw":37640632.0000, "value":"5,020 kbit/s", "value_raw":627553.0510, "coverage":"100 %", "coverage_raw":10000 } The problem with this is that when parsed, only the last key is visible. This is because the keys are not unique. It is therefore impossible to parse values from a JSON api result and this should be considered a bug. I'd like to propose the following format, which would be an accurate representation based on the XML structure: { "datetime":"12/1/2015 12:00:20 AM", "datetime_raw":42339.2085654630, "values":[ { "channel": "Traffic Total (volume)", "channelid": -1, "value":"1,308,422 KByte", "value_raw":1339823865.0000 }, { "channel": "Traffic Total (speed)", "channelid": -1, "value":"178,703 kbit/s", "value_raw":22337843.6979 }, { "channel": "Traffic In (volume)", "channelid": 0, "value":"1,271,663 KByte", "value_raw":1302183233.0000 } ...and so forth ], "coverage": "100 %", "coverage_raw": "0000010000" }
The difference here is that the values are separate objects contained within an array, so they can be iterated and processed as necessary.
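With the proposed layout, client code becomes trivial. A sketch of how it could be consumed; the sample below is a trimmed-down, hypothetical response in the layout suggested above:

import json

proposed = '''
[
  {
    "datetime": "12/1/2015 12:00:20 AM",
    "datetime_raw": 42339.2085654630,
    "values": [
      {"channel": "Traffic Total (volume)", "channelid": -1,
       "value": "1,308,422 KByte", "value_raw": 1339823865.0},
      {"channel": "Traffic Total (speed)", "channelid": -1,
       "value": "178,703 kbit/s", "value_raw": 22337843.6979}
    ],
    "coverage": "100 %"
  }
]
'''

for item in json.loads(proposed):      # one object per returned item
    for channel in item["values"]:     # channels are now iterable
        print(item["datetime"], channel["channel"], channel["value_raw"])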
This can currently be worked around by requesting XML and using XSLT to convert it to a parseable JSON string, but the XML responses are bulkier and the extra conversion step carries an obvious performance penalty.
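For comparison, a hedged sketch of consuming the XML directly instead of going through XSLT. The element and attribute names below are assumptions chosen to mirror the structure implied by the proposal, not a definitive description of the API's actual XML output:

import xml.etree.ElementTree as ET

xml_sample = '''
<histdata>
  <item>
    <datetime>12/1/2015 12:00:20 AM</datetime>
    <value channel="Traffic Total (volume)" channelid="-1">1,308,422 KByte</value>
    <value channel="Traffic Total (speed)" channelid="-1">178,703 kbit/s</value>
  </item>
</histdata>
'''

root = ET.fromstring(xml_sample)
for item in root.findall("item"):
    dt = item.findtext("datetime")
    for value in item.findall("value"):   # per-channel elements are iterable in XML
        print(dt, value.get("channel"), value.get("channelid"), value.text)

The proposed JSON "values" array would give the same iterability without the bulk of XML or the conversion step.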