Data Out

It’s just as easy to get data out of a Portal as it is to put data in. You can do this directly from the Portal web page, or you can use HTTP URLs to fetch data. A URL can be submitted directly from the address bar of your browser, which will deliver the data in standard formats such as CSV files, JSON structured files, XML files, or plain JSON text.

You can also retrieve data using your favorite programming language to construct a program that sends URLs and receives data, letting you build analysis and visualization apps that can process your real-time observations. Using JavaScript, you can even build widgets and pages that display your data on your own web site.

We will first describe the URL syntax for retrieving data, and follow this with examples that demonstrate how easy it is to integrate your analysis activities with a CHORDS Portal using Python, HTML, IDL, Matlab, R, sh, etc. You get the idea.

URL Syntax

Sample URLs for fetching data from the Portal use the hostname of your Portal. The fields after “?” are qualifiers, each separated by “&”.

The number following instruments/ is the instrument identifier.

Following the instrument identifier is the format extension, which determines how the data will be returned (.csv, .jsf, .json, or .xml).

Some formats result in a data file being returned to your browser, which can be saved in a directory. The other formats directly return text, which can be easily ingested into programs.

Format  File or Text  Data Product
.csv    File  Data in a comma-separated-value (CSV) file. CSV files can be opened automatically by spreadsheet programs such as MS Excel.
.jsf    File  Data in a JSON structured file. Most scripting languages can easily read JSON into a structured variable.
.xml    File  Data in an eXtensible Markup Language (XML) structured file.
.json   Text  Data in straight JSON format. This format is used to bring data directly into a processing program.

Fields after “?” are qualifier pairs, each separated by “&”. The qualifiers are optional, and are used to refine the data request.

If time qualifiers are not specified, data for the current day are returned.

Qualifier Meaning
start=time Start time of the data span, in ISO 8601 format.
end=time End time of the data span, in ISO 8601 format.
key=value If the Portal has been configured to require a security key for downloading data, it is specified with the key qualifier. Keys are case sensitive.
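Putting the pieces together, a complete data URL is assembled from the hostname, the instrument identifier, the format extension, and the optional qualifiers. A minimal Python sketch of that assembly follows; the hostname, instrument identifier, and key are hypothetical placeholders, not values from an actual Portal.

```python
# Sketch of building a CHORDS data URL from its parts.
# The hostname, instrument id, and key below are hypothetical placeholders.
from urllib.parse import urlencode

host = "my-portal.chordsrt.com"   # hypothetical portal hostname
instrument_id = 1                 # hypothetical instrument identifier

# Optional qualifiers, joined with "&" and appended after "?".
qualifiers = urlencode({
    "start": "2015-08-01T00:00",  # ISO 8601 start of the data span
    "end":   "2015-08-02T00:00",  # ISO 8601 end of the data span
    "key":   "secret-key",        # only if the portal requires a key
})

url = f"http://{host}/instruments/{instrument_id}.json?{qualifiers}"
print(url)
```

Note that urlencode percent-escapes the colons in the ISO 8601 times, which the portal decodes on receipt.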

Programming Examples

Use the "Data" link on the Portal to fetch daily data files using your browser. Various file formats can be selected.

# Fetch the most recent measurements from a portal
import json
import requests

url      = ''  # the portal data request URL (see URL Syntax above)
response = requests.get(url=url)
data     = response.json()
print(json.dumps(data, indent=4, sort_keys=True))
    "Affiliation": "My Organization", 
    "Data": {
        "Time": [
        "b": [
        "p": [
        "rh": [
        "t": [
        "wd": [
        "ws": [
    "Instrument": "Wx Station 3", 
    "Project": "Localhost", 
    "Site": "Boulder"

It appears that there may be cross-domain (CORS) security issues with this approach. We need to look into this.

<h1>CHORDS HTML/Ajax Example</h1>

<script src=""></script>

<script>
var url = "";
$(function () {
    $.getJSON(url, function (data) {
        var table_html = "<table>";
        $.each(data, function (key, val) {
            if (key != "Data") {
                table_html += "<tr><td>" + key + "</td><td>" + val + "</td></tr>";
            } else {
                $.each(val, function (datakey, dataval) {
                    table_html += "<tr><td>Data</td><td>" + datakey
                      + "</td><td>" + dataval + "</td></tr>";
                });
            }
        });
        table_html += "</table>";

        $("<p>", {
            html: table_html
        }).appendTo("body");
    });
});
</script>
; Read CHORDS JSON data into an IDL program.
; url holds the portal data request URL.
oUrl = OBJ_NEW('IDLnetUrl')
json_data = oUrl->Get(URL=url, /STRING_ARRAY)
data = JSON_PARSE(json_data)
    "Project": "CHORDS Testbed",
    "Site": "NCAR Mesa Lab",
    "Affiliation": "NSF EarthCube",
    "Instrument": "ML Wx Station",
    "Data": {
        "Time": [
        "wdir": [
        "wspd": [
        "wmax": [
        "tdry": [
        "rh": [
        "pres": [
        "raintot": [
        "batv": [
% Read CHORDS JSON data into a Matlab program.
% This code uses the JSONlab toolbox from the Matlab File Exchange.
% (Matlab >= R2014b)
% url holds the portal data request URL.
json_data = urlread(url);
inst_data = loadjson(json_data);
inst_data = 

        Project: 'CHORDS Testbed'
           Site: 'NCAR Mesa Lab'
    Affiliation: 'NSF EarthCube'
     Instrument: 'ML Wx Station'
           Data: [1x1 struct]

ans = 

       Time: {'2015-07-28T21:00:51.000Z'}
       wdir: 135
       wspd: 1.4000
       wmax: 4.3000
       tdry: 26.3000
         rh: 24.7000
       pres: 814.6000
    raintot: 453.7000
       batv: 13.9000


# Read CHORDS JSON data into R using the jsonlite package.
library(jsonlite)
url <- ''  # the portal data request URL
data <- fromJSON(txt=url)
[1] "CHORDS Testbed"

[1] "NCAR Mesa Lab"

[1] "NSF EarthCube"

[1] "ML Wx Station"

[1] "2015-07-28T21:35:51.000Z"

[1] 80

[1] 3

[1] 5.8

[1] 25.7

[1] 24.4

[1] 814.4

[1] 453.7

[1] 13.9


# urlcsv and urljson hold the portal data request URLs
# for the .csv and .json formats.
echo 'CSV format:'
curl $urlcsv

echo 'JSON format:'
curl $urljson


CSV format:
Project,CHORDS Testbed
Site,NCAR Mesa Lab
Affiliation,NSF EarthCube
Instrument,ML Wx Station
Time,Wind Direction,Wind Speed,Wind Max,Temperature,Humidity,Pressure,Rain Total,Battery
2015-07-28 22:30:51 UTC,75.0,2.5,6.4,26.3,25.3,814.3,453.7,13.9

JSON format:
{"Project":"CHORDS Testbed","Site":"NCAR Mesa Lab","Affiliation":"NSF EarthCube","Instrument":"ML Wx Station","Data":{"Time":["2015-07-28T22:30:51.000Z"],"wdir":[75.0],"wspd":[2.5],"wmax":[6.4],"tdry":[26.3],"rh":[25.3],"pres":[814.3],"raintot":[453.7],"batv":[13.9]}}