This page is fairly technical, though you don't need to be a software developer to read it. For a non-technical overview of the API, start here, or see the sign-up page if you already have compatible software that needs API access keys to unlock its degree-day-based functionality.
At present, the API is easiest to use with Java, .NET (e.g. C# and VB.NET), and Python. We've developed robust client libraries for these platforms:
There's also some sample PHP code that will help you get up and running fairly quickly, albeit not as quickly as with the full client libraries above.
All the client libraries run on top of an XML API, which can be accessed using any programming language. The XML API is robust and stable, but it's not very easy to work with. We suggest you use one of the client libraries above if you can: they are high-performance, full-featured implementations that are very easy to use and will have you fetching data in minutes.
The links above cover the technical details. The rest of this page gives a higher-level overview of some common approaches to integration:
The simplest integration option is often to pull data from the API as and when it's needed. For example, consider:
For systems like these it often makes sense to fetch degree-day data from the API on demand. If you request data from the geographic location of the target building (by specifying a postal/zip code or longitude/latitude position), the API will automatically generate the data you requested using the weather station that's best placed to supply it (considering data quality and coverage as well as distance from the target location).
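To make the on-demand pattern concrete, here's a minimal Python sketch. Everything in it is a stand-in: `fetch_degree_days`, the station ID, and the canned values are hypothetical placeholders, not the real client-library API.

```python
import datetime

def fetch_degree_days(postal_code, country_code, last_n_days):
    """Hypothetical stand-in for an API call that requests daily degree
    days for a geographic location. The real API would automatically pick
    the best nearby weather station; here we just return canned data."""
    today = datetime.date.today()
    values = {today - datetime.timedelta(days=i): 10.0
              for i in range(1, last_n_days + 1)}
    return {'station_id': 'EGLL', 'values': values}

# On demand: fetch the last 7 days of data for the building's location
# at the moment a user opens the relevant report.
result = fetch_degree_days('WC2N 5DU', 'GB', 7)
```

The key point is that the request specifies only the building's location and the range of data wanted; station selection is left entirely to the API.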
This on-demand approach is particularly likely to make sense if:
When the locations of interest remain fairly constant it can often make sense to build and maintain your own local database of degree days. This is a common pattern for:
Your system will probably start by fetching historical data for all of your target locations (e.g. the locations of all the buildings of interest). The API enables you to fetch data from a geographic location (expressed as a postal/zip code or longitude/latitude position). When you fetch data from a geographic location, the API will automatically select the best weather station to use, considering data quality and coverage as well as distance from the target location.
Some stations have been recording temperatures for longer than others, and the quality and coverage of any given station can vary over time. So, when fetching data from a geographic location, make sure to request the full range of historical data that you want, so that the API's station-selection algorithm will know to favour stations with that level of coverage.
If you request more data than any nearby station can supply, the API will, by default, return what it can from within the range you requested. Except in the unlikely event of your geographic location having no active stations near it, recent data should always be available ("recent" meaning to within around 10 days of yesterday in the location's local time zone, and usually up to and including yesterday). And the API will never return data with gaps in it. But there are limits on how far back in time you can go, so you might find data missing from the start of your requested range. If you'd rather receive an error than a partial set of data, you can specify a minimum required range in your request.
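Sketched in Python (with a hypothetical `fetch_history` stand-in for the real API call), an initial fill might distinguish between the range you'd ideally like and the minimum you actually require. Note that the real API can enforce a minimum-required range server-side and return an error; this sketch does the equivalent check client-side purely for illustration.

```python
import datetime

def fetch_history(location, first_day_wanted):
    """Hypothetical API stand-in: returns whatever daily data a nearby
    station can supply from within the requested range. Here the canned
    station only has data going back roughly 3 years."""
    earliest_available = datetime.date.today() - datetime.timedelta(days=3 * 365)
    start = max(first_day_wanted, earliest_available)
    days = (datetime.date.today() - start).days
    return {start + datetime.timedelta(days=i): 12.5 for i in range(days)}

wanted_start = datetime.date.today() - datetime.timedelta(days=5 * 365)        # 5 years wanted
min_required_start = datetime.date.today() - datetime.timedelta(days=2 * 365)  # but 2 would do

data = fetch_history('WC2N 5DU', wanted_start)
actual_start = min(data)
if actual_start > min_required_start:
    raise ValueError('Not enough history available near this location')
# Otherwise store what came back, even though it's less than the 5 years wanted.
```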
To keep your database up to date, you'll probably want to fetch fresh data each day, week, or month (or a similar timescale of your choice).
You can use the geographic locations for updates (and in some cases this might be fine), but bear in mind that the API might choose different stations to the ones it used for the initial fill. Station selection depends on the data you're requesting as well as the location you're requesting it for.
When you initially fetched data from a geographic location, the API would have also returned the station ID that it used to generate the data. If you store each assigned station ID with each geographic location in your database, you can use those IDs to update the stored data as time goes on. Some nearby geographic locations might be assigned the same station ID, so you can save on API requests by fetching updates for shared station IDs just once.
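The station-ID bookkeeping can be sketched in a few lines of Python (the station IDs and locations below are made-up placeholders):

```python
# Station IDs as stored alongside each location after the initial fill.
stored = {
    'building-a': 'KLGA',   # same neighbourhood as building-b,
    'building-b': 'KLGA',   # so the API assigned the same station
    'building-c': 'KBOS',
}

# Group locations by station so each station's update is fetched once.
stations_to_update = set(stored.values())

# 3 locations, but only 2 update requests needed:
updates = {station: 'fresh data for ' + station
           for station in stations_to_update}

for location, station in stored.items():
    latest = updates[station]   # each location reuses its station's update
```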
That's it for the overview of building and maintaining a local database of degree days. But it may also be worth ensuring that your system can robustly handle the edge cases explained below.
These edge cases are mainly relevant if you have built a local database of degree days and are keeping it up to date with fresh data as it becomes available. We've thought hard about the edge cases and designed the API to make it as easy as possible for you to handle them. But please don't feel you need to follow our prescription if you don't think it's necessary for your application.
Although a station might be working well today, there's no guarantee that it will still be working well next month or next year. Unfortunately not even the best "airport" stations managed by organizations such as the NOAA or the UK Met Office are exempt from reliability problems.
If you're only storing data from a handful of locations, it might not be worth worrying about the possibility of one of your stations going down. But, if you're storing data from hundreds or thousands of locations, it's likely that you'll run into station downtime at some point, so you might want your system to be prepared for it. We've designed our system to make it as easy as possible for you to handle station downtime in a robust manner.
Small patches of downtime are automatically filled with estimated data. But our system will only do this when it has temperature readings on both sides of the gap. So, if a station goes down for a while, its most recent data won't be available until it comes back up again. Bear this in mind if you're fetching updates at the start of each day/week/month: if a station went down towards the end of the last day/week/month, it will need to come back up again before our system can patch the gap with estimated data and supply a value for that last day/week/month.
If you need the latest data for a given location, but the station used for your initial fill doesn't yet have it, you can put in a request for the missing data from the underlying geographic location of interest. Specify a minimum-required range that includes the latest data, and, provided you haven't got your timezones mixed up, the API will hopefully find a stand-in station that can supply the data you need. If you store the stand-in data, you could replace it later if/when your original station recovers (if you think it's important to use a consistent source).
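A rough sketch of that fallback, with both fetch functions as hypothetical stand-ins (the real API expresses the minimum-required range in the request itself):

```python
import datetime

YESTERDAY = datetime.date.today() - datetime.timedelta(days=1)

def fetch_from_station(station_id):
    """Hypothetical stand-in: this station went down 3 days ago, so its
    most recent data stops short of yesterday."""
    stop = YESTERDAY - datetime.timedelta(days=3)
    return {stop - datetime.timedelta(days=i): 8.0 for i in range(10)}

def fetch_from_location(postal_code, min_required_last_day):
    """Hypothetical stand-in for a geographic-location request with a
    minimum-required range: the API finds a stand-in station whose data
    runs up to yesterday."""
    return {YESTERDAY - datetime.timedelta(days=i): 8.5 for i in range(10)}

data = fetch_from_station('KLGA')
if YESTERDAY not in data:
    # The usual station hasn't reported yet; ask the API to find a
    # stand-in station that can supply the latest data we need.
    data = fetch_from_location('10019', min_required_last_day=YESTERDAY)
```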
A long period of downtime will result in a station being labelled "inactive". This happens if a station doesn't report any usable temperature readings for around 10 days or more (10 being an approximate number that is subject to change).
If you try to request data from an inactive station, you'll get a LocationNotSupported failure. (In .NET or Java this will appear as a LocationException, in Python a LocationError.) This is an indication that you should find an alternative station to use as a replacement. Typically this would involve you making another request for data from the geographic location that you ultimately want the data for (e.g. the location of the target building) so that the API can automatically choose a replacement station for you.
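The recovery pattern looks roughly like this in Python. Note that the `LocationError` class and both fetch functions here are simplified stand-ins for sketching purposes, not the real client-library types:

```python
class LocationError(Exception):
    """Simplified stand-in for the failure the Python client raises when
    a requested station is inactive."""

def fetch_from_station(station_id):
    # Hypothetical stand-in: this station has stopped reporting.
    raise LocationError(station_id + ' is inactive')

def fetch_from_location(postal_code):
    # Hypothetical stand-in: the API picks a replacement station for the
    # building's geographic location and returns its ID with the data.
    return {'station_id': 'KJFK', 'values': [5.0, 6.0, 4.5]}

try:
    result = fetch_from_station('KLGA')
except LocationError:
    # Fall back to the geographic location so the API can choose a
    # replacement station; store the new station ID for future updates.
    result = fetch_from_location('10019')

new_station_id = result['station_id']
```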
There can occasionally be some volatility in the most recent data. Sometimes a station's automated reporting system goes down, then comes back up, leaving a gap in the reported data. As explained above, our system will plug that gap with estimates, but occasionally the station will recover the missing readings and report them several days later, enabling our system to calculate the degree days more accurately.
It's unusual for this sort of thing to happen, but it can. Such volatility generally only affects the latest 10 or so days of data, so it's easy to counteract by fetching a little more data than you need each time you update your database. For example, instead of fetching the latest day, fetch the latest 30 days; instead of fetching the latest week, fetch the latest 4 weeks; instead of fetching the latest month, fetch the latest 2 months. Overwrite any previously-stored values with the most recently fetched values. The vast majority of the time it will make no difference (and when it does, the difference will almost always be small), but it's a good way to maintain data quality without adding much complexity.
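The fetch-extra-and-overwrite approach amounts to a dictionary update. In this sketch, `fetch_latest` is a hypothetical stand-in whose recent values differ slightly from earlier estimates, as they might after a station recovers missing readings:

```python
import datetime

TODAY = datetime.date.today()

# Previously stored daily values, the last few of which were partly
# estimated while the station was briefly down.
stored = {TODAY - datetime.timedelta(days=i): 10.0 for i in range(1, 61)}

def fetch_latest(n_days):
    """Hypothetical API stand-in: the station later recovered its missing
    readings, so recent values now differ slightly from the estimates."""
    return {TODAY - datetime.timedelta(days=i): 9.8
            for i in range(1, n_days + 1)}

# Fetch more than strictly needed (30 days instead of 1) and overwrite,
# so any revised recent values replace the earlier estimates.
stored.update(fetch_latest(30))
```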
To use the API you need API access keys. These exist so you can ensure that your API account is only used by the people and software systems that you've authorized to use it.
If you're making an internal system, or a public-facing web application that is hosted on servers you control, you will probably only need one API account and one set of access keys. You can securely embed them into your application without much risk of them escaping. That's the simple case.
If you are making installable software (a desktop, mobile, or server application that your users install themselves), it would be unwise to embed your API access keys in that application. If those access keys escape, anyone who gets hold of them will be able to use up the data-generation capacity you have reserved for your application. And you won't be able to replace the compromised access keys without redistributing a new version of your application.
One good option is to leave it to your customers to get their own API accounts and enter their access keys into their installations of your software to unlock the degree-day-based functionality. Just point them to our sign-up page and tell them where to enter their API access keys once they've subscribed. We have designed our account plans specifically to support this model.
This is a particularly good approach if not all of your customers want the degree-day-based functionality: you can leave it as an optional feature that they can unlock if they want it, or ignore if they don't. It's also a good option if you expect some of your customers to make heavy use of the API (e.g. if you're making software for large multi-site organizations).
Another option is to build an intermediate server. The customer's installed application would request data from your intermediate server, which would fetch the data from our API (using access keys that are securely stored on your server only) and pass it back down to the customer's application. This approach is more complicated, and it introduces a new potential point of failure (the intermediate server), but it does give you tighter control over the licensing side of things.
Running things through an intermediate server makes particular sense if you're building an installable application for the consumer market: although our low-end accounts are attractively priced for businesses, they're not really priced for consumers. If you're selling a $2.99 iPhone app for residential energy tracking, your non-business customers are unlikely to want to pay for an API subscription from us.
Also, with a mass-market consumer application you will find, at scale, that many of your customers share the same weather stations (as groups of them will live in the same neighborhoods). By routing everything through a server that you control, you can cache data locally, reduce your API usage by generating data for each station only once, and pass on the cost savings to your customers. With a few thousand customer locations this sort of caching may be more development effort than it's worth, given the limited data-reuse it would make possible at that scale. But with hundreds of thousands or millions of customer locations it could certainly be worthwhile, particularly for consumer applications where keeping costs down is a priority.
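The caching idea on the intermediate server reduces to keying a local cache by station ID, so each station's data is generated via the API only once however many customers share it. A minimal sketch, with `fetch_from_api` as a hypothetical stand-in for the real API call:

```python
api_requests_made = 0

def fetch_from_api(station_id):
    """Hypothetical stand-in for a real API call made by the server."""
    global api_requests_made
    api_requests_made += 1
    return 'degree days for ' + station_id

cache = {}

def serve_customer(station_id):
    """Intermediate-server handler: generate each station's data once,
    then serve it from the local cache to every customer who shares it."""
    if station_id not in cache:
        cache[station_id] = fetch_from_api(station_id)
    return cache[station_id]

# Three customers in the same neighbourhood share one station:
for _ in range(3):
    serve_customer('KLGA')
serve_customer('KBOS')
```

In production the cache would also need an expiry policy tied to your update timescale (and the overlap-fetching described earlier), but the request-saving principle is the same.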
If you're a programmer, we suggest you take a look at the Java quick-start guide, the .NET quick-start guide, the Python quick-start guide, the PHP sample code, or the XML API docs. With the Java, .NET, and Python client libraries you can literally be fetching data from the API within the next few minutes. How best to integrate with the API will probably become a lot clearer once you're familiar with the code itself.