I'm close to wrapping up an ambitious project to get historical data going back to 2005 out of Virtual Weather Station (VWS) and into WeatherCat, and from there into MySQL for easy viewing and comparison of historical weather data.
Check out
http://TWSheppard.homeip.net/Weather/Viewer.php.
The site may be unstable as I continue to tinker, but hopefully you'll be able to have a look. It's not mobile-phone friendly but I'll think about fixing that.
This page lets me view historical data for temperature, humidity, wind speed, gust and direction, pressure, indoor temperature, and one extra sensor that lives in the doghouse in winter and the greenhouse in summer. I can scroll back and forth, enter the start or end dates by typing them in or using a datepicker, auto-refresh the current weather, and choose spans of one hour, day, week, month, or year. I can also compare with the previous day, week, month, or year as appropriate.
There are some holes in my data because VWS screwed up near the end of 2010, which I didn't notice until I converted to WC. I had always used Weather Underground to view historical data, but its interface is clumsy and slow, so I decided to build my own.
For those interested in the technical side of things, here are some brief notes:
This was a huge learning experience for me. I had dabbled in all of the technologies but this project really stretched me and forced me to learn a lot.
I first built the MySQL database using MySQL Workbench, but abandoned it after it performed a destructive sync. I now use the Sequel Pro app to construct and view the database.
With the structure in place, I wrote a Perl script that processed the old WC .cat files and inserted the historical data into the database tables. There are three tables: one for the one-minute samples, plus hourly and daily max/min/avg. I found that plotting the one-minute samples directly made the plots look ragged at spans of a month or year.
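The hourly/daily rollup idea can be sketched in a few lines. This is only an illustration of the aggregation, not the actual Perl script or table layout: the field names, the `(timestamp, temperature)` record shape, and the `"YYYY-MM-DD HH:MM"` timestamp format are all assumptions, not WeatherCat's real .cat format.

```python
from collections import defaultdict
from statistics import mean

def hourly_rollup(samples):
    """Group (timestamp, value) one-minute samples into hourly
    (max, min, avg) tuples, keyed on the "YYYY-MM-DD HH" prefix."""
    buckets = defaultdict(list)
    for ts, value in samples:
        buckets[ts[:13]].append(value)  # key on "YYYY-MM-DD HH"
    return {hour: (max(vals), min(vals), round(mean(vals), 2))
            for hour, vals in buckets.items()}

# Hypothetical one-minute temperature samples
samples = [
    ("2010-12-01 09:00", 1.5),
    ("2010-12-01 09:01", 1.7),
    ("2010-12-01 09:02", 1.6),
    ("2010-12-01 10:00", 2.0),
]
stats = hourly_rollup(samples)
print(stats)
```

The daily table is the same idea with a `ts[:10]` key.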
To get the live data, I installed the awesome Hazel onto the Mac running WC. This is an early 2008 MacBook, 2.4 GHz Core 2 Duo with 4 GB of RAM. Hazel monitors the WC .cat text files for changes. When it finds a file that has changed, it triggers a Perl script that reads the data in reverse order, parsing the new records and inserting them into the one-minute samples table. A MySQL trigger then automatically calculates the max, min, and avg for the hourly and daily tables.
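The reverse-read trick above can be sketched like this. It is a minimal sketch, not the author's Perl script: it assumes one record per line starting with a sortable timestamp, comma-separated, and that we remember the last timestamp already inserted into MySQL.

```python
def new_records(lines, last_seen):
    """Walk a file's lines from newest to oldest and collect only
    records newer than `last_seen`, then return them oldest-first
    so they can be inserted in chronological order."""
    fresh = []
    for line in reversed(lines):
        ts = line.split(",", 1)[0]
        if ts <= last_seen:        # everything older is already in the DB
            break                  # so we can stop scanning immediately
        fresh.append(line)
    fresh.reverse()                # restore chronological order
    return fresh

# Hypothetical .cat-style lines: "timestamp,temperature"
log = [
    "2013-01-05 09:58,1.5",
    "2013-01-05 09:59,1.6",
    "2013-01-05 10:00,1.7",
]
print(new_records(log, "2013-01-05 09:58"))
```

Scanning backward means the cost per wake-up is proportional to the handful of new records, not the size of the whole file.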
With the data now funnelling into the database every minute, it was time to write the data viewer. This I did in PHP using mysqli, plus JavaScript, jQuery, and jQuery UI, with the Flot plotting library and a couple of other jQuery plug-ins. It is absolutely amazing what can be done with jQuery; my site just uses the basics.
Locally, the speed of this is absolutely phenomenal. I haven't tried it over the Internet. Each time you change the graph or span, or adjust the time, about 750 values out of the 2.9 million in the database are returned and plotted. For me, it's instantaneous. Since I know how much work is being done to produce those plots, I'm blown away. This is a five-year-old computer doing other work as well, so it's very impressive.
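One way to keep the returned row count that small is to pick the table to query from based on the requested span, which is presumably why the hourly and daily rollup tables exist. The table names and thresholds below are my guesses at a reasonable policy, not the site's actual logic:

```python
def table_for_span(span_minutes):
    """Pick the coarsest table that still gives a detailed plot,
    keeping the row count for any span in the hundreds:
    minutes for up to a day, hours for up to a month, days beyond."""
    if span_minutes <= 24 * 60:          # up to a day: 1-minute samples
        return ("samples_minute", span_minutes)
    if span_minutes <= 31 * 24 * 60:     # up to a month: hourly rows
        return ("samples_hour", span_minutes // 60)
    return ("samples_day", span_minutes // (24 * 60))

print(table_for_span(60))              # one hour of minute samples
print(table_for_span(7 * 24 * 60))     # a week -> 168 hourly rows
print(table_for_span(365 * 24 * 60))   # a year -> 365 daily rows
```

A month at hourly resolution comes to roughly 744 rows, which lines up with the "about 750 values" per plot mentioned above.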
That's it. I'll continue to tinker but I've met my main goal of being able to easily view and compare my historical data.
I need a drink.
![Cheers [cheers1]](https://athena.trixology.com/Smileys/default/food-smiley-004.gif)