#!/usr/bin/python -u
__version__ = '3.0.0-b1'
'''Data collector/processor for Brultech monitoring devices.

Collect data from Brultech ECM-1240, ECM-1220, and GEM power monitors.  Print
the data, save the data to a database, or upload the data to a server.

Includes support for uploading to the following services:
  * MyEnerSave    * SmartEnergyGroups   * pachube        * WattzOn
  * PlotWatt      * PeoplePower         * thingspeak     * Eragy
  * emoncms

Thanks to:
  Amit Snyderman <amit@amitsnyderman.com>
  bpwwer & tenholde from the cocoontech.com forums
  Kelvin Kakugawa
  brian jackson [http://fivejacksons.com/brian/]
  Marc MERLIN <marc_soft@merlins.org> - http://marc.merlins.org/
  Ben <ben@brultech.com>

Example usage:

btmon.py --serial --serialport=/dev/ttyUSB0 -p

Example output:

2010/06/07 21:48:37: ECM ID: 355555
2010/06/07 21:48:37: Volts:                 120.90V
2010/06/07 21:48:37: Ch1 Amps:                0.25A
2010/06/07 21:48:37: Ch2 Amps:                3.24A
2010/06/07 21:48:37: Ch1 Watts:            -124.586KWh ( 1536W)
2010/06/07 21:48:37: Ch1 Positive Watts:    210.859KWh ( 1536W)
2010/06/07 21:48:37: Ch1 Negative Watts:    335.445KWh (    0W)
2010/06/07 21:48:37: Ch2 Watts:            -503.171KWh (    0W)
2010/06/07 21:48:37: Ch2 Positive Watts:      0.028KWh (    0W)
2010/06/07 21:48:37: Ch2 Negative Watts:    503.199KWh (    0W)
2010/06/07 21:48:37: Aux1 Watts:            114.854KWh (  311W)
2010/06/07 21:48:37: Aux2 Watts:             80.328KWh (  523W)
2010/06/07 21:48:37: Aux3 Watts:             13.014KWh (   35W)
2010/06/07 21:48:37: Aux4 Watts:              4.850KWh (    0W)
2010/06/07 21:48:37: Aux5 Watts:             25.523KWh (  137W)


How to specify options:

Options can be specified via command line, in a configuration file, or by
modifying constants in this file.  Use --help to see the complete list
of command-line options.  The configuration file is INI format, with parameter
names corresponding to the command-line options.

Here are the contents of a sample configuration file that listens for data on
port 8083, uploads only channel 1 and channel 2 data from the device with
serial number 311111 to plotwatt and enersave, and saves to a MySQL database:

[general]
serial_read = false
serial_port = COM1
ip_read = true
ip_port = 8083
ip_mode = server

[mysql]
mysql_out = true
mysql_host = localhost
mysql_user = ecmuser
mysql_passwd = ecmpass
mysql_database = ecm

[plotwatt]
plotwatt_out = true
pw_map = 311111_ch1,123,311111_ch2,124
pw_house_id = 1234
pw_api_key = 12345

[enersave]
enersave_out = true
es_map = 311111_ch1,kitchen,2,311111_ch2,solar panels,1


Data Collection:

Data can be collected by serial/usb or tcp/ip.  When collecting via serial/usb,
btmon is always a 'client' to the serial/usb device - btmon blocks until
data appear.  When collecting via tcp/ip, btmon can act as either a server
or client.  When in server mode, btmon blocks until a client connects.  When
in client mode, btmon opens a connection to the server then blocks until data
have been read.
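
For example, to collect as a tcp/ip client rather than a server, the
configuration might look something like this (ip_host here is an assumed
parameter name, following the pattern of the other ip_* parameters; port
5000 is the wiz110rs default mentioned later in this file):

[general]
ip_read = true
ip_mode = client
ip_host = 192.168.1.10
ip_port = 5000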


Data Processing:

Data can be printed to standard output, saved to database, and/or sent to one
or more hosted services.  Instructions for each of the services are enumerated
in the following sections.


MySQL Database Configuration:

Specify parameters in a configuration file config.txt:

[mysql]
mysql_out = true
mysql_host     = localhost
mysql_user     = ecm
mysql_passwd   = ecm
mysql_database = ecm

Create the database and table automatically using the configure option:

btmon.py -c config.txt --mysql-config

To manually create the database, do something like this:

mysql -u root -p
mysql> create database ecm;
mysql> grant usage on *.* to ecmuser@ecmclient identified by 'ecmpass';
mysql> grant all privileges on ecm.* to ecmuser@ecmclient;

then create the table 'ecm' by doing something like this:

mysql> create table ecm
    -> (id int primary key auto_increment,
    -> time_created int,
    -> ecm_serial int,
    -> volts float,
    -> ch1_a float,
    -> ch2_a float,
    -> ch1_w float,
    -> ch2_w float,
    -> aux1_w float,
    -> aux2_w float,
    -> aux3_w float,
    -> aux4_w float,
    -> aux5_w float);

Beware that the database will grow unbounded using this configuration.  With
only basic data (volts, amps, watts) from 8 ECM-1240s, the mysql database is
about 450MB after 10 months of continuous operation.

If you do not want the database to grow, then do not create the 'id' primary
key, and make the ecm_serial the primary key without auto_increment.  In that
case the database is used for data transfer, not data capture.
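
For example, the transfer-only table would be created by something like this:

mysql> create table ecm
    -> (ecm_serial int primary key,
    -> time_created int,
    -> volts float,
    -> ch1_a float,
    -> ch2_a float,
    -> ch1_w float,
    -> ch2_w float,
    -> aux1_w float,
    -> aux2_w float,
    -> aux3_w float,
    -> aux4_w float,
    -> aux5_w float);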


Sqlite Database Configuration:

Specify the filename in a configuration file config.txt:

[sqlite]
sqlite_out = true
sqlite_file = ecm.db

Create the database and table automatically using the configure option:

btmon.py -c config.txt --sqlite-config

To manually create the database, do something like this:

sqlite3 /path/to/database.db
sqlite> create table ecm (id int primary key, time_created int, ...
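
A complete statement, mirroring the MySQL schema above, might look like this:

sqlite> create table ecm (id integer primary key autoincrement,
   ...> time_created int, ecm_serial int, volts float,
   ...> ch1_a float, ch2_a float, ch1_w float, ch2_w float,
   ...> aux1_w float, aux2_w float, aux3_w float,
   ...> aux4_w float, aux5_w float);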

With database schema 'counters' and GEM48PTBinary packets, the database will
grow by a bit less than 1K for each packet.


OpenEnergyMonitor Configuration:

1) register for an account
2) obtain the API key

Register for an account at the emoncms web site.

Obtain the API key with write access.

By default, all channels on all ECMs will be uploaded.

For example, this configuration will upload all data from all ECMs.

[openenergymonitor]
oem_out=true
oem_token=xxx


WattzOn Configuration:

1) register for an account
2) obtain an API key
3) configure devices that correspond to ECM channels

As of December 2011, it appears that the WattzOn service is no longer available.


PlotWatt Configuration:

1) register for an account
2) obtain a house ID and an API key
3) configure meters that correspond to ECM channels

First register for a plotwatt account at www.plotwatt.com.  You should receive
a house ID and an api key.  Then configure 'meters' at the plotwatt web site
that correspond to channels on each ECM.

Using curl to create 2 meters looks something like this:

curl -d "number_of_new_meters=2" http://API_KEY:@plotwatt.com/api/v2/new_meters

Using curl to list existing meters looks something like this:

curl http://API_KEY:@plotwatt.com/api/v2/list_meters

Use the meter identifiers provided by plotwatt when uploading data, with each
meter associated with a channel/aux of an ECM.  For example, to upload data
from ch1 and ch2 from ecm serial 399999 to meters 1000 and 1001, respectively,
use a configuration like this:

[plotwatt]
plotwatt_out=true
pw_house_id=XXXX
pw_api_key=XXXXXXXXXXXXXXXX
pw_map=399999_ch1,1000,399999_ch2,1001


EnerSave Configuration:

1) create an account
2) obtain a token
3) optionally indicate which channels to record and assign labels/types

Create an account at the enersave web site.  Specify 'other' as the device
type, then enter ECM-1240.

To obtain the token, enter this URL in a web browser:

https://data.myenersave.com/fetcher/bind?mfg=Brultech&model=ECM-1240

Define labels for ECM channels as a comma-delimited list of tuples.  Each
tuple contains an id, a description, and a type.  If no map is specified,
data from all channels will be uploaded, generic labels will be assigned, and
the type for each channel will default to net metering.

EnerSave defines the following types:

   0 - load
   1 - generation
   2 - net metering (default)
  10 - standalone load
  11 - standalone generation
  12 - standalone net

For example, via configuration file:

[enersave]
es_token=XXX
es_map=399999_ch1,kitchen,,399999_ch2,solar array,1,399999_aux1,,


PeoplePower Configuration:

1) create an account
2) obtain an activation key
3) register a hub to obtain an authorization token for that hub
4) indicate which ECM channels should be recorded by configuring devices
    associated with the hub

1) Find an iPhone or droid, run the PeoplePower app, register for an account.

2) When you register for an account, enter TED as the device type.  An
activation key will be sent to the email account used during registration.

3) Use the activation key to obtain a device authorization token.  Create an
XML file with the activation key, a 'hub ID', and a device type.  One way to do
this is to treat the btmon script as the hub and each channel of each ECM
as a device.  Use any number for the hub ID, and a device type 1004 (TED-5000).

For example, create the file req.xml with the following contents:

<request>
  <hubActivation>
    <activationKey>xxx:yyyyyyyyyyyyyyy</activationKey>
    <hubId>1</hubId>
    <deviceType>1004</deviceType>
  </hubActivation>
</request>

Send the file using HTTP POST to the hub activation URL:

curl -X POST -d @req.xml http://esp.peoplepowerco.com/espapi/rest/hubActivation

You should get a response with resultCode 0 and a deviceAuthToken.  The
response also contains pieces of the URL that you should use for subsequent
configuration.  For example:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<response>
  <resultCode>0</resultCode>
  <host>esp.peoplepowerco.com</host>
  <useSSL>true</useSSL>
  <port>8443</port>
  <uri>deviceio/ml</uri>
  <deviceAuthToken>XXXXX</deviceAuthToken>
</response>

which results in this URL:

https://esp.peoplepowerco.com:8443/deviceio/ml

4) Add each channel of each ECM as a new ppco device.  The btmon script
will configure the device definitions, but it requires a map of ECM channels
to device names.  This map also indicates the channel(s) from which data
should be uploaded.

For example, via configuration file:

[peoplepower]
peoplepower_out=true
pp_url=https://esp.peoplepowerco.com:8443/deviceio/ml
pp_token=XXX
pp_hub_id=YYY
pp_map=399999_ch1,999c1,399999_aux2,999a2

Additional notes for PeoplePower:

Use the device type 1005, which is a generic TED MTU.  Create an arbitrary
deviceId for each ECM channel that will be tracked.  Apparently the deviceId
must contain hex characters, so use c1 and c2 for channels 1 and 2 and use aN
for the aux.  The seq field is a monotonically increasing nonce.

<?xml version="1" encoding="UTF-8"?>
<h2s seq="1" hubId="1" ver="2">
  <add deviceId="3XXXXXc1" deviceType="1005" />
  <add deviceId="3XXXXXc2" deviceType="1005" />
  <add deviceId="3XXXXXa1" deviceType="1005" />
  <add deviceId="3XXXXXa2" deviceType="1005" />
  <add deviceId="3XXXXXa3" deviceType="1005" />
</h2s>

Send the file to the URL received in the activation response.

curl -H "Content-Type: text/xml" -H "PPCAuthorization: esp token=TOKEN" -d @add.xml https://esp.peoplepowerco.com:8443/deviceio/ml

To get a list of devices that ppco recognizes:
  curl https://esp.peoplepowerco.com/espapi/rest/deviceTypes


Eragy Configuration:

1) register power sensor at the eragy web site
2) obtain a token
3) create an account

The Eragy web site only knows about TED, Blueline, and eGauge devices.  Register
as a TED5000 power sensor.  Eragy will provide a URL and token.  Use curl to
enter the registration for each ECM.  Create a file req.xml with the request:

<ted5000Activation>
  <Gateway>XXX</Gateway>
  <Unique>TOKEN</Unique>
</ted5000Activation>

where XXX is an arbitrary gateway ID and TOKEN is the token issued by Eragy.
If you have only one ECM, use the ECM serial number as the gateway ID.  If
you have multiple ECMs, use one of the ECM serial numbers or just pick a number.

Send the request using curl:

curl -X POST -d @req.xml http://d.myeragy.com/energyremote.aspx

The server will respond with something like this:

<ted5000ActivationResponse>
  <PostServer>d.myeragy.com</PostServer>
  <UseSSL>false</UseSSL>
  <PostPort>80</PostPort>
  <PostURL>/energyremote.aspx</PostURL>
  <SSLKey></SSLKey>
  <AuthToken>TOKEN</AuthToken>
  <PostRate>1</PostRate>
</ted5000ActivationResponse>

At the eragy web site, click the 'find my sensor' button.  Eragy assumes that
the energy sensor will immediately start sending data to eragy, so start
running btmon.

On the eragy web site, continue and create an account.  Eragy will email you
an account password, which you must enter to complete the account setup.  Then
configure the account with a name, timezone, and password.

The eragy configuration would be:

[eragy]
eragy_out=true
eg_token=TOKEN
eg_gateway_id=XXX


Smart Energy Groups Configuration:

1) create an account
2) create a site
3) obtain the site token
4) optionally indicate which ECM channels should be recorded and assign labels

Create an account at the smart energy groups web site.

Create a site at the smart energy groups web site.

Obtain the site token at the smart energy groups web site.

Create devices on the smart energy groups web site.  Create one device per ECM.
For each device create 14 streams - one power stream and one energy stream for
each ECM channel.  Define a node name for each device based on the following.

By default, data from all channels on all ECMs will be uploaded.  The node
name is the obfuscated ECM serial number, for example XXX123 for the serial
number 355123.  The stream name is p_* or e_* for each channel, for power or
energy, respectively - for example, p_ch1, e_ch1, p_aux1, e_aux1.

To upload only a portion of the data, or to use names other than the defaults,
specify a map via command line or configuration file.  It is easiest to use the
default node names then add longer labels at the Smart Energy Groups web site.

For example, here is a configuration that uploads data only from ch1 and aux2
from ECM 399999, using node names p_ch1, e_ch1, p_lighting, and e_lighting.

[smartenergygroups]
smartenergygroups_out=true
seg_token=XXX
seg_map=399999_ch1,,399999_aux2,lighting


ThingSpeak Configuration:

1) create an account
2) create a thingspeak channel for each ECM
3) create a field for each ECM channel

Create an account at the ThingSpeak web site.

Create a ThingSpeak channel for each ECM.  Obtain the token (write api key) for
each channel.

Create a field for each ECM channel from which you will upload data.

By default, data from all channels on all ECMs will be uploaded.  The channel
ID and token must be specified for each ECM.  By default, the ECM channels
will be uploaded to fields 1-7.

For example, this configuration will upload all data from ECM with serial
399999 to thingspeak channel with token 12345 and from ECM with serial
399998 to thingspeak channel with token 12348.

[thingspeak]
thingspeak_out=true
ts_tokens=399999,12345,399998,12348

This configuration will upload only ch1 from 399999 to field 3 and aux5 from
399998 to field 8:

[thingspeak]
thingspeak_out=true
ts_tokens=399999,12345,399998,12348
ts_fields=399999_ch1,3,399998_aux5,8


Pachube Configuration:

1) create an account
2) obtain API key
3) create a feed

Create an account at the Pachube web site.

Obtain the API key from the Pachube web site.

Create a feed at the Pachube web site, or using curl as described at pachube.

By default, data from every channel from every ECM will be uploaded to a single
pachube feed.

[pachube]
pachube_out=true
pbe_token=XXXXXX
pbe_feed=3000


Changelog:

- 3.0.0  02oct12 mwall
* initial release, based on ecmread 2.4.5

'''
__author__ = 'Brian Jackson; Kelvin Kakugawa; Marc MERLIN; ben; mwall'

# processing of data
#
# Be sure to get the combination of buffer size, sampling time, and upload
# period correct.  Packet processing happens after each read.  So if the device
# is set to emit packets every 5 seconds, then processing will happen every 5
# seconds.  If processing takes more than 5 seconds, one or more reads may be
# missed (depending on lower-level buffering).  Note that the opportunity to
# process happens after each read, but actual processing does not always happen
# after each read.  Some processors will process after each read, others
# wait for data to collect before processing it.
#
# Ensure that no processor will take too long.  If a processor takes longer
# than a sampling period, data from the next sample will probably be lost.  If
# a processor waits longer than the sampling period times the buffer size, then
# some samples will not be uploaded.
#
# For example, a safe configuration would be an ECM that emits data every 10
# seconds, a mysql processor that saves to a database on the localhost every
# 60 seconds, and a buffer size of at least 60/10 = 6.
#
# If network speeds are slow and/or hosted services are slow or unreliable,
# run a second instance of this program.  Run one instance to collect data,
# and a second instance to upload data.
#
# serial numbers
#
# For the green-eye, the serial number is 8 characters long - a 5 character
# serial plus a 3 character unit id.
#
# For the ecm-1240, the serial number is 6 characters long - a 5 character
# serial plus a 1 character unit id.
#
# However, the GEM unit id may be more than one character, so serials that
# a GEM generates in ECM packets may have to be longer than 6 characters.
#
# packets
#
# There are three kinds of packets: raw, compiled, and calculated.
# Raw packets are what we get from the device.  They may be binary,
# they may be ascii, they may be ECM-1240 format, they may be GEM format,
# or they may be any of many other formats.
#
# Compiled packets have been augmented with labels such as 'ch1_aws'.  The
# labels are associated with the raw data but no calculations have been done.
# Compiling a packet requires only a raw packet.
#
# Calculated packets contain labels with derivative values such as 'ch1_w'.
# Calculating a packet requires two compiled packets, since many of the
# calculated values are differences.
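#
# as a sketch, data flows through the packet classes defined below like this
# (prev_cpkt being the previous compiled packet from the same device):
#
#   raw_packets = PACKET_FORMAT.read(collector)     # list of raw packets
#   cpkt = PACKET_FORMAT.compile(raw_packets[0])    # add labels, e.g. 'ch1_aws'
#   pkt = PACKET_FORMAT.calculate(cpkt, prev_cpkt)  # derive 'ch1_w', 'ch1_wh'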
#
# calculation of watts and watt-hours
#
# The brultech devices record only a cumulative measure - the total number of
# watt-seconds, which is a measure of energy.  From this there are two measures
# typically desired - instantaneous power use (e.g. in Watts), and cumulative
# energy use (e.g. in Kilowatt-Hours).
#
# Most services accept instantaneous measures of power.
#
# Some services require a cumulative measure of energy.  Others require a
# differential measure of energy - how much energy consumed since the last
# reading.  This can be problematic when uploads fail, power goes out, or
# other situations arise where data cannot be uploaded.  The cumulative
# reading is less error-prone in these situations.
#
# what this implementation uses:
#   seconds   = sec_counter1 - sec_counter0
#   w1        = (abs_ws1 - abs_ws0) / seconds   multiplied by -1 if pos_w1 == 0
#   pos_w1    = (pol_ws1 - pol_ws0) / seconds
#   neg_w1    = w1 - pos_w1
#   pos_wh1   = pol_ws1 / 3600
#   neg_wh1   = (abs_ws1 - pol_ws1) / 3600
#   wh1       = pos_wh1 - neg_wh1               same as (2*pol_ws1 - abs_ws1) / 3600
#   delta_wh1 = wh1 - wh0
#
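# a worked example, with hypothetical counter readings taken 10 seconds apart:
#   abs_ws0 = 1000000, pol_ws0 = 800000
#   abs_ws1 = 1012000, pol_ws1 = 808000
#   w1     = (1012000 - 1000000) / 10     = 1200W
#   pos_w1 = (808000 - 800000) / 10       =  800W
#   neg_w1 = 1200 - 800                   =  400W
#   wh1    = (2*808000 - 1012000) / 3600  =  167.8Wh
#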
# TODO: read multiple virtual ecm1240 packets, not just first one
# TODO: btsniff - dump packets
# TODO: support polling mode as well as ecm-push mode
# TODO: figure out how to deal robustly with counter resets and overflows

MINUTE = 60
SpH = 3600.0  # seconds per hour

# if set to 1, print out what would be uploaded but do not do the upload.
SKIP_UPLOAD = 0

# if set to 1, obfuscate any serial number before uploading
OBFUSCATE_SERIALS = 1

# packet formats
PF_ECM1220BIN = 'ecm1220bin'
PF_ECM1240BIN = 'ecm1240bin'
PF_GEM48PTBIN = 'gem48ptbin'
PF_GEM48PBIN = 'gem48pbin'
PACKET_FORMATS = [PF_ECM1220BIN, PF_ECM1240BIN, PF_GEM48PTBIN, PF_GEM48PBIN]
DEFAULT_PACKET_FORMAT = PF_ECM1240BIN

# the database schema
DB_SCHEMA_COUNTERS = 'counters'
DB_SCHEMA_ECMREAD = 'ecmread-basic'
DB_SCHEMA_ECMREAD_EXTENDED = 'ecmread-extended'
DB_SCHEMAS = [DB_SCHEMA_COUNTERS, DB_SCHEMA_ECMREAD, DB_SCHEMA_ECMREAD_EXTENDED]
DEFAULT_DB_SCHEMA = DB_SCHEMA_COUNTERS

# channel filters
FILTER_PE_LABELS = 'pelabels'
FILTER_POWER = 'power'
FILTER_ENERGY = 'energy'
FILTER_PULSE = 'pulse'
FILTER_SENSOR = 'sensor'
FILTER_DB_SCHEMA_COUNTERS = DB_SCHEMA_COUNTERS
FILTER_DB_SCHEMA_ECMREAD = DB_SCHEMA_ECMREAD
FILTER_DB_SCHEMA_ECMREAD_EXTENDED = DB_SCHEMA_ECMREAD_EXTENDED

# size of the rolling buffer into which data are cached
# should be at least max(upload_period) / sample_period
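# e.g. a 15-minute upload period with 10-second samples needs at least 90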
DEFAULT_BUFFER_SIZE = 600

# how long to wait before considering an upload to have failed, in seconds
DEFAULT_UPLOAD_TIMEOUT = 15

# how often to upload data, in seconds
# this may be overridden by specific services
DEFAULT_UPLOAD_PERIOD = 15*MINUTE

# serial settings
# the com/serial port to which device is connected (COM4, /dev/ttyS01, etc)
SERIAL_PORT = "/dev/ttyUSB0"
SERIAL_BAUD = 19200

# ethernet settings
# the etherbee defaults to pushing data to port 8083
# the wiz110rs defaults to listening on port 5000
IP_HOST = ''      # for client use the hostname/address of the data server
IP_PORT = 8083    # for client use the port of the data server
IP_TIMEOUT = 60
IP_DEFAULT_MODE = 'server'

# database defaults
DB_HOST          = 'localhost'
DB_USER          = ''
DB_PASSWD        = ''
DB_DATABASE      = 'ecm'
DB_TABLE         = 'ecmdata'
DB_INSERT_PERIOD = MINUTE     # how often to record to database, in seconds
DB_POLL_INTERVAL = 30         # how often to poll the database, in seconds
DB_FILENAME      = 'ecm.db'   # filename for sqlite databases

# rrd defaults
# the rrd files should be updated when new output is available from the device
RRD_DIR = '/var/btmon/rrd'
RRD_STEP = 30 # how often to update the rrd files, in seconds
RRD_HEARTBEAT = 60 # seconds, typically twice the step
# 30s, 3m, 30m, 1h
# 4d at 30s, 36d at 3m, 365d at 30m, 730d at 1h
# 1534876 bytes per rrd file - 162000K total for gem48
RRD_STEPS = [1,6,60,120]
RRD_RESOLUTIONS = [11520, 17280, 17520, 17520]
# 71644 bytes per rrd file - 7776K total for gem48
#RRD_STEPS = [1,6,24,288]
#RRD_RESOLUTIONS = [600,700,775,797]
# 1652596 bytes per rrd file - 174528K total for gem48
#RRD_STEPS = [1,6,24,288]
#RRD_RESOLUTIONS = [17280, 17520, 32850, 1095]

# WattzOn defaults
# the map is a comma-delimited list of channel,meter pairs.  for example:
#   311111_ch1,living room,311112_ch1,parlor,311112_aux4,kitchen
WATTZON_API_URL       = 'http://www.wattzon.com/api/2009-01-27/3'
WATTZON_UPLOAD_PERIOD = MINUTE
WATTZON_TIMEOUT       = 15 # seconds
WATTZON_MAP     = ''
WATTZON_API_KEY = 'apw6v977dl204wrcojdoyyykr'
WATTZON_USER    = ''
WATTZON_PASS    = ''

# PlotWatt defaults
#   https://plotwatt.com/docs/api
#   Recommended upload period is one minute to a few minutes.  Recommended
#   sampling as often as possible, no faster than once per second.
# the map is a comma-delimited list of channel,meter pairs.  for example:
#   311111_ch1,1234,311112_ch1,1235,311112_aux4,1236
PLOTWATT_BASE_URL      = 'http://plotwatt.com'
PLOTWATT_UPLOAD_URL    = '/api/v2/push_readings'
PLOTWATT_UPLOAD_PERIOD = MINUTE
PLOTWATT_TIMEOUT       = 15 # seconds
PLOTWATT_MAP           = ''
PLOTWATT_HOUSE_ID      = ''
PLOTWATT_API_KEY       = ''

# EnerSave defaults
#   Minimum upload interval is 60 seconds.
#   Recommended sampling interval is 2 to 30 seconds.
# the map is a comma-delimited list of channel,description,type tuples
#   311111_ch1,living room,2,311112_ch2,solar,1,311112_aux4,kitchen,2
ES_URL           = 'http://data.myenersave.com/fetcher/data'
ES_UPLOAD_PERIOD = MINUTE
ES_TIMEOUT       = 60 # seconds
ES_TOKEN         = ''
ES_MAP           = ''
ES_DEFAULT_TYPE  = 2

# PeoplePower defaults
#   http://developer.peoplepowerco.com/docs
#   Recommended upload period is 15 minutes.
# the map is a comma-delimited list of channel,meter pairs.  for example:
#   311111_ch1,1111c1,311112_ch1,1112c1,311112_aux4,1112a4
PPCO_URL            = 'https://esp.peoplepowerco.com:8443/deviceio/ml'
#PPCO_UPLOAD_PERIOD  = 15 * MINUTE
PPCO_UPLOAD_PERIOD  = MINUTE
PPCO_TIMEOUT        = 15 # seconds
PPCO_TOKEN          = ''
PPCO_HUBID          = 1
PPCO_MAP            = ''
PPCO_FIRST_NONCE    = 1
PPCO_DEVICE_TYPE    = 1005

# eragy defaults
ERAGY_URL           = 'http://d.myeragy.com/energyremote.aspx'
ERAGY_UPLOAD_PERIOD = 15 * MINUTE
ERAGY_TIMEOUT       = 15 # seconds
ERAGY_GATEWAY_ID    = ''
ERAGY_TOKEN         = ''

# smart energy groups defaults
# the map is a comma-delimited list of channel,meter pairs.  for example:
#   311111_ch1,living room,311112_ch1,parlor,311112_aux4,kitchen
SEG_URL           = 'http://api.smartenergygroups.com/api_sites/stream'
SEG_UPLOAD_PERIOD = MINUTE
SEG_TIMEOUT       = 15 # seconds
SEG_TOKEN         = ''
SEG_MAP           = ''

# thingspeak defaults
#   Uploads are limited to no more than every 15 seconds per channel.
TS_URL           = 'http://api.thingspeak.com/update'
TS_UPLOAD_PERIOD = MINUTE
TS_TIMEOUT       = 15 # seconds
TS_TOKENS        = ''
TS_FIELDS        = ''

# pachube defaults
PBE_URL           = 'http://api.pachube.com/v2/feeds'
PBE_UPLOAD_PERIOD = MINUTE
PBE_TIMEOUT       = 15 # seconds
PBE_TOKEN         = ''
PBE_FEED          = ''

# open energy monitor emoncms defaults
OEM_URL           = 'https://localhost/emoncms3/api/post'
OEM_UPLOAD_PERIOD = MINUTE
OEM_TIMEOUT       = 15 # seconds
OEM_TOKEN         = ''


import base64
import bisect
import new
import optparse
import socket
import os
import sys
import time
import traceback
import urllib
import urllib2

import warnings
warnings.filterwarnings('ignore', category=DeprecationWarning) # MySQLdb in 2.6

# External (Optional) Dependencies
try:
    import serial
except Exception, e:
    serial = None

try:
    import MySQLdb
except Exception, e:
    MySQLdb = None

try:
    from sqlite3 import dbapi2 as sqlite
except Exception, e:
    sqlite = None

try:
    import rrdtool
except Exception, e:
    rrdtool = None

try:
    import cjson as json
    # XXX: maintain compatibility w/ json module
    setattr(json, 'dumps', json.encode)
    setattr(json, 'loads', json.decode)
except Exception, e:
    try:
        import simplejson as json
    except Exception, e:
        import json

try:
    import ConfigParser
except Exception, e:
    ConfigParser = None


class CounterResetError(Exception):
    def __init__(self, msg):
        self.msg = msg
    def __str__(self):
        return repr(self.msg)

class ReadError(Exception):
    def __init__(self, msg):
        self.msg = msg
    def __str__(self):
        return repr(self.msg)

# logging and error reporting
#
# note that setting the log level to debug will affect the application
# behavior, especially when sampling the serial line, as it changes the
# timing of read operations.
LOG_ERROR = 0
LOG_WARN  = 1
LOG_INFO  = 2
LOG_DEBUG = 3
LOGLEVEL  = 2

def dbgmsg(msg):
    if LOGLEVEL >= LOG_DEBUG:
        logmsg(msg)

def infmsg(msg):
    if LOGLEVEL >= LOG_INFO:
        logmsg(msg)

def wrnmsg(msg):
    if LOGLEVEL >= LOG_WARN:
        logmsg(msg)

def errmsg(msg):
    if LOGLEVEL >= LOG_ERROR:
        logmsg(msg)

def logmsg(msg):
    ts = fmttime(time.localtime())
    print "%s %s" % (ts, msg)

def fmttime(tm):
    return time.strftime("%Y/%m/%d %H:%M:%S", tm)

# Helper Functions

def getgmtime():
    return int(time.time())

def cleanvalue(s):
    '''ensure that values read from configuration file are sane'''
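    # e.g. cleanvalue('true\n') -> True, cleanvalue('8083') -> '8083'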
    s = s.replace('\n', '')    # we never want newlines
    s = s.replace('\r', '')    # or carriage returns
    if s.lower() == 'false':
        s = False
    elif s.lower() == 'true':
        s = True
    return s

def pairs2dict(s):
    '''convert comma-delimited name,value pairs to a dictionary'''
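    # e.g. pairs2dict('399999_ch1,1000,399999_ch2,1001')
    #   returns {'399999_ch1': '1000', '399999_ch2': '1001'}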
    items = s.split(',')
    m = {}
    for k, v in zip(items[::2], items[1::2]):
        m[k] = v
    return m

def obfuscate_serial(sn):
    '''obfuscate a serial number - expose only the last 3 digits'''
    if OBFUSCATE_SERIALS:
        n = len(sn)
        s = 'XXX%s' % sn[n-3:n]
    else:
        s = sn
    return s

def compare_packet_times(a, b):
    return cmp(a['time_created'], b['time_created'])


# Packet classes

class BasePacket(object):
    def __init__(self):
        self.SEC_COUNTER_MAX   = 16777216
        self.START_HEADER0     = 254
        self.START_HEADER1     = 255
        self.END_HEADER0       = 255
        self.END_HEADER1       = 254
        self.DATA_BYTES_LENGTH = 0 # must be defined by derived class
        self.PACKET_ID         = 0 # must be defined by derived class

    def _convert(self, bytes):
        '''interpret a sequence of bytes as a little-endian unsigned integer'''
        return reduce(lambda x, y: x + y[0] * (256 ** y[1]),
                      zip(bytes, xrange(len(bytes))), 0)

    def _serialraw(self, packet):
        '''extract the serial number from a raw packet'''
        return '0000'

    def _getserial(self, packet):
        '''get the serial number from a compiled packet'''
        return self._fmtserial(packet['unit_id'], packet['ser_no'])

    def _fmtserial(self, id, sn):
        return '%03d%05d' % (id, sn)

    def _calculate_checksum(self, packet, id):
        '''calculate the packet checksum'''
        checksum = self.START_HEADER0
        checksum += self.START_HEADER1
        checksum += id
        checksum += sum(packet)
        checksum += self.END_HEADER0
        checksum += self.END_HEADER1
        return checksum & 0xff

    def _read2(self, collector, pktlen, pktid):
        m = self.DATA_BYTES_LENGTH + 6
        n = 1024
        data = collector.readbytes(1)
        if not data:
            raise ReadError('no data')
        data = data + collector.readbytes(n)
        if len(data) < m:
            raise ReadError('not enough data: %d of %d bytes' % (len(data), m))

        packets = []
        packet = [ord(c) for c in data]
        i = 0
        while i + m <= len(packet):
            if packet[i] == self.START_HEADER0 and packet[i+1] == self.START_HEADER1:
                if packet[i+2] == pktid:
                    if packet[i+3+self.DATA_BYTES_LENGTH] == self.END_HEADER0 and packet[i+4+self.DATA_BYTES_LENGTH] == self.END_HEADER1:
                        pkt = packet[i+3:i+3+self.DATA_BYTES_LENGTH]
                        cs = self._calculate_checksum(pkt, pktid)
                        if packet[i+5+self.DATA_BYTES_LENGTH] == cs:
                            packets.append(pkt)
                        else:
                            dbgmsg('bad checksum')
                    else:
                        dbgmsg('no match for end headers')
                else:
                    dbgmsg('wrong packet id')
            else:
                dbgmsg('no match for start headers')
            i += 1

        dbgmsg('found %d packets' % len(packets))
        return packets

    def _read1(self, collector, pktlen, pktid):
        data = collector.readbytes(1)
        byte = ord(data)
        if byte != self.START_HEADER0:
            raise ReadError("expected START_HEADER0 %s, got %s" %
                            (hex(self.START_HEADER0), hex(byte)))

        data = collector.readbytes(1)
        byte = ord(data)
        if byte != self.START_HEADER1:
            raise ReadError("expected START_HEADER1 %s, got %s" %
                            (hex(self.START_HEADER1), hex(byte)))

        data = collector.readbytes(1)
        byte = ord(data)
        if byte != pktid:
            raise ReadError("expected PACKET_ID %s, got %s" %
                            (hex(pktid), hex(byte)))

        packet = ''
        while len(packet) < pktlen:
            data = collector.readbytes(pktlen-len(packet))
            if not data: # No data left
                raise ReadError('no data after %d bytes' % len(packet))
            packet += data
 
        if len(packet) < pktlen:
            raise ReadError("incomplete packet: expected %d bytes, got %d" %
                            (pktlen, len(packet)))

        data = collector.readbytes(1)
        byte = ord(data)
        if byte != self.END_HEADER0:
            raise ReadError("expected END_HEADER0 %s, got %s" %
                            (hex(self.END_HEADER0), hex(byte)))

        data = collector.readbytes(1)
        byte = ord(data)
        if byte != self.END_HEADER1:
            raise ReadError("expected END_HEADER1 %s, got %s" %
                            (hex(self.END_HEADER1), hex(byte)))

        pkt = [ord(c) for c in packet]

        # if the checksum is incorrect, ignore the packet
        checksum = self._calculate_checksum(pkt, pktid)
        data = collector.readbytes(1)
        byte = ord(data)
        if byte != checksum:
            raise ReadError("bad checksum for %s: expected %s, got %s" %
                            (self._serialraw(packet),hex(checksum),hex(byte)))

        return [pkt]

    def _calc_pe(self, tag, ds, ret, prev):
        '''calculate the power and energy values for a packet'''

        ret[tag+'_w'] = (ret[tag+'_aws'] - prev[tag+'_aws']) / ds
        ret[tag+'_pw'] = (ret[tag+'_pws'] - prev[tag+'_pws']) / ds
        ret[tag+'_nw'] = ret[tag+'_w'] - ret[tag+'_pw']

        # The polarized count goes up only if the sign is positive, so use the
        # value of polarized count to determine the sign of overall watts
        if (ret[tag+'_pw'] == 0):
            ret[tag+'_w'] *= -1 

        # Absolute watt counter increases no matter which way the current goes
        # Polarized watt counter only increases if the current is positive
        # Every polarized count registers as an absolute count
        ret[tag+'_pwh'] = ret[tag+'_pws'] / SpH
        ret[tag+'_nwh'] = (ret[tag+'_aws'] - ret[tag+'_pws']) / SpH
        ret[tag+'_wh'] = ret[tag+'_pwh'] - ret[tag+'_nwh']
#        ret[tag+'_wh'] = (2*ret[tag+'_pws'] - ret[tag+'_aws']) / SpH

        # calculate the watt-hour delta
        prev_dwh = (2*prev[tag+'_pws'] - prev[tag+'_aws']) / SpH
        ret[tag+'_dwh'] = ret[tag+'_wh'] - prev_dwh

    def channels(self, filter):
        '''return a list of data sources for this packet type'''
        return []

    def read(self, collector):
        '''read data from collector, return a raw packet'''
        return self._read1(collector, self.DATA_BYTES_LENGTH, self.PACKET_ID)

    def compile(self, packet):
        '''convert a raw packet into a compiled packet'''
        pass

    def calculate(self, newer, older):
        '''calculate difference between two packets'''
        pass

    def printPacket(self, packet):
        pass

class ECMBinaryPacket(BasePacket):
    def __init__(self):
        BasePacket.__init__(self)

    def _getresetcounter(self, byte):
        '''extract the reset counter from a byte'''
        return byte & 0b00000111      # same as 0x07

    def _serialraw(self, packet):
        sn1 = ord(packet[26:27])
        sn2 = ord(packet[27:28]) * 256
        id1 = ord(packet[29:30])
        return self._fmtserial(id1, sn1+sn2)

    def _fmtserial(self, id, sn):
        '''ECM serial numbers are 6 characters - unit id then serial'''
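        # e.g. unit id 3, serial number 55555 -> '355555'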
        s = '%d' % id
        return s[-1:] + '%05d' % sn

class ECM1220BinaryPacket(ECMBinaryPacket):
    def __init__(self):
        ECMBinaryPacket.__init__(self)
        self.PACKET_ID         = 1
        self.DATA_BYTES_LENGTH = 37  # does not include the start/end headers
        self.NUM_CHAN = 2

    def channels(self, filter):
        c = []
        if filter == FILTER_PE_LABELS:
            c = ['ch1', 'ch2']
        elif filter == FILTER_POWER:
            c = ['ch1_w', 'ch2_w']
        elif filter == FILTER_ENERGY:
            c = ['ch1_wh', 'ch2_wh']
        elif filter == FILTER_DB_SCHEMA_COUNTERS:
            c = ['volts', 'ch1_a', 'ch2_a', 'ch1_aws', 'ch2_aws', 'ch1_pws', 'ch2_pws']
        return c

    def compile(self, rpkt):
        '''compile a raw packet into a compiled packet'''
        cpkt = {}

        # Voltage Data (2 bytes)
        cpkt['volts'] = 0.1 * self._convert(rpkt[1::-1])

        # CH1-2 Absolute Watt-Second Counter (5 bytes each)
        cpkt['ch1_aws'] = self._convert(rpkt[2:7])
        cpkt['ch2_aws'] = self._convert(rpkt[7:12])

        # CH1-2 Polarized Watt-Second Counter (5 bytes each)
        cpkt['ch1_pws'] = self._convert(rpkt[12:17])
        cpkt['ch2_pws'] = self._convert(rpkt[17:22])

        # Reserved (4 bytes)

        # Device Serial Number (2 bytes)
        cpkt['ser_no'] = self._convert(rpkt[26:28])

        # Reset and Polarity Information (1 byte)
        cpkt['flag'] = self._convert(rpkt[28:29])

        # Device Information (1 byte)
        cpkt['unit_id'] = self._convert(rpkt[29:30])

        # CH1-2 Current (2 bytes each)
        cpkt['ch1_a'] = 0.01 * self._convert(rpkt[30:32])
        cpkt['ch2_a'] = 0.01 * self._convert(rpkt[32:34])

        # Seconds (3 bytes)
        cpkt['secs'] = self._convert(rpkt[34:37])

        # Use the current time as the timestamp
        cpkt['time_created'] = getgmtime()

        # Add a formatted serial number
        cpkt['serial'] = self._getserial(cpkt)

        return cpkt

    def calculate(self, now, prev):
        '''calculate watts and watt-hours from watt-second counters'''

        # if reset counter has changed since last packet, skip the calculation
        c0 = self._getresetcounter(prev['flag'])
        c1 = self._getresetcounter(now['flag'])
        if c1 != c0:
            raise CounterResetError("old: %d new: %d" % (c0, c1))
        
        ret = now

        # handle seconds counter overflow
        if ret['secs'] < prev['secs']:
            ret['secs'] += self.SEC_COUNTER_MAX
        ds = float(ret['secs'] - prev['secs'])

        # FIXME: detect then handle counter overflow
        # it is easy enough to detect, but not obvious what to do about it

        # CH1/2 Watts
        self._calc_pe('ch1', ds, ret, prev)
        self._calc_pe('ch2', ds, ret, prev)

        return ret

    def printPacket(self, p):
        ts = fmttime(time.localtime(p['time_created']))

        print ts+": Serial: %s" % p['serial']
        print ts+": Counter: %d" % self._getresetcounter(p['flag'])
        print ts+": Voltage:           %9.2fV" % p['volts']
        for x in range(1,self.NUM_CHAN+1):
            print ts+": Ch%d Current:        %9.2fA" % (x, p['ch%d_a' % x])
        for x in range(1,self.NUM_CHAN+1):
            print ts+": Ch%d Watts:          % 13.6fKWh (% 5dW)" % (x, p['ch%d_wh' % x]/1000, p['ch%d_w' % x])
            print ts+": Ch%d Positive Watts: % 13.6fKWh (% 5dW)" % (x, p['ch%d_pwh' % x]/1000, p['ch%d_pw' % x])
            print ts+": Ch%d Negative Watts: % 13.6fKWh (% 5dW)" % (x, p['ch%d_nwh' % x]/1000, p['ch%d_nw' % x])


class ECM1240BinaryPacket(ECM1220BinaryPacket):
    def __init__(self):
        ECM1220BinaryPacket.__init__(self)
        self.PACKET_ID         = 3
        self.DATA_BYTES_LENGTH = 59  # does not include the start/end headers
        self.NUM_CHAN = 2
        self.NUM_AUX = 5

    def channels(self, filter):
        c = []
        if filter == FILTER_PE_LABELS:
            c = ['ch1', 'ch2', 'aux1', 'aux2', 'aux3', 'aux4', 'aux5']
        elif filter == FILTER_POWER:
            c = ['ch1_w', 'ch2_w', 'aux1_w', 'aux2_w', 'aux3_w', 'aux4_w', 'aux5_w']
        elif filter == FILTER_ENERGY:
            c = ['ch1_wh', 'ch2_wh', 'aux1_wh', 'aux2_wh', 'aux3_wh', 'aux4_wh', 'aux5_wh']
        elif filter == FILTER_DB_SCHEMA_ECMREAD:
            c = ['volts', 'ch1_a', 'ch2_a', 'ch1_w', 'ch2_w', 'aux1_w', 'aux2_w', 'aux3_w', 'aux4_w', 'aux5_w']
        elif filter == FILTER_DB_SCHEMA_ECMREAD_EXTENDED:
            c = ['volts', 'ch1_a', 'ch2_a', 'ch1_w', 'ch2_w', 'aux1_w', 'aux2_w', 'aux3_w', 'aux4_w', 'aux5_w', 'ch1_wh', 'ch2_wh', 'aux1_wh', 'aux2_wh', 'aux3_wh', 'aux4_wh', 'aux5_wh', 'ch1_whd', 'ch2_whd', 'aux1_whd', 'aux2_whd', 'aux3_whd', 'aux4_whd', 'aux5_whd', 'ch1_pw', 'ch1_nw', 'ch2_pw', 'ch2_nw', 'ch1_pwh', 'ch1_nwh', 'ch2_pwh', 'ch2_nwh']
        elif filter == FILTER_DB_SCHEMA_COUNTERS:
            c = ['volts', 'ch1_a', 'ch2_a', 'ch1_aws', 'ch2_aws', 'aux1_aws', 'aux2_aws', 'aux3_aws', 'aux4_aws', 'aux5_aws', 'ch1_pws', 'ch2_pws', 'aux1_pws', 'aux2_pws', 'aux3_pws', 'aux4_pws', 'aux5_pws', 'aux5_volts']
        return c

    def compile(self, rpkt):
        cpkt = ECM1220BinaryPacket.compile(self, rpkt)

        # AUX1-5 Watt-Second Counter (4 bytes each)
        cpkt['aux1_ws'] = self._convert(rpkt[37:41])
        cpkt['aux2_ws'] = self._convert(rpkt[41:45])
        cpkt['aux3_ws'] = self._convert(rpkt[45:49])
        cpkt['aux4_ws'] = self._convert(rpkt[49:53])
        cpkt['aux5_ws'] = self._convert(rpkt[53:57])

        # DC voltage on AUX5 (2 bytes)
        cpkt['aux5_volts'] = self._convert(rpkt[57:59])

        return cpkt

    def calculate(self, now, prev):
        ret = ECM1220BinaryPacket.calculate(self, now, prev)

        ds = float(ret['secs'] - prev['secs']) # overflow already handled

        # AUX1-5 Watts
        ret['aux1_w'] = (ret['aux1_ws'] - prev['aux1_ws']) / ds
        ret['aux2_w'] = (ret['aux2_ws'] - prev['aux2_ws']) / ds
        ret['aux3_w'] = (ret['aux3_ws'] - prev['aux3_ws']) / ds
        ret['aux4_w'] = (ret['aux4_ws'] - prev['aux4_ws']) / ds
        ret['aux5_w'] = (ret['aux5_ws'] - prev['aux5_ws']) / ds

        # AUX1-5 Watt-hours
        ret['aux1_wh'] = ret['aux1_ws'] / SpH
        ret['aux2_wh'] = ret['aux2_ws'] / SpH
        ret['aux3_wh'] = ret['aux3_ws'] / SpH
        ret['aux4_wh'] = ret['aux4_ws'] / SpH
        ret['aux5_wh'] = ret['aux5_ws'] / SpH

        # AUX1-5 Watt-hour delta
        ret['aux1_dwh'] = ret['aux1_wh'] - prev['aux1_ws'] / SpH
        ret['aux2_dwh'] = ret['aux2_wh'] - prev['aux2_ws'] / SpH
        ret['aux3_dwh'] = ret['aux3_wh'] - prev['aux3_ws'] / SpH
        ret['aux4_dwh'] = ret['aux4_wh'] - prev['aux4_ws'] / SpH
        ret['aux5_dwh'] = ret['aux5_wh'] - prev['aux5_ws'] / SpH

        return ret

    def printPacket(self, p):
        ts = fmttime(time.localtime(p['time_created']))

        print ts+": Serial: %s" % p['serial']
        print ts+": Counter: %d" % self._getresetcounter(p['flag'])
        print ts+": Voltage:            %9.2fV" % p['volts']
        for x in range(1,self.NUM_CHAN+1):
            print ts+": Ch%d Current:        %9.2fA" % (x, p['ch%d_a' % x])
        for x in range(1,self.NUM_CHAN+1):
            print ts+": Ch%d Watts:          % 13.6fKWh (% 5dW)" % (x, p['ch%d_wh' % x]/1000, p['ch%d_w' % x])
            print ts+": Ch%d Positive Watts: % 13.6fKWh (% 5dW)" % (x, p['ch%d_pwh' % x]/1000, p['ch%d_pw' % x])
            print ts+": Ch%d Negative Watts: % 13.6fKWh (% 5dW)" % (x, p['ch%d_nwh' % x]/1000, p['ch%d_nw' % x])
        for x in range(1,self.NUM_AUX+1):
            print ts+": Aux%d Watts:         % 13.6fKWh (% 5dW)" % (x, p['aux%d_wh' % x]/1000, p['aux%d_w' % x])


# GEM binary packet with 48 channels, polarization
class GEM48PBinaryPacket(BasePacket):
    def __init__(self):
        BasePacket.__init__(self)
        self.PACKET_ID = 5
        self.DATA_BYTES_LENGTH = 613 # does not include the start/end headers
        self.NUM_CHAN = 48
        self.NUM_SENSE = 8
        self.NUM_PULSE = 4
        self.NUM_DISPLAY_CHAN = 32

    def _serialraw(self, packet):
        sn1 = ord(packet[481:482])
        sn2 = ord(packet[482:483]) * 256
        id1 = ord(packet[485:486])
        return self._fmtserial(id1, sn1+sn2)

    def _fmtserial(self, id, sn):
        '''GEM serial numbers are 8 characters - unit id then serial'''
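        # e.g. unit id 1, serial number 55123 -> '00155123'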
        return "%03d%05d" % (id, sn)

    def channels(self, filter):
        c = []
        if filter == FILTER_PE_LABELS:
            for x in range(1,self.NUM_CHAN+1):
                c.append('ch%0d' % x)
        elif filter == FILTER_POWER:
            for x in range(1,self.NUM_CHAN+1):
                c.append('ch%0d_w' % x)
        elif filter == FILTER_ENERGY:
            for x in range(1,self.NUM_CHAN+1):
                c.append('ch%0d_wh' % x)
        elif filter == FILTER_PULSE:
            for x in range(1,self.NUM_PULSE+1):
                c.append('p%d' % x)
        elif filter == FILTER_SENSOR:
            for x in range(1,self.NUM_SENSE+1):
                c.append('t%d' % x)
        elif filter == FILTER_DB_SCHEMA_COUNTERS:
            c = ['volts']
            for x in range(1,self.NUM_CHAN+1):
                c.append('ch%0d_aws' % x)
                c.append('ch%0d_pws' % x)
            for x in range(1,self.NUM_PULSE+1):
                c.append('p%d' % x)
            for x in range(1,self.NUM_SENSE+1):
                c.append('t%d' % x)
        return c

    def compile(self, rpkt):
        cpkt = {}

        # Voltage Data (2 bytes)
        cpkt['volts'] = 0.1 * self._convert(rpkt[1::-1])

        # Absolute/Polarized Watt-Second Counters (5 bytes each)
        for x in range(1,self.NUM_CHAN+1):
            cpkt['ch%d_aws' % x] = self._convert(rpkt[2+5*(x-1):2+5*x])
            cpkt['ch%d_pws' % x] = self._convert(rpkt[242+5*(x-1):242+5*x])

        # Device Serial Number (2 bytes)
        cpkt['ser_no'] = self._convert(rpkt[483:481:-1])

        # Reserved (1 byte)

        # Device Information (1 byte)
        cpkt['unit_id'] = self._convert(rpkt[485:486])

        # Reserved (96 bytes)

        # Seconds (3 bytes)
        cpkt['secs'] = self._convert(rpkt[582:585])

        # Pulse Counters (3 bytes each)
        for x in range(1,self.NUM_PULSE+1):
            cpkt['p%d' % x] = self._convert(rpkt[585+3*(x-1):585+3*x])

        # One-Wire Sensors (2 bytes each)
        for x in range(1,self.NUM_SENSE+1):
            cpkt['t%d' % x] = self._convert(rpkt[597+2*(x-1):597+2*x])

        # Spare (2 bytes)

        # Add the current time as the timestamp
        cpkt['time_created'] = getgmtime()

        # Add a formatted serial number
        cpkt['serial'] = self._getserial(cpkt)

        return cpkt

    def calculate(self, now, prev):
        ret = now

        # handle seconds counter overflow
        if ret['secs'] < prev['secs']:
            ret['secs'] += self.SEC_COUNTER_MAX
        ds = float(ret['secs'] - prev['secs'])

        # attempt to detect a counter reset or overflow
        for x in range(1,self.NUM_CHAN+1):
            tag = 'ch%d' % x
            c0 = prev[tag+'_aws']
            c1 = ret[tag+'_aws']
            if c1 < c0:
                raise CounterResetError("old: %d new: %d" % (c0, c1))

        for x in range(1,self.NUM_CHAN+1):
            tag = 'ch%d' % x
            self._calc_pe(tag, ds, ret, prev)

        return ret

    def printPacket(self, p):
        ts = fmttime(time.localtime(p['time_created']))

        print ts+": Serial: %s" % p['serial']
        print ts+": Voltage: % 6.2fV" % p['volts']
        for x in range(1,self.NUM_DISPLAY_CHAN+1):
            print ts+": Ch%02d: % 13.6fKWh (% 5dW)" % (x, p['ch%d_wh' % x]/1000, p['ch%d_w' % x])
        for x in range(1,self.NUM_PULSE+1):
            print ts+": p%d: % 15d" % (x, p['p%d' % x])
        for x in range(1,self.NUM_SENSE+1):
            print ts+": t%d: % 15.6f" % (x, p['t%d' % x])


# GEM binary packet with 48 channels, polarization, time stamp
class GEM48PTBinaryPacket(GEM48PBinaryPacket):
    def __init__(self):
        GEM48PBinaryPacket.__init__(self)
        self.PACKET_ID = 5
        self.DATA_BYTES_LENGTH = 619 # does not include the start/end headers
        self.USE_PACKET_TIMESTAMP = 1 # should we trust the GEM clock?

    def compile(self, rpkt):
        cpkt = GEM48PBinaryPacket.compile(self, rpkt)

        # Time Stamp (1 byte each)
        cpkt['year'] = self._convert(rpkt[613:614])
        cpkt['mth']  = self._convert(rpkt[614:615])
        cpkt['day']  = self._convert(rpkt[615:616])
        cpkt['hr']   = self._convert(rpkt[616:617])
        cpkt['min']  = self._convert(rpkt[617:618])
        cpkt['sec']  = self._convert(rpkt[618:619])

        # Add the timestamp as epoch
        if self.USE_PACKET_TIMESTAMP:
            tstr = '20%02d.%d.%d %d:%d:%d' % (cpkt['year'],cpkt['mth'],cpkt['day'],cpkt['hr'],cpkt['min'],cpkt['sec'])
            cpkt['time_created'] = int(time.mktime(time.strptime(tstr, '%Y.%m.%d %H:%M:%S')))

        return cpkt


# Data Collector classes

class BaseDataCollector(object):
    def __init__(self, packet_processor):
        self.packet_processor = packet_processor
        dbgmsg('packet format is %s' % PACKET_FORMAT.__class__.__name__)
        dbgmsg('collector is %s' % self.__class__.__name__)
        dbgmsg('using %d processors:' % len(self.packet_processor))
        for p in self.packet_processor:
            dbgmsg('  %s' % p.__class__.__name__)

    def setup(self):
        pass

    def cleanup(self):
        pass

    # The read method collects data then passes it to each of the processors.
    def read(self):
        pass

    def process(self):
        pass

    # Loop forever, break only for keyboard interrupts.
    def run(self):
        try:
            self.setup()
            for p in self.packet_processor:
                dbgmsg('setup %s' % p.__class__.__name__)
                p.setup()

            while True:
                try:
                    self.read()
                    self.process()
                except ReadError, e:
                    dbgmsg('read failed: %s' % e.msg)
                except KeyboardInterrupt, e:
                    raise e
                except Exception, e:
                    errmsg(e)

        except KeyboardInterrupt:
            sys.exit(0)
        except Exception, e:
            if LOGLEVEL >= LOG_DEBUG:
                traceback.print_exc()
            else:
                errmsg(e)
            sys.exit(1)

        finally:
            for p in self.packet_processor:
                dbgmsg('cleanup %s' % p.__class__.__name__)
                p.cleanup()
            self.cleanup()


class BufferedDataCollector(BaseDataCollector):
    def __init__(self, packet_processor):
        super(BufferedDataCollector, self).__init__(packet_processor)
        self.packet_buffer = CompoundBuffer(DEFAULT_BUFFER_SIZE)

    def _compile(self, packet):
        return PACKET_FORMAT.compile(packet)

    # Read the indicated number of bytes.  This should be overridden by derived
    # classes to do the actual reading of bytes.
    def readbytes(self, count):
        return ''

    def read(self):
        packets = PACKET_FORMAT.read(self)
        for p in packets:
            p = PACKET_FORMAT.compile(p)
            self.packet_buffer.insert(p['time_created'], p)

    def process(self):
        dbgmsg('buffer info:')
        for sn in self.packet_buffer.getkeys():
            dbgmsg('  %s: %d of %d' % (sn, self.packet_buffer.size(sn), self.packet_buffer.maxsize))
        for p in self.packet_processor:
            try:
                dbgmsg('processing with %s' % p.__class__.__name__)
                p.process_compiled(self.packet_buffer)
            except Exception, e:
                if not p.handle(e):
                    wrnmsg('Exception in %s: %s' % (p.__class__.__name__, e))
                    if LOGLEVEL >= LOG_DEBUG:
                        traceback.print_exc()


class SerialCollector(BufferedDataCollector):
    def __init__(self, packet_processor, port=SERIAL_PORT, rate=SERIAL_BAUD):
        super(SerialCollector, self).__init__(packet_processor)

        if not serial:
            print 'Error: serial module could not be imported.'
            sys.exit(1)

        self._port = port
        self._baudrate = int(rate)
        self._conn = None

        infmsg('serial port: %s' % self._port)
        infmsg('baud rate: %d' % self._baudrate)

    def readbytes(self, count):
        return self._conn.read(count)

    def setup(self):
        self._conn = serial.Serial(self._port, self._baudrate)

    def cleanup(self):
        if self._conn:
            self._conn.close()
            self._conn = None


class SocketServerCollector(BufferedDataCollector):
    def __init__(self, packet_processor, host=IP_HOST, port=IP_PORT):
        super(SocketServerCollector, self).__init__(packet_processor)
        socket.setdefaulttimeout(IP_TIMEOUT)
        self._host = host
        self._port = int(port)
        self._sock = None
        self._conn = None
        infmsg('host: %s' % self._host)
        infmsg('port: %d' % self._port)
        infmsg('timeout: %d' % IP_TIMEOUT)

    def readbytes(self, count):
        return self._conn.recv(count)

    def read(self):
        try:
            dbgmsg('waiting for connection')
            self._conn, addr = self._sock.accept()
            super(SocketServerCollector, self).read()
        finally:
            if self._conn:
                self._conn.shutdown(socket.SHUT_RD)
                self._conn.close()
                self._conn = None

    def setup(self):
        self._sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self._sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            self._sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
        except (AttributeError, socket.error): # not supported on all systems
            pass
        self._sock.bind((self._host, self._port))
        self._sock.listen(1)

    def cleanup(self):
        if self._sock:
            self._sock.close()
            self._sock = None


class SocketClientCollector(BufferedDataCollector):
    def __init__(self, packet_processor, host=IP_HOST, port=IP_PORT):
        super(SocketClientCollector, self).__init__(packet_processor)
        socket.setdefaulttimeout(IP_TIMEOUT)
        self._host = host
        self._port = int(port)
        self._sock = None
        infmsg('host: %s' % self._host)
        infmsg('port: %d' % self._port)
        infmsg('timeout: %d' % IP_TIMEOUT)

    def readbytes(self, count):
        return self._sock.recv(count)

    def read(self):
        try:
            self._sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            self._sock.connect((self._host, self._port))
            super(SocketClientCollector, self).read()
        finally:
            if self._sock:
                self._sock.close()
                self._sock = None


class DatabaseCollector(BaseDataCollector):
    def __init__(self, packet_processor, table=DB_TABLE):
        super(DatabaseCollector, self).__init__(packet_processor)
        self._conn = None
        self._table = table
        self._poll_interval = DB_POLL_INTERVAL
        self._lastread = getgmtime() - self._poll_interval
        infmsg('DB: polling interval: %d seconds' % self._poll_interval)
        infmsg('DB: table: %s' % self._table)

    def read(self):
        cursor = self._conn.cursor()
        cursor.execute('select * from ' + self._table + ' where time_created > ' + str(self._lastread))
        # FIXME: limit number of items that we will accept from db query
        rows = cursor.fetchall()
        dbgmsg('DB: query returned %d rows' % len(rows))
        self._lastread = getgmtime()
        packets = {}
        for row in rows:
            sn = str(row[2]) # FIXME: get this by name, not index
            if not sn in packets:
                packets[sn] = []
            packets[sn].append(self.row2packet(row))
        cursor.close()
        for sn in packets:
            self.process(packets[sn])
        time.sleep(self._poll_interval)

    # process a list of calculated packets
    def process(self, packets):
        packets.sort(compare_packet_times)
        for p in self.packet_processor:
            try:
                dbgmsg('processing with %s' % p.__class__.__name__)
                p.process_calculated(packets)
            except Exception, e:
                if not p.handle(e):
                    wrnmsg('Exception in %s: %s' % (p.__class__.__name__, e))
                    if LOGLEVEL >= LOG_DEBUG:
                        traceback.print_exc()

    # FIXME: infer the schema automatically from the database columns
    def row2packet(self, row):
        sn = str(row[2])
        p = {}
        p['flag'] = 0 # fake it
        p['unit_id'] = int(sn[0])
        p['ser_no'] = int(sn[1:])
        p['time_created'] = long(row[1])
        p['volts'] = float(row[3])
        p['ch1_a'] = float(row[4])
        p['ch2_a'] = float(row[5])
        p['ch1_w'] = int(row[6])
        p['ch2_w'] = int(row[7])
        p['aux1_w'] = int(row[8])
        p['aux2_w'] = int(row[9])
        p['aux3_w'] = int(row[10])
        p['aux4_w'] = int(row[11])
        p['aux5_w'] = int(row[12])
        if DB_SCHEMA == DB_SCHEMA_ECMREAD_EXTENDED:
            p['ch1_wh'] = int(row[13])
            p['ch2_wh'] = int(row[14])
            p['aux1_wh'] = int(row[15])
            p['aux2_wh'] = int(row[16])
            p['aux3_wh'] = int(row[17])
            p['aux4_wh'] = int(row[18])
            p['aux5_wh'] = int(row[19])
            p['ch1_pw'] = int(row[20])
            p['ch1_nw'] = int(row[21])
            p['ch2_pw'] = int(row[22])
            p['ch2_nw'] = int(row[23])
            p['ch1_pwh'] = int(row[24])
            p['ch1_nwh'] = int(row[25])
            p['ch2_pwh'] = int(row[26])
            p['ch2_nwh'] = int(row[27])
            p['ch1_dwh'] = int(row[28])
            p['ch2_dwh'] = int(row[29])
            p['aux1_dwh'] = int(row[30])
            p['aux2_dwh'] = int(row[31])
            p['aux3_dwh'] = int(row[32])
            p['aux4_dwh'] = int(row[33])
            p['aux5_dwh'] = int(row[34])
        else:
            p['ch1_wh'] = 0
            p['ch2_wh'] = 0
            p['aux1_wh'] = 0
            p['aux2_wh'] = 0
            p['aux3_wh'] = 0
            p['aux4_wh'] = 0
            p['aux5_wh'] = 0
            p['ch1_pw'] = 0
            p['ch1_nw'] = 0
            p['ch2_pw'] = 0
            p['ch2_nw'] = 0
            p['ch1_pwh'] = 0
            p['ch1_nwh'] = 0
            p['ch2_pwh'] = 0
            p['ch2_nwh'] = 0
            p['ch1_dwh'] = 0
            p['ch2_dwh'] = 0
            p['aux1_dwh'] = 0
            p['aux2_dwh'] = 0
            p['aux3_dwh'] = 0
            p['aux4_dwh'] = 0
            p['aux5_dwh'] = 0
        return p


class MySQLCollector(DatabaseCollector):
    def __init__(self, packet_processor, host, database, tbl, user, password):
        if not MySQLdb:
            print 'DB Error: MySQLdb module could not be imported.'
            sys.exit(1)

        super(MySQLCollector, self).__init__(packet_processor,database+'.'+tbl)
        self._host     = host
        self._database = database
        self._user     = user
        self._passwd   = password
        infmsg('DB: host: %s' % self._host)
        infmsg('DB: database: %s' % self._database)
        infmsg('DB: username: %s' % self._user)

    def setup(self):
        super(MySQLCollector, self).setup()
        self._conn = MySQLdb.connect(host=self._host,
                                     user=self._user,
                                     passwd=self._passwd,
                                     db=self._database)

    def cleanup(self):
        if self._conn:
            self._conn.close()
            self._conn = None

    def handle(self, e):
        if isinstance(e, MySQLdb.Error):
            errmsg('MySQL Error: [#%d] %s' % (e.args[0], e.args[1]))
            return True
        return super(MySQLCollector, self).handle(e)


class SqliteCollector(DatabaseCollector):
    def __init__(self, packet_processor, table, filename):
        if not sqlite:
            print 'DB Error: sqlite3 module could not be imported.'
            sys.exit(1)
        if not filename:
            print 'DB Error: no database file specified'
            sys.exit(1)

        super(SqliteCollector, self).__init__(packet_processor, table)
        self._file = filename
        infmsg('DB: file: %s' % self._file)

    def setup(self):
        super(SqliteCollector, self).setup()
        self._conn = sqlite.connect(self._file)

    def cleanup(self):
        if self._conn:
            self._conn.commit()
            self._conn.close()
            self._conn = None


# Buffer Classes

class MovingBuffer(object):
    '''Maintain fixed-size buffer of data.  Oldest packets are removed.'''
    def __init__(self, maxsize):
        self.maxsize = maxsize
        self.packets = []

    def insert(self, timestamp, packet):
        dbgmsg('buffering packet ts:%d sn:%s' % (timestamp, packet['serial']))
        bisect.insort(self.packets, (timestamp, packet))
        if len(self.packets) > self.maxsize:
            del(self.packets[0])

    def newest(self, timestamp):
        '''return all packets with timestamp newer than specified timestamp'''
        idx = bisect.bisect(self.packets, (timestamp, {}))
        return self.packets[idx:]

    def oldest(self):
        '''return the oldest packet in the buffer'''
        return self.packets[0]

    def size(self):
        return len(self.packets)
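
# Example (illustrative): a MovingBuffer with maxsize 3 retains only the
# three most recent packets, ordered by timestamp:
#   buf = MovingBuffer(3)
#   for ts in (10, 20, 30, 40):
#       buf.insert(ts, {'serial': '355555'})
#   buf.oldest()     # -> (20, {'serial': '355555'})
#   buf.newest(15)   # -> the packets inserted at ts 20, 30 and 40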

class CompoundBuffer(object):
    '''Variable number of moving buffers, each associated with an ID'''
    def __init__(self, maxsize):
        self.maxsize = maxsize
        self.buffers = {}
        dbgmsg('buffer size: %d' % self.maxsize)

    def insert(self, timestamp, packet):
        return self.getbuffer(packet['serial']).insert(timestamp, packet)

    def newest(self, ecm_serial, timestamp):
        return self.getbuffer(ecm_serial).newest(timestamp)

    def oldest(self, ecm_serial):
        return self.getbuffer(ecm_serial).oldest()

    def size(self, ecm_serial):
        return self.getbuffer(ecm_serial).size()

    def getbuffer(self, ecm_serial):
        if not ecm_serial in self.buffers:
            dbgmsg('adding buffer for %s' % ecm_serial)
            self.buffers[ecm_serial] = MovingBuffer(self.maxsize)
        return self.buffers[ecm_serial]

    def getkeys(self):
        return self.buffers.keys()

# Packet Processor Classes

class BaseProcessor(object):
    def __init__(self, *args, **kwargs):
        self.last_processed = {}
        self.process_period = 1 # in seconds

    def setup(self):
        pass

    def process_compiled(self, packet_buffer):
        now = getgmtime()
        for sn in packet_buffer.getkeys():
            if packet_buffer.size(sn) < 1:
                dbgmsg('buffer is empty for %s' % sn)
                continue
            if not sn in self.last_processed or now >= self.last_processed[sn] + self.process_period:
                if not sn in self.last_processed:
                    ts = packet_buffer.oldest(sn)[0]
                else:
                    ts = self.last_processed[sn]
                data = packet_buffer.newest(sn, ts)
                if len(data) > 1:
                    dbgmsg('%d buffered packets' % len(data))
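                    # difference consecutive packets pairwise: each
                    # (older, newer) pair yields one calculated packet
                    # via PACKET_FORMAT.calculate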
                    packets = []
                    for a,b in zip(data[0:], data[1:]):
                        try:
                            self.last_processed[sn] = b[0]
                            packets.append(PACKET_FORMAT.calculate(b[1],a[1]))
                        except ZeroDivisionError, zde:
                            infmsg("not enough data in buffer for %s" % sn)
                        except CounterResetError, cre:
                            wrnmsg("counter reset for %s: %s" % (sn, cre.msg))
                    dbgmsg('%d calculated packets' % len(packets))
                    self.process_calculated(packets)
                else:
                    dbgmsg('not enough data for %s' % sn)
                    continue
            else:
                dbgmsg('waiting to process packets for %s' % sn)

    def process_calculated(self, packets):
        pass

    def handle(self, exception):
        return False

    def cleanup(self):
        pass


class PrintProcessor(BaseProcessor):
    def __init__(self, *args, **kwargs):
        super(PrintProcessor, self).__init__(*args, **kwargs)

    def process_calculated(self, packets):
        for p in packets:
            print
            PACKET_FORMAT.printPacket(p)


class DatabaseProcessor(BaseProcessor):
    def __init__(self, *args, **kwargs):
        super(DatabaseProcessor, self).__init__(*args, **kwargs)
        self.db_table = DB_TABLE
        self.process_period = DB_INSERT_PERIOD
        self.conn = None

    def setup(self):
        pass

    def cleanup(self):
        pass
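
    # build one INSERT statement per packet; the generated SQL looks
    # roughly like this (illustrative values):
    #   INSERT INTO ecm (time_created,serial,ch1_w,...)
    #     VALUES (1339100917,355555,1536,...)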

    def process_calculated(self, packets):
        for p in packets:
            labels = ['time_created', 'serial']
            values = [str(p['time_created']), p['serial']]
            for c in PACKET_FORMAT.channels(DEFAULT_DB_SCHEMA):
                labels.append(c)
                values.append(str(p[c]))
            sql = []
            sql.append('INSERT INTO %s (' % self.db_table)
            sql.append(','.join(labels))
            sql.append(') VALUES (')
            sql.append(','.join(values))
            sql.append(')')
            dbgmsg('DB: query: %s' % ''.join(sql))
            cursor = self.conn.cursor()
            cursor.execute(''.join(sql))
            cursor.close()
            infmsg('DB: inserted %d values for %s at %s' % (len(values), p['serial'], p['time_created']))
        self.conn.commit()


class MySQLClient(object):
    def __init__(self, *args, **kwargs):
        if not MySQLdb:
            print 'DB Error: MySQLdb module could not be imported.'
            sys.exit(1)

        self.db_host     = kwargs.get('mysql_host')     or DB_HOST
        self.db_user     = kwargs.get('mysql_user')     or DB_USER
        self.db_passwd   = kwargs.get('mysql_passwd')   or DB_PASSWD
        self.db_database = kwargs.get('mysql_database') or DB_DATABASE
        self.db_table    = kwargs.get('mysql_table')    or DB_TABLE
        self.db_table    = self.db_database + '.' + self.db_table

        infmsg('DB: host: %s' % self.db_host)
        infmsg('DB: database: %s' % self.db_database)
        infmsg('DB: table: %s' % self.db_table)
        infmsg('DB: username: %s' % self.db_user)

    def setup(self):
        self.conn = MySQLdb.connect(host=self.db_host,
                                    user=self.db_user,
                                    passwd=self.db_passwd,
                                    db=self.db_database)

    def cleanup(self):
        if self.conn:
            self.conn.commit()
            self.conn.close()
            self.conn = None


class MySQLProcessor(DatabaseProcessor, MySQLClient):
    def __init__(self, *args, **kwargs):
        super(MySQLProcessor, self).__init__(*args, **kwargs)

    def handle(self, e):
        if isinstance(e, MySQLdb.Error):
            errmsg('MySQL Error: [#%d] %s' % (e.args[0], e.args[1]))
            return True
        return super(MySQLProcessor, self).handle(e)

    def setup(self):
        super(MySQLProcessor, self).setup()

    def cleanup(self):
        super(MySQLProcessor, self).cleanup()


class MySQLConfigurator(MySQLClient):
    def __init__(self, *args, **kwargs):
        super(MySQLConfigurator, self).__init__(*args, **kwargs)
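
    # configure() issues roughly the following statements (illustrative;
    # the channel columns depend on packet format and database schema):
    #   create database <database>
    #   grant usage on *.* to <user>@<host> identified by '<passwd>'
    #   grant all privileges on <database>.* to <user>@<host>
    #   create table <database>.<table> (id int primary key auto_increment,
    #     time_created int, serial int, ch1_w float, ...)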

    def configure(self):
        try:
            self.setup()

            sql = 'create database %s' % self.db_database
            cursor = self.conn.cursor()
            cursor.execute(sql)
            cursor.close()

            sql = "grant usage on *.* to %s@%s identified by '%s'" % (
                self.db_user, self.db_host, self.db_passwd)
            cursor = self.conn.cursor()
            cursor.execute(sql)
            cursor.close()

            sql = "grant all prvileges on %s.* to $s@%s" % (
                self.db_database, self.db_user, self.db_host)
            cursor = self.conn.cursor()
            cursor.execute(sql)
            cursor.close()

            sql = []
            sql.append('create table %s (id int primary key auto_increment' % self.db_table)
            sql.append(', time_created int')
            sql.append(', serial int')
            for c in PACKET_FORMAT.channels(DEFAULT_DB_SCHEMA):
                sql.append(', %s float' % c)
            sql.append(')')
            cursor = self.conn.cursor()
            cursor.execute(''.join(sql))
            cursor.close()

        finally:
            self.cleanup()


class SqliteClient(object):
    def __init__(self, *args, **kwargs):
        if not sqlite:
            print 'DB Error: sqlite3 module could not be imported.'
            sys.exit(1)

        self.db_table = kwargs.get('sqlite_table') or DB_TABLE
        self.db_file  = kwargs.get('sqlite_file')  or DB_FILENAME
        if not self.db_file:
            print 'DB Error: no database file specified'
            sys.exit(1)

        infmsg('DB: file: %s' % self.db_file)
        infmsg('DB: table: %s' % self.db_table)

    def setup(self):
        self.conn = sqlite.connect(self.db_file)

    def cleanup(self):
        if self.conn:
            self.conn.commit()
            self.conn.close()
            self.conn = None


class SqliteProcessor(DatabaseProcessor, SqliteClient):
    def __init__(self, *args, **kwargs):
        DatabaseProcessor.__init__(self, *args, **kwargs)
        SqliteClient.__init__(self, *args, **kwargs)

    def setup(self):
        DatabaseProcessor.setup(self)
        SqliteClient.setup(self)

    def cleanup(self):
        DatabaseProcessor.cleanup(self)
        SqliteClient.cleanup(self)


class SqliteConfigurator(SqliteClient):
    def __init__(self, *args, **kwargs):
        super(SqliteConfigurator, self).__init__(*args, **kwargs)

    def configure(self):
        try:
            self.setup()

            infmsg('creating table %s' % self.db_table)
            sql = []
            sql.append('create table %s (id int primary key' % self.db_table)
            sql.append(', time_created int')
            sql.append(', serial int')
            for c in PACKET_FORMAT.channels(DEFAULT_DB_SCHEMA):
                sql.append(', %s float' % c)
            sql.append(')')
            cursor = self.conn.cursor()
            cursor.execute(''.join(sql))
            cursor.close()
        finally:
            self.cleanup()


class RRDProcessor(BaseProcessor):
    def __init__(self, *args, **kwargs):
        super(RRDProcessor, self).__init__(*args, **kwargs)
        if not rrdtool:
            print 'RRD Error: rrdtool module could not be imported.'
            sys.exit(1)

        self._dir = kwargs.get('rrd_dir') or RRD_DIR
        self._step = kwargs.get('rrd_step') or RRD_STEP
        self.process_period = self._step
        self._heartbeat = kwargs.get('rrd_heartbeat') or RRD_HEARTBEAT
        if not self._dir:
            print 'RRD Error: no directory specified for rrd files'
            sys.exit(1)

        infmsg('RRD: dir: %s' % self._dir)
        infmsg('RRD: step: %s' % self._step)
        infmsg('RRD: heartbeat: %s' % self._heartbeat)
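
    # each channel of each device gets its own rrd file, named
    # <rrd_dir>/<serial>_<channel>.rrd, e.g. (illustrative)
    # /var/rrd/355555_ch1_w.rrd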

    def _mkfn(self, label):
        return '%s/%s.rrd' % (self._dir, label)

    def _mkdir(self):
        if not os.path.exists(self._dir):
            infmsg('RRD: creating rrd directory %s' % self._dir)
            os.makedirs(self._dir)

    def _mklabel(self, packet, channel):
        return '%s_%s' % (packet['serial'], channel)

    def _rrdexists(self, packet, channel):
        return os.path.exists(self._mkfn(self._mklabel(packet, channel)))

    # dstype is one of COUNTER, GAUGE, DERIVE, ABSOLUTE, COMPUTE
    def create_rrd(self, packet, channel, dstype):
        self._mkdir()
        ts = packet['time_created'] - 1
        step = RRD_STEPS
        reso = RRD_RESOLUTIONS
        label = self._mklabel(packet, channel)
        fn = self._mkfn(label)
        infmsg('RRD: creating rrd file %s' % fn)
        rc = rrdtool.create(fn,
                            '--step', str(self._step),
                            '--start', str(ts),
                            [ "DS:%s:%s:%d:U:U" % (channel,dstype,self._heartbeat) ],
                            "RRA:AVERAGE:0.5:%d:%d" % (step[0], reso[0]),
                            "RRA:AVERAGE:0.5:%d:%d" % (step[1], reso[1]),
                            "RRA:AVERAGE:0.5:%d:%d" % (step[2], reso[2]),
                            "RRA:AVERAGE:0.5:%d:%d" % (step[3], reso[3]),
                            "RRA:MAX:0.5:%d:%d" % (step[0], reso[0]),
                            "RRA:MAX:0.5:%d:%d" % (step[1], reso[1]),
                            "RRA:MAX:0.5:%d:%d" % (step[2], reso[2]),
                            "RRA:MAX:0.5:%d:%d" % (step[3], reso[3]),
                            "RRA:MIN:0.5:%d:%d" % (step[0], reso[0]),
                            "RRA:MIN:0.5:%d:%d" % (step[1], reso[1]),
                            "RRA:MIN:0.5:%d:%d" % (step[2], reso[2]),
                            "RRA:MIN:0.5:%d:%d" % (step[3], reso[3]))
        if rc:
            wrnmsg("failed to create '%s': %d" % (fn, rc))

    def update_rrd(self, packet, channel, value, dstype):
        label = self._mklabel(packet, channel)
        fn = self._mkfn(label)
        if dstype == 'GAUGE':
            rc = rrdtool.update(fn, '%d:%f' % (packet['time_created'], value))
        else:
            rc = rrdtool.update(fn, '%d:%d' % (packet['time_created'], value))
        if rc:
            wrnmsg("failed to update '%s': %d" % (fn, rc))

    def update_files(self, packet):
        dbgmsg('RRD: updating sn:%s ts:%d' % (packet['serial'], packet['time_created']))
        for x in PACKET_FORMAT.channels(FILTER_POWER):
            if not self._rrdexists(packet, x):
                self.create_rrd(packet, x, 'GAUGE')
            self.update_rrd(packet, x, packet[x], 'GAUGE')
        for x in PACKET_FORMAT.channels(FILTER_ENERGY):
            if not self._rrdexists(packet, x):
                self.create_rrd(packet, x, 'DERIVE')
            self.update_rrd(packet, x, packet[x], 'DERIVE')
        for x in PACKET_FORMAT.channels(FILTER_PULSE):
            if not self._rrdexists(packet, x):
                self.create_rrd(packet, x, 'DERIVE')
            self.update_rrd(packet, x, packet[x], 'DERIVE')
        for x in PACKET_FORMAT.channels(FILTER_SENSOR):
            if not self._rrdexists(packet, x):
                self.create_rrd(packet, x, 'GAUGE')
            self.update_rrd(packet, x, packet[x], 'GAUGE')

    def process_calculated(self, packets):
        self._process_latest(packets)

    def _process_latest(self, packets):
        # update using data from a single packet - the latest one
        if len(packets) > 0:
            self.update_files(packets[len(packets)-1])

    def _process_average(self, packets):
        # average data from multiple packets
        if len(packets) > 0:
            ave = {}
            for x in PACKET_FORMAT.channels(FILTER_POWER):
                ave[x] = 0
            for x in PACKET_FORMAT.channels(FILTER_SENSOR):
                ave[x] = 0
            for p in packets:
                for x in PACKET_FORMAT.channels(FILTER_POWER):
                    ave[x] += p[x]
                for x in PACKET_FORMAT.channels(FILTER_ENERGY):
                    ave[x] = p[x]
                for x in PACKET_FORMAT.channels(FILTER_PULSE):
                    ave[x] = p[x]
                for x in PACKET_FORMAT.channels(FILTER_SENSOR):
                    ave[x] += p[x]
            for x in PACKET_FORMAT.channels(FILTER_POWER):
                ave[x] /= len(packets)
            for x in PACKET_FORMAT.channels(FILTER_SENSOR):
                ave[x] /= len(packets)
            self.update_files(ave)

    def _process_all(self, packets):
        # process each packet - assumes device emits at same frequency as step
        for p in packets:
            self.update_files(p)

class UploadProcessor(BaseProcessor):
    class FakeResult(object):
        def geturl(self):
            return 'fake result url'
        def info(self):
            return 'fake result info'
        def read(self):
            return 'fake result read'

    def __init__(self, *args, **kwargs):
        super(UploadProcessor, self).__init__(*args, **kwargs)
        self.process_period = DEFAULT_UPLOAD_PERIOD
        self.timeout = DEFAULT_UPLOAD_TIMEOUT
        self.urlopener = {}

    def setup(self):
        pass

    def process_calculated(self, packets):
        pass

    def handle(self, exception):
        return False

    def cleanup(self):
        pass

    def _create_request(self, url):
        req = urllib2.Request(url)
        req.add_header("User-Agent", "btmon/%s" % __version__)
        return req

    def _urlopen(self, url, data):
        try:
            req = self._create_request(url)
            dbgmsg('%s: url: %s\n  headers: %s\n  data: %s' %
                   (self.__class__.__name__, req.get_full_url(), req.headers, data))

            result = {}
            if SKIP_UPLOAD:
                result = UploadProcessor.FakeResult()
            elif self.urlopener:
                result = self.urlopener.open(req, data, self.timeout)
            else:
                result = urllib2.urlopen(req, data, self.timeout)

            infmsg('%s: %d bytes uploaded' %
                   (self.__class__.__name__, len(data)))
            dbgmsg('%s: url: %s\n  response: %s' %
                   (self.__class__.__name__, result.geturl(), result.info()))
            return result
        except urllib2.HTTPError, e:
            self._handle_urlopen_error(e, url, data)
            result = e.read()
            errmsg(e)
            errmsg(result)

    def _handle_urlopen_error(self, e, url, data):
        errmsg(''.join(['%s Error: %s' % (self.__class__.__name__, e),
                        '\n  URL:  ' + url,
                        '\n  data: ' + data,]))


class WattzOnProcessor(UploadProcessor):
    def __init__(self, *args, **kwargs):
        super(WattzOnProcessor, self).__init__(*args, **kwargs)
        self.api_key  = kwargs.get('wo_api_key') or WATTZON_API_KEY
        self.username = kwargs.get('wo_user')    or WATTZON_USER
        self.password = kwargs.get('wo_pass')    or WATTZON_PASS
        self.map_str  = kwargs.get('wo_map')     or WATTZON_MAP
        self.process_period = WATTZON_UPLOAD_PERIOD
        self.timeout = WATTZON_TIMEOUT

        infmsg('WO: upload period: %d' % self.process_period)
        infmsg('WO: url: %s' % WATTZON_API_URL)
        infmsg('WO: api key: %s' % self.api_key)
        infmsg('WO: username: %s' % self.username)
        infmsg('WO: map: %s' % self.map_str)

    def setup(self):
        if not (self.api_key and self.username and self.password and self.map_str):
            print 'WattzOn Error: Insufficient parameters'
            if not self.api_key:
                print '  No API key'
            if not self.username:
                print '  No username'
            if not self.password:
                print '  No password'
            if not self.map_str:
                print '  No mapping between channels and WattzOn meters'
            sys.exit(1)

        self.map = pairs2dict(self.map_str)
        if not self.map:
            print 'WattzOn Error: cannot determine channel-meter map'
            sys.exit(1)

        p = urllib2.HTTPPasswordMgrWithDefaultRealm()
        p.add_password('WattzOn', WATTZON_API_URL, self.username,self.password)
        auth = urllib2.HTTPBasicAuthHandler(p)
        self.urlopener = urllib2.build_opener(auth)

    def process_calculated(self, packets):
        for p in packets:
            for c in PACKET_FORMAT.channels(FILTER_PE_LABELS):
                key = p['serial'] + '_' + c
                if key in self.map:
                    meter = self.map[key]
                    ts = time.strftime('%Y-%m-%dT%H:%M:%SZ',
                                       time.gmtime(p['time_created']))
                    result = self._make_call(meter, ts, p[c+'_w'])
                    infmsg('WattzOn: %s [%s] magnitude: %s' %
                           (ts, meter, p[c+'_w']))
                    dbgmsg('WattzOn: %s' % result.info())

    def handle(self, e):
        if isinstance(e, urllib2.HTTPError):
            errmsg(''.join(['HTTPError:  ' + str(e),
                            '\n  URL:      ' + e.geturl(),
                            '\n  username: ' + self.username,
                            '\n  password: ' + self.password,
                            '\n  API key:  ' + self.api_key,]))
            return True
        return super(WattzOnProcessor, self).handle(e)

    def _make_call(self, meter, timestamp, magnitude):
        data = {
            'updates': [
                {
                    'timestamp': timestamp,
                    'power': {
                        'magnitude': int(magnitude), # truncated by WattzOn API
                        'unit': 'W',
                        }
                    },
                ]
            }
        url = '%s/user/%s/powermeter/%s/upload.json?key=%s' % (
            WATTZON_API_URL,
            self.username,
            urllib.quote(meter),
            self.api_key
            )
        req = self._create_request(url)
        return self.urlopener.open(req, json.dumps(data), self.timeout)

    def _create_request(self, url):
        req = super(WattzOnProcessor, self)._create_request(url)
        req.add_header("Content-Type", "application/json")
        return req


class PlotWattProcessor(UploadProcessor):
    def __init__(self, *args, **kwargs):
        super(PlotWattProcessor, self).__init__(*args, **kwargs)
        self.api_key  = kwargs.get('pw_api_key')  or PLOTWATT_API_KEY
        self.house_id = kwargs.get('pw_house_id') or PLOTWATT_HOUSE_ID
        self.map_str  = kwargs.get('pw_map')      or PLOTWATT_MAP
        self.url = PLOTWATT_BASE_URL + PLOTWATT_UPLOAD_URL
        self.process_period = PLOTWATT_UPLOAD_PERIOD
        self.timeout = PLOTWATT_TIMEOUT

        infmsg('PW: upload period: %d' % self.process_period)
        infmsg('PW: url: %s' % self.url)
        infmsg('PW: api key: %s' % self.api_key)
        infmsg('PW: house id: %s' % self.house_id)
        infmsg('PW: map: %s' % self.map_str)

    def setup(self):
        if not (self.api_key and self.house_id and self.map_str):
            print 'PlotWatt Error: Insufficient parameters'
            if not self.api_key:
                print '  No API key'
            if not self.house_id:
                print '  No house ID'
            if not self.map_str:
                print '  No mapping between channels and PlotWatt meters'
            sys.exit(1)

        self.map = pairs2dict(self.map_str)
        if not self.map:
            print 'PlotWatt Error: cannot determine channel-meter map'
            sys.exit(1)

    def process_calculated(self, packets):
        s = []
        for p in packets:
            for c in PACKET_FORMAT.channels(FILTER_PE_LABELS):
                key = p['serial'] + '_' + c
                if key in self.map:
                    meter = self.map[key]
                    # format for each meter is: meter-id,kW,gmt-timestamp
                    s.append("%s,%1.4f,%d" %
                             (meter, p[c+'_w']/1000, p['time_created']))
        if len(s):
            self._urlopen(self.url, ','.join(s))
            # FIXME: check for server response

    def _handle_urlopen_error(self, e, url, payload):
        errmsg(''.join(['%s Error: %s' % (self.__class__.__name__, e),
                        '\n  URL:      ' + url,
                        '\n  API key:  ' + self.api_key,
                        '\n  house ID: ' + self.house_id,
                        '\n  data:     ' + payload,]))

    def _create_request(self, url):
        req = super(PlotWattProcessor, self)._create_request(url)
        req.add_header("Content-Type", "text/xml")
        b64s = base64.encodestring('%s:%s' % (self.api_key, ''))[:-1]
        req.add_header("Authorization", "Basic %s" % b64s)
        return req


# format for the enersave uploads is based on the pdf document called
# 'myenersave upload api v0.8' from jan2012.
#
# the energy measurements must be sorted by timestamp from oldest to newest,
# and the value of the energy reading is a cumulative measurement of energy.
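#
# a complete upload looks roughly like this (illustrative values):
#   <?xml version="1.0" encoding="UTF-8" ?>
#   <upload>
#     <sensor id="XXX123_ch1" type="2" description="kitchen">
#       <energy time="1339100917" wh="210859.0000"/>
#     </sensor>
#   </upload>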
class EnerSaveProcessor(UploadProcessor):
    def __init__(self, *args, **kwargs):
        super(EnerSaveProcessor, self).__init__(*args, **kwargs)
        self.url     = kwargs.get('es_url')   or ES_URL
        self.token   = kwargs.get('es_token') or ES_TOKEN
        self.map_str = kwargs.get('es_map')   or ES_MAP
        self.process_period = ES_UPLOAD_PERIOD
        self.timeout = ES_TIMEOUT

        infmsg('ES: upload period: %d' % self.process_period)
        infmsg('ES: url: %s' % self.url)
        infmsg('ES: token: %s' % self.token)
        infmsg('ES: map: %s' % self.map_str)

    def setup(self):
        if not (self.url and self.token):
            print 'EnerSave Error: Insufficient parameters'
            if not self.url:
                print '  No URL'
            if not self.token:
                print '  No token'
            sys.exit(1)

        self.map = self.tuples2dict(self.map_str)
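
    # parse a comma-delimited string of id,description,type triples into a
    # dict, e.g. (illustrative) tuples2dict('399999_ch1,kitchen,2') yields
    #   {'399999_ch1': {'desc': 'kitchen', 'type': '2'}}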

    def tuples2dict(self, s):
        items = s.split(',')
        m = {}
        for k,d,t in zip(items[::3], items[1::3], items[2::3]):
            m[k] = { 'desc':d, 'type':t }
        return m

    def process_calculated(self, packets):
        sensors = {}
        readings = {}
        for p in packets:
            ecm_serial = p['serial']
            if self.map:
                for c in PACKET_FORMAT.channels(FILTER_PE_LABELS):
                    key = ecm_serial + '_' + c
                    if key in self.map:
                        tpl = self.map[key]
                        dev_id = obfuscate_serial(ecm_serial) + '_' + c
                        dev_type = tpl['type'] or ES_DEFAULT_TYPE
                        dev_desc = tpl['desc'] or c
                        sensors[dev_id] = { 'type':dev_type, 'desc':dev_desc }
                        if not dev_id in readings:
                            readings[dev_id] = []
                        readings[dev_id].append('<energy time="%d" wh="%.4f"/>' %
                                                (p['time_created'], p[c+'_wh']))
            else:
                for c in PACKET_FORMAT.channels(FILTER_PE_LABELS):
                    dev_id = obfuscate_serial(ecm_serial) + '_' + c
                    dev_type = ES_DEFAULT_TYPE
                    dev_desc = c
                    sensors[dev_id] = { 'type':dev_type, 'desc':dev_desc }
                    if not dev_id in readings:
                        readings[dev_id] = []
                    readings[dev_id].append('<energy time="%d" wh="%.4f"/>' %
                                            (p['time_created'], p[c+'_wh']))
        s = []
        for key in sensors:
            s.append('<sensor id="%s" type="%s" description="%s">' %
                     (key, sensors[key]['type'], sensors[key]['desc']))
            s.append(''.join(readings[key]))
            s.append('</sensor>')
        if len(s):
            s.insert(0, '<?xml version="1.0" encoding="UTF-8" ?>')
            s.insert(1, '<upload>')
            s.append('</upload>')
            self._urlopen(self.url, ''.join(s))
            # FIXME: check for server response

    def _handle_urlopen_error(self, e, url, payload):
        errmsg(''.join(['%s Error: %s' % (self.__class__.__name__, e),
                        '\n  URL:   ' + url,
                        '\n  token: ' + self.token,
                        '\n  data:  ' + payload,]))

    def _create_request(self, url):
        req = super(EnerSaveProcessor, self)._create_request(url)
        req.add_header("Content-Type", "application/xml")
        req.add_header("Token", self.token)
        return req


# format for the peoplepower data upload is based on the DeviceAPI web pages
# at developer.peoplepowerco.com/docs/?q=node/37 from dec2011-jan2012.
#
# the peoplepower documentation does not indicate whether energy readings
# are cumulative or delta.  this implementation uses cumulative.
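#
# a complete upload looks roughly like this (illustrative values):
#   <?xml version="1.0" encoding="UTF-8" ?>
#   <h2s ver="2" hubId="1" seq="1">
#     <measure deviceId="4" deviceType="1004" timestamp="2012-06-07T21:48:37Z">
#       <param name="power" units="W">1536.0000</param>
#       <param name="energy" units="Wh">210859.0000</param>
#     </measure>
#   </h2s>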
class PeoplePowerProcessor(UploadProcessor):
    def __init__(self, *args, **kwargs):
        super(PeoplePowerProcessor, self).__init__(*args, **kwargs)
        self.url      = kwargs.get('pp_url')    or PPCO_URL
        self.token    = kwargs.get('pp_token')  or PPCO_TOKEN
        self.hub_id   = kwargs.get('pp_hub_id') or PPCO_HUBID
        self.map_str  = kwargs.get('pp_map')    or PPCO_MAP
        self.nonce    = PPCO_FIRST_NONCE
        self.dev_type = PPCO_DEVICE_TYPE
        self.process_period = PPCO_UPLOAD_PERIOD
        self.timeout = PPCO_TIMEOUT

        infmsg('PP: upload period: %d' % self.process_period)
        infmsg('PP: url: %s' % self.url)
        infmsg('PP: token: %s' % self.token)
        infmsg('PP: hub id: %s' % self.hub_id)
        infmsg('PP: map: %s' % self.map_str)

    def setup(self):
        if not (self.url and self.token and self.hub_id and self.map_str):
            print 'PeoplePower Error: Insufficient parameters'
            if not self.url:
                print '  No URL'
            if not self.token:
                print '  No token'
            if not self.hub_id:
                print '  No hub ID'
            if not self.map_str:
                print '  No mapping between channels and PeoplePower devices'
            sys.exit(1)

        self.map = pairs2dict(self.map_str)
        if not self.map:
            print 'PeoplePower Error: cannot determine channel-meter map'
            sys.exit(1)

        self.add_devices()

    def process_calculated(self, packets):
        s = []
        for p in packets:
            for c in PACKET_FORMAT.channels(FILTER_PE_LABELS):
                key = p['serial'] + '_' + c
                if key in self.map:
                    ts = time.strftime('%Y-%m-%dT%H:%M:%SZ',
                                       time.gmtime(p['time_created']))
                    s.append('<measure deviceId="%s" deviceType="%s" timestamp="%s">' % (self.map[key], self.dev_type, ts))
                    s.append('<param name="power" units="W">%1.4f</param>' %
                             p[c+'_w'])
                    s.append('<param name="energy" units="Wh">%1.4f</param>' %
                             p[c+'_wh'])
                    s.append('</measure>')
        if len(s):
            result = self._urlopen(self.url, s)
            # _urlopen returns None if the upload failed with an HTTPError
            if result:
                resp = result.read()
                if not resp or resp.find('ACK') == -1:
                    wrnmsg('PP: upload failed: %s' % resp)

    def add_devices(self):
        s = []
        for key in self.map.keys():
            s.append('<add deviceId="%s" deviceType="%s" />' %
                     (self.map[key], self.dev_type))
        if len(s):
            result = self._urlopen(self.url, s)
            # _urlopen returns None if the upload failed with an HTTPError
            if result:
                resp = result.read()
                if not resp or resp.find('ACK') == -1:
                    wrnmsg('PP: add devices failed: %s' % resp)

    def _urlopen(self, url, s):
        s.insert(0, '<?xml version="1.0" encoding="UTF-8" ?>')
        s.insert(1, '<h2s ver="2" hubId="%s" seq="%d">' % 
                 (self.hub_id, self.nonce))
        s.append('</h2s>')
        result = super(PeoplePowerProcessor, self)._urlopen(url, ''.join(s))
        self.nonce += 1
        return result

    def _handle_urlopen_error(self, e, url, payload):
        errmsg(''.join(['%s Error: %s' % (self.__class__.__name__, e),
                        '\n  URL:    ' + url,
                        '\n  token:  ' + self.token,
                        '\n  hub ID: ' + self.hub_id,
                        '\n  data:   ' + payload,]))

    def _create_request(self, url):
        req = super(PeoplePowerProcessor, self)._create_request(url)
        req.add_header("Content-Type", "text/xml")
        req.add_header("PPCAuthorization", "esp token=%s" % self.token)
        return req


class EragyProcessor(UploadProcessor):
    def __init__(self, *args, **kwargs):
        super(EragyProcessor, self).__init__(*args, **kwargs)
        self.url        = kwargs.get('eg_url')        or ERAGY_URL
        self.gateway_id = kwargs.get('eg_gateway_id') or ERAGY_GATEWAY_ID
        self.token      = kwargs.get('eg_token')      or ERAGY_TOKEN
        self.process_period = ERAGY_UPLOAD_PERIOD
        self.timeout = ERAGY_TIMEOUT

        infmsg('EG: upload period: %d' % self.process_period)
        infmsg('EG: url: ' + self.url)
        infmsg('EG: gateway ID: ' + self.gateway_id)
        infmsg('EG: token: ' + self.token)

    def setup(self):
        if not (self.url and self.gateway_id and self.token):
            print 'Eragy Error: Insufficient parameters'
            if not self.url:
                print '  No URL'
            if not self.gateway_id:
                print '  No gateway ID'
            if not self.token:
                print '  No token'
            sys.exit(1)

    def process_calculated(self, packets):
        s = []
        for p in packets:
            for idx,c in enumerate(PACKET_FORMAT.channels(FILTER_PE_LABELS)):
                key = obfuscate_serial(p['serial']) + '_' + c
                s.append('<MTU ID="%s"><cumulative timestamp="%s" watts="%d"/></MTU>' % (key,p['time_created'],p[c+'_w']))
        if len(s):
            s.insert(0, '<ted5000 GWID="%s" auth="%s">' %
                     (self.gateway_id, self.token))
            s.append('</ted5000>')
            result = self._urlopen(self.url, ''.join(s))
            # _urlopen returns None if the upload failed with an HTTPError
            if result:
                resp = result.read()
                if resp != '<xml>0</xml>':
                    wrnmsg('EG: upload failed: %s' % resp)

    def _handle_urlopen_error(self, e, url, payload):
        errmsg(''.join(['%s Error: %s' % (self.__class__.__name__, e),
                        '\n  URL:   ' + url,
                        '\n  token: ' + self.token,
                        '\n  data:  ' + payload,]))

    def _create_request(self, url):
        req = super(EragyProcessor, self)._create_request(url)
        req.add_header("Content-Type", "text/xml")
        return req


# smart energy groups expects delta measurements for both power and energy.
# this is not a cumulative energy reading!
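#
# a complete upload looks roughly like this (illustrative values):
#   data_post=(site <token> (node XXX123 2012-06-07T21:48:37Z
#     (p_kitchen 1536.0000)(e_kitchen 12.5000)))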
class SmartEnergyGroupsProcessor(UploadProcessor):
    def __init__(self, *args, **kwargs):
        super(SmartEnergyGroupsProcessor, self).__init__(*args, **kwargs)
        self.url      = kwargs.get('seg_url')    or SEG_URL
        self.token    = kwargs.get('seg_token')  or SEG_TOKEN
        self.map_str  = kwargs.get('seg_map')    or SEG_MAP
        self.process_period = SEG_UPLOAD_PERIOD
        self.timeout = SEG_TIMEOUT

        infmsg('SEG: upload period: %d' % self.process_period)
        infmsg('SEG: url: ' + self.url)
        infmsg('SEG: token: ' + self.token)
        infmsg('SEG: map: ' + self.map_str)

    def setup(self):
        if not (self.url and self.token):
            print 'SmartEnergyGroups Error: Insufficient parameters'
            if not self.url:
                print '  No URL'
            if not self.token:
                print '  No token'
            sys.exit(1)

        self.map = pairs2dict(self.map_str)
        self.urlopener = urllib2.build_opener(urllib2.HTTPHandler)

    def process_calculated(self, packets):
        for p in packets:
            ecm_serial = p['serial']
            osn = obfuscate_serial(ecm_serial)
            s = []
            if self.map:
                for idx,c in enumerate(PACKET_FORMAT.channels(FILTER_PE_LABELS)):
                    key = ecm_serial + '_' + c # str(idx+1)
                    if key in self.map:
                        meter = self.map[key] or c
                        s.append('(p_%s %1.4f)' % (meter,p[c+'_w']))
                        s.append('(e_%s %1.4f)' % (meter,p[c+'_dwh']))
            else:
                for idx,c in enumerate(PACKET_FORMAT.channels(FILTER_PE_LABELS)):
                    meter = c # str(idx+1)
                    s.append('(p_%s %1.4f)' % (meter,p[c+'_w']))
                    s.append('(e_%s %1.4f)' % (meter,p[c+'_dwh']))
                for idx,c in enumerate(PACKET_FORMAT.channels(FILTER_PULSE)):
                    meter = c # str(idx+1)
                    s.append('(n_%s %1.4f)' % (meter,p[c]))
                for idx,c in enumerate(PACKET_FORMAT.channels(FILTER_SENSOR)):
                    meter = c # str(idx+1)
                    s.append('(temperature_%s %1.4f)' % (meter,p[c]))
            if len(s):
                ts = time.strftime('%Y-%m-%dT%H:%M:%SZ',
                                   time.gmtime(p['time_created']))
                s.insert(0, 'data_post=(site %s ' % self.token)
                s.insert(1, '(node %s %s ' % (osn, ts))
                s.append(')')
                s.append(')')
                result = self._urlopen(self.url, ''.join(s))
                # _urlopen returns None if the upload failed with an HTTPError
                if result:
                    resp = result.read().replace('\n', '')
                    if resp != '(status ok)':
                        wrnmsg('SEG: upload failed for %s: %s' % (ecm_serial, resp))

    def _handle_urlopen_error(self, e, url, payload):
        errmsg(''.join(['%s Error: %s' % (self.__class__.__name__, e),
                        '\n  URL:   ' + url,
                        '\n  token: ' + self.token,
                        '\n  data:  ' + payload,]))

    def _create_request(self, url):
        req = super(SmartEnergyGroupsProcessor, self)._create_request(url)
        req.get_method = lambda: 'PUT'
        return req


class ThingSpeakProcessor(UploadProcessor):
    def __init__(self, *args, **kwargs):
        super(ThingSpeakProcessor, self).__init__(*args, **kwargs)
        self.url        = kwargs.get('ts_url')    or TS_URL
        self.tokens_str = kwargs.get('ts_tokens') or TS_TOKENS
        self.fields_str = kwargs.get('ts_fields') or TS_FIELDS
        self.process_period = TS_UPLOAD_PERIOD
        self.timeout = TS_TIMEOUT

        infmsg('TS: upload period: %d' % self.process_period)
        infmsg('TS: url: ' + self.url)
        infmsg('TS: tokens: ' + self.tokens_str)
        infmsg('TS: fields: ' + self.fields_str)

    def setup(self):
        if not (self.url and self.tokens_str):
            print 'ThingSpeak Error: Insufficient parameters'
            if not self.url:
                print '  No URL'
            if not self.tokens_str:
                print '  No tokens'
            sys.exit(1)

        self.tokens = pairs2dict(self.tokens_str)
        self.fields = pairs2dict(self.fields_str)
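
    # the upload body is a form-style string, roughly (illustrative):
    #   key=<write-api-key>&field1=1536&field2=311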

    def process_calculated(self, packets):
        for p in packets:
            ecm_serial = p['serial']
            if ecm_serial in self.tokens:
                token = self.tokens[ecm_serial]
                s = []
                for idx,c in enumerate(PACKET_FORMAT.channels(FILTER_PE_LABELS)):
                    key = ecm_serial + '_' + c
                    if not self.fields:
                        # thingspeak fields are numbered starting at 1
                        s.append('&field%d=%d' % (idx+1, p[c+'_w']))
                    elif key in self.fields:
                        s.append('&field%s=%d' % (self.fields[key], p[c+'_w']))
                if len(s):
                    s.insert(0, 'key=%s' % token)
#                    ts = time.strftime('%Y-%m-%dT%H:%M:%SZ',
#                                       time.gmtime(p['time_created']))
#                    s.insert(1, '&datetime=%s' % ts)
                    result = self._urlopen(self.url, ''.join(s))
                    if result:
                        resp = result.read()
                        if resp == '0': # thingspeak indicates failure with 0
                            wrnmsg('TS: upload failed for %s: %s' % (ecm_serial, resp))
                    else:
                        wrnmsg('TS: upload failed for %s' % ecm_serial)
            else:
                wrnmsg('TS: no token defined for %s' % ecm_serial)


class PachubeProcessor(UploadProcessor):
    def __init__(self, *args, **kwargs):
        super(PachubeProcessor, self).__init__(*args, **kwargs)
        self.url     = kwargs.get('pbe_url')   or PBE_URL
        self.token   = kwargs.get('pbe_token') or PBE_TOKEN
        self.feed    = kwargs.get('pbe_feed')  or PBE_FEED
        self.process_period = PBE_UPLOAD_PERIOD
        self.timeout = PBE_TIMEOUT

        infmsg('PBE: upload period: %d' % self.process_period)
        infmsg('PBE: url: ' + self.url)
        infmsg('PBE: token: ' + self.token)
        infmsg('PBE: feed: ' + self.feed)

    def setup(self):
        if not (self.url and self.token and self.feed):
            print 'Pachube Error: Insufficient parameters'
            if not self.url:
                print '  A URL is required'
            if not self.token:
                print '  A token is required'
            if not self.feed:
                print '  A feed is required'
            sys.exit(1)
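
    # the upload body is a JSON document, roughly (illustrative):
    #   {"version": "1.0.0",
    #    "datastreams": [{"id": "XXX123_ch1", "current_value": 1536}]}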

    def process_calculated(self, packets):
        for p in packets:
            data = {
                'version':'1.0.0',
                'datastreams':[]
                }
#            ts = time.strftime('%Y-%m-%dT%H:%M:%SZ',
#                               time.gmtime(p['time_created']))
            for idx,c in enumerate(PACKET_FORMAT.channels(FILTER_PE_LABELS)):
                dpkey = obfuscate_serial(p['serial']) + '_' + c
#                dp = {'id':dpkey, 'at':ts, 'value':p[c+'_w']}
                dp = {'id':dpkey, 'current_value':p[c+'_w']}
                data['datastreams'].append(dp)
            if len(data['datastreams']):
                url = '%s/%s' % (self.url, self.feed)
                result = self._urlopen(url, json.dumps(data))
        # FIXME: need better error handling here

    def _create_request(self, url):
        req = super(PachubeProcessor, self)._create_request(url)
        req.add_header('X-PachubeApiKey', self.token)
        req.get_method = lambda: 'PUT'
        return req

    def _handle_urlopen_error(self, e, url, payload):
        errmsg(''.join(['%s Error: %s' % (self.__class__.__name__, e),
                        '\n  URL:   ' + url,
                        '\n  token: ' + self.token,
                        '\n  data:  ' + payload,]))


class OpenEnergyMonitorProcessor(UploadProcessor):
    def __init__(self, *args, **kwargs):
        super(OpenEnergyMonitorProcessor, self).__init__(*args, **kwargs)
        self.url     = kwargs.get('oem_url')   or OEM_URL
        self.token   = kwargs.get('oem_token') or OEM_TOKEN
        self.process_period = OEM_UPLOAD_PERIOD
        self.timeout = OEM_TIMEOUT

        infmsg('OEM: upload period: %d' % self.process_period)
        infmsg('OEM: url: ' + self.url)
        infmsg('OEM: token: ' + self.token)

    def setup(self):
        if not (self.url and self.token):
            print 'OpenEnergyMonitor Error: Insufficient parameters'
            if not self.url:
                print '  A URL is required'
            if not self.token:
                print '  A token is required'
            sys.exit(1)
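
    # data are passed entirely in the query string, roughly (illustrative):
    #   <oem_url>?apikey=<token>&time=1339100917&json={XXX123_ch1_w:1536}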

    def process_calculated(self, packets):
        for p in packets:
            data = []
            for idx,c in enumerate(PACKET_FORMAT.channels(FILTER_PE_LABELS)):
                dpkey = obfuscate_serial(p['serial']) + '_' + c
                data.append('%s_w:%d' % (dpkey, p[c+'_w']))
            if len(data):
                url = '%s?apikey=%s&time=%s&json={%s}' % (
                    self.url, self.token, p['time_created'], ','.join(data))
                result = self._urlopen(url, '')

    def _create_request(self, url):
        req = super(OpenEnergyMonitorProcessor, self)._create_request(url)
        return req

    def _handle_urlopen_error(self, e, url, payload):
        errmsg(''.join(['%s Error: %s' % (self.__class__.__name__, e),
                        '\n  URL:   ' + url,
                        '\n  token: ' + self.token,
                        '\n  data:  ' + payload,]))


if __name__ == '__main__':
    parser = optparse.OptionParser(version=__version__)

    parser.add_option('-c', '--config-file', dest='configfile', help='read configuration from FILE', metavar='FILE')
    parser.add_option('-p', '--print', action='store_true', dest='print_out', default=False, help='print data to screen')
    parser.add_option('-q', '--quiet', action='store_true', dest='quiet', default=False, help='quiet output')
    parser.add_option('-v', '--verbose', action='store_false', dest='quiet', default=False, help='verbose output')
    parser.add_option('--debug', action='store_true', default=False, help='debug output')
    parser.add_option('--skip-upload', action='store_true', dest='skip_upload', default=False, help='do not upload data')

    parser.add_option('--packet-format', dest='packet_format', default=DEFAULT_PACKET_FORMAT, help='formats include '+', '.join(PACKET_FORMATS)+', default is '+DEFAULT_PACKET_FORMAT)
    parser.add_option('--db-schema', dest='db_schema', default=DEFAULT_DB_SCHEMA, help='schemas include '+', '.join(DB_SCHEMAS)+', default is '+DEFAULT_DB_SCHEMA)

    group = optparse.OptionGroup(parser, 'configuration options')
    group.add_option('--mysql-config', action='store_true', dest='mysql_config', default=False, help='configure mysql database')
    group.add_option('--sqlite-config', action='store_true', dest='sqlite_config', default=False, help='configure sqlite database')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'serial source options')
    group.add_option('--serial', action='store_true', dest='serial_read', default=False, help='read from serial port')
    group.add_option('--serialport', dest='serial_port', help='serial port')
    group.add_option('--baudrate', dest='serial_baud', help='serial baud rate')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'tcp/ip source options')
    group.add_option('--ip', action='store_true', dest='ip_read', default=False, help='read from TCP/IP source such as WIZnet or EtherBee')
    group.add_option('--ip-host', help='ip host')
    group.add_option('--ip-port', help='ip port')
    group.add_option('--ip-mode', help='act as client or server')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'mysql source options')
    group.add_option('--mysql-src', action='store_true', dest='mysql_read', default=False, help='read from mysql database')
    group.add_option('--mysql-src-host', help='source database host')
    group.add_option('--mysql-src-user', help='source database user')
    group.add_option('--mysql-src-passwd', help='source database password')
    group.add_option('--mysql-src-database', help='source database name')
    group.add_option('--mysql-src-table', help='source database table')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'sqlite source options')
    group.add_option('--sqlite-src', action='store_true', dest='sqlite_read', default=False, help='read from sqlite database')
    group.add_option('--sqlite-src-table', help='source database table')
    group.add_option('--sqlite-src-file', help='source database file')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'mysql options')
    group.add_option('--mysql', action='store_true', dest='mysql_out', default=False, help='write data to mysql database')
    group.add_option('--mysql-host', help='database host')
    group.add_option('--mysql-user', help='database user')
    group.add_option('--mysql-passwd', help='database password')
    group.add_option('--mysql-database', help='database name')
    group.add_option('--mysql-table', help='database table')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'sqlite options')
    group.add_option('--sqlite', action='store_true', dest='sqlite_out', default=False, help='write data to sqlite database')
    group.add_option('--sqlite-file', help='database filename')
    group.add_option('--sqlite-table', help='database table')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'rrd options')
    group.add_option('--rrd', action='store_true', dest='rrd_out', default=False, help='write data to round-robin database')
    group.add_option('--rrd-dir', help='directory for rrd files')
    group.add_option('--rrd-step', help='step size in seconds')
    group.add_option('--rrd-heartbeat', help='heartbeat in seconds')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'WattzOn options')
    group.add_option('--wattzon', action='store_true', dest='wattzon_out', default=False, help='upload data using WattzOn API')
    group.add_option('--wo-user', help='username')
    group.add_option('--wo-pass', help='password')
    group.add_option('--wo-api-key', help='API key')
    group.add_option('--wo-map', help='channel-to-meter mapping')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'PlotWatt options')
    group.add_option('--plotwatt', action='store_true', dest='plotwatt_out', default=False, help='upload data using PlotWatt API')
    group.add_option('--pw-house-id', help='house ID')
    group.add_option('--pw-api-key', help='API key')
    group.add_option('--pw-map', help='channel-to-meter mapping')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'EnerSave options')
    group.add_option('--enersave', action='store_true', dest='enersave_out', default=False, help='upload data using EnerSave API')
    group.add_option('--es-token', help='token')
    group.add_option('--es-url', help='URL')
    group.add_option('--es-map', help='channel-to-device mapping')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'PeoplePower options')
    group.add_option('--peoplepower', action='store_true', dest='peoplepower_out', default=False, help='upload data using PeoplePower API')
    group.add_option('--pp-token', help='auth token')
    group.add_option('--pp-hub-id', help='hub ID')
    group.add_option('--pp-url', help='URL')
    group.add_option('--pp-map', help='channel-to-device mapping')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'Eragy options')
    group.add_option('--eragy', action='store_true', dest='eragy_out', default=False, help='upload data using Eragy API')
    group.add_option('--eg-gateway-id', help='gateway id')
    group.add_option('--eg-token', help='token')
    group.add_option('--eg-url', help='URL')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'Smart Energy Groups options')
    group.add_option('--smartenergygroups', action='store_true', dest='smartenergygroups_out', default=False, help='upload data using SmartEnergyGroups API')
    group.add_option('--seg-token', help='token')
    group.add_option('--seg-url', help='URL')
    group.add_option('--seg-map', help='channel-to-device mapping')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'ThingSpeak options')
    group.add_option('--thingspeak', action='store_true', dest='thingspeak_out', default=False, help='upload data using ThingSpeak API')
    group.add_option('--ts-url', help='URL')
    group.add_option('--ts-tokens', help='ECM-to-ID/token mapping')
    group.add_option('--ts-fields', help='channel-to-field mapping')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'Pachube options')
    group.add_option('--pachube', action='store_true', dest='pachube_out', default=False, help='upload data using Pachube API')
    group.add_option('--pbe-url', help='URL')
    group.add_option('--pbe-token', help='token')
    group.add_option('--pbe-feed', help='feed')
    parser.add_option_group(group)

    group = optparse.OptionGroup(parser, 'OpenEnergyMonitor options')
    group.add_option('--oem', action='store_true', dest='oem_out', default=False, help='upload data using OpenEnergyMonitor API')
    group.add_option('--oem-url', help='URL')
    group.add_option('--oem-token', help='token')
    parser.add_option_group(group)

    (options, args) = parser.parse_args()

    # if there is a configuration file, read the parameters from file and set
    # values on the options object.
    if options.configfile:
        if not ConfigParser:
            print 'ConfigParser not loaded, cannot parse config file'
            sys.exit(1)
        config = ConfigParser.ConfigParser()
        config.read(options.configfile)
        for section in config.sections(): # section names do not matter
            for name,value in config.items(section):
                if not getattr(options, name):
                    setattr(options, name, cleanvalue(value))

    if options.quiet:
        LOGLEVEL = LOG_ERROR
    if options.debug:
        LOGLEVEL = LOG_DEBUG
    if options.skip_upload:
        SKIP_UPLOAD = 1

    if options.packet_format == PF_ECM1240BIN:
        PACKET_FORMAT = ECM1240BinaryPacket()
    elif options.packet_format == PF_ECM1220BIN:
        PACKET_FORMAT = ECM1220BinaryPacket()
    elif options.packet_format == PF_GEM48PTBIN:
        PACKET_FORMAT = GEM48PTBinaryPacket()
    elif options.packet_format == PF_GEM48PBIN:
        PACKET_FORMAT = GEM48PBinaryPacket()
    else:
        print "Unsupported packet format '%s'" % options.packet_format
        print 'supported formats include:'
        for fmt in PACKET_FORMATS:
            print '  %s' % fmt
        sys.exit(1)

    if options.db_schema == DB_SCHEMA_COUNTERS:
        DB_SCHEMA = DB_SCHEMA_COUNTERS
    elif options.db_schema == DB_SCHEMA_ECMREAD:
        DB_SCHEMA = DB_SCHEMA_ECMREAD
    elif options.db_schema == DB_SCHEMA_ECMREAD_EXTENDED:
        DB_SCHEMA = DB_SCHEMA_ECMREAD_EXTENDED
    else:
        print "Unsupported database schema '%s'" % options.db_schema
        print 'supported schemas include:'
        for fmt in DB_SCHEMAS:
            print '  %s' % fmt
        sys.exit(1)

    # Database Setup
    # run the database configurator then exit
    if options.mysql_config:
        db = MySQLConfigurator(args, **{
                'mysql_host':      options.mysql_host,
                'mysql_user':      options.mysql_user,
                'mysql_passwd':    options.mysql_passwd,
                'mysql_database':  options.mysql_database,
                'mysql_table':     options.mysql_table,
                })
        db.configure()
        sys.exit(0)
    if options.sqlite_config:
        db = SqliteConfigurator(args, **{
                'sqlite_file':     options.sqlite_file,
                'sqlite_table':    options.sqlite_table,
                })
        db.configure()
        sys.exit(0)

    # Packet Processor Setup
    if not (options.print_out or options.mysql_out or options.sqlite_out or options.rrd_out or options.wattzon_out or options.plotwatt_out or options.enersave_out or options.peoplepower_out or options.eragy_out or options.smartenergygroups_out or options.thingspeak_out or options.pachube_out or options.oem_out):
        print 'Please specify one or more processing options (or \'-h\' for help):'
        print '  --print              print to screen'
        print '  --mysql              write to mysql database'
        print '  --sqlite             write to sqlite database'
        print '  --rrd                write to round-robin database'
        print '  --wattzon            upload to WattzOn'
        print '  --plotwatt           upload to PlotWatt'
        print '  --enersave           upload to EnerSave'
        print '  --peoplepower        upload to PeoplePower'
        print '  --eragy              upload to Eragy'
        print '  --smartenergygroups  upload to SmartEnergyGroups'
        print '  --thingspeak         upload to ThingSpeak'
        print '  --pachube            upload to Pachube'
        print '  --oem                upload to OpenEnergyMonitor'
        sys.exit(1)

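    # create one processor instance for each requested output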
    procs = []

    if options.print_out:
        procs.append(PrintProcessor(args, **{ }))
    if options.mysql_out:
        procs.append(MySQLProcessor(args, **{
                    'mysql_host':      options.mysql_host,
                    'mysql_user':      options.mysql_user,
                    'mysql_passwd':    options.mysql_passwd,
                    'mysql_database':  options.mysql_database,
                    'mysql_table':     options.mysql_table,
                    }))
    if options.sqlite_out:
        procs.append(SqliteProcessor(args, **{
                    'sqlite_file':     options.sqlite_file,
                    'sqlite_table':    options.sqlite_table,
                    }))
    if options.rrd_out:
        procs.append(RRDProcessor(args, **{
                    'rrd_dir':         options.rrd_dir,
                    'rrd_step':        options.rrd_step,
                    'rrd_heartbeat':   options.rrd_heartbeat,
                    }))
    if options.wattzon_out:
        procs.append(WattzOnProcessor(args, **{
                    'wo_api_key':   options.wo_api_key,
                    'wo_user':      options.wo_user,
                    'wo_pass':      options.wo_pass,
                    'wo_map':       options.wo_map,
                    }))
    if options.plotwatt_out:
        procs.append(PlotWattProcessor(args, **{
                    'pw_api_key':   options.pw_api_key,
                    'pw_house_id':  options.pw_house_id,
                    'pw_map':       options.pw_map,
                    }))
    if options.enersave_out:
        procs.append(EnerSaveProcessor(args, **{
                    'es_token':     options.es_token,
                    'es_url':       options.es_url,
                    'es_map':       options.es_map,
                    }))
    if options.peoplepower_out:
        procs.append(PeoplePowerProcessor(args, **{
                    'pp_url':       options.pp_url,
                    'pp_token':     options.pp_token,
                    'pp_hub_id':    options.pp_hub_id,
                    'pp_map':       options.pp_map,
                    }))
    if options.eragy_out:
        procs.append(EragyProcessor(args, **{
                    'eg_url':        options.eg_url,
                    'eg_gateway_id': options.eg_gateway_id,
                    'eg_token':      options.eg_token,
                    }))
    if options.smartenergygroups_out:
        procs.append(SmartEnergyGroupsProcessor(args, **{
                    'seg_url':      options.seg_url,
                    'seg_token':    options.seg_token,
                    'seg_map':      options.seg_map,
                    }))
    if options.thingspeak_out:
        procs.append(ThingSpeakProcessor(args, **{
                    'ts_url':      options.ts_url,
                    'ts_tokens':   options.ts_tokens,
                    'ts_fields':   options.ts_fields,
                    }))
    if options.pachube_out:
        procs.append(PachubeProcessor(args, **{
                    'pbe_url':     options.pbe_url,
                    'pbe_token':   options.pbe_token,
                    'pbe_feed':    options.pbe_feed,
                    }))
    if options.oem_out:
        procs.append(OpenEnergyMonitorProcessor(args, **{
                    'oem_url':     options.oem_url,
                    'oem_token':   options.oem_token,
                    }))

    # Data Collector setup
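    # exactly one collector is created; options left unset fall back to the
    # module-level defaults via the 'or' expressions below.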
    if options.serial_read:
        col = SerialCollector(procs,
                              options.serial_port or SERIAL_PORT,
                              options.serial_baud or SERIAL_BAUD)

    elif options.ip_read:
        if options.ip_mode and options.ip_mode not in ('client', 'server'):
            print "Unknown mode '%s': use client or server" % options.ip_mode
            sys.exit(1)

        mode = options.ip_mode or IP_DEFAULT_MODE
        if mode == 'server':
            col = SocketServerCollector(procs,
                                        options.ip_host or IP_HOST,
                                        options.ip_port or IP_PORT)
        else:
            col = SocketClientCollector(procs,
                                        options.ip_host or IP_HOST,
                                        options.ip_port or IP_PORT)

    elif options.mysql_read:
        col = MySQLCollector(procs,
                             options.mysql_src_host or DB_HOST,
                             options.mysql_src_database or DB_DATABASE,
                             options.mysql_src_table or DB_TABLE,
                             options.mysql_src_user or DB_USER,
                             options.mysql_src_passwd or DB_PASSWD)

    elif options.sqlite_read:
        col = SqliteCollector(procs,
                              options.sqlite_src_file or DB_FILE,
                              options.sqlite_src_table or DB_TABLE)

    else:
        print 'Please specify a data source (or \'-h\' for help):'
        print '  --serial     read from serial'
        print '  --ip         read from TCP/IP'
        print '  --mysql-src  read from mysql database'
        print '  --sqlite-src read from sqlite database'
        sys.exit(1)

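    # read data and dispatch it to the processors until the collector stops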
    col.run()

    sys.exit(0)
