InfluxDB, Getting Started

baxter
Posts: 9
Joined: Sat Apr 08, 2017 2:26 pm

InfluxDB, Getting Started

Post by baxter » Wed Jul 21, 2021 12:43 pm

I'm having trouble getting started with InfluxDB on a Raspberry Pi. I've installed influxdb and influxdb-python. It looks like it's connecting with an admin account, but I get this when it's ready to write to the DB:

Code: Select all

2021/07/21 13:33:40 buffer info:
2021/07/21 13:33:40   01021554:   2 of 120 (1626888820)
2021/07/21 13:33:40   01021578:   2 of 120 (1626888820)
2021/07/21 13:33:40   01100487:   1 of 120 (1626888810)
2021/07/21 13:33:40 processing with InfluxDBProcessor
2021/07/21 13:33:40 2 buffered packets sn:01021554
2021/07/21 13:33:40 1 calculated packets sn:01021554
2021/07/21 13:33:40 Exception in InfluxDBProcessor: 'ch1_a'
Traceback (most recent call last):
  File "btmon3.py", line 2249, in process
    p.process_compiled(self.packet_collector.packet_buffer)
  File "btmon3.py", line 2838, in process_compiled
    self.process_calculated(packets)
  File "btmon3.py", line 4299, in process_calculated
    values['fields'][value_name] = p[c] * 1.0
KeyError: 'ch1_a'
2021/07/21 13:33:40 SOCKET: waiting for connection
^C2021/07/21 13:33:43 cleanup InfluxDBProcessor
2021/07/21 13:33:43 cleanup SocketServerCollector
2021/07/21 13:33:43 SOCKET: closing socket
The code creates the database, but nothing ever gets written. I can manually insert records from the command-line interface and read them back, but btmon doesn't write to the database. It looks like something is failing in the buffer collection routine. The system works fine when I write to MySQL.
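The rough influxdb-python equivalent of that command-line sanity check would be something like this (just a sketch, untested as written, with a placeholder field name):

Code: Select all

# Sketch: write one point and read it back with influxdb-python, using the
# same connection settings as the [influxdb] config section below.
from influxdb import InfluxDBClient

client = InfluxDBClient(host='localhost', port=8086,
                        username='admin', password='pass',
                        database='btmon')

point = {
    'measurement': 'energy',
    'fields': {'test_value': 1.0},   # placeholder field name
}
client.write_points([point])

# Read it back to confirm the write landed.
for row in client.query('SELECT * FROM energy ORDER BY time DESC LIMIT 1').get_points():
    print(row)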

Here's the config section FWIW:

Code: Select all

[influxdb]
influxdb_out = true
influxdb_host = localhost
influxdb_port = 8086
influxdb_upload_period = 60
influxdb_username = admin
influxdb_password = pass
influxdb_database = btmon
influxdb_measurement = energy
influxdb_mode = row
influxdb_db_schema = ecmreadext
Not sure where to go from here.
baxter
Posts: 9
Joined: Sat Apr 08, 2017 2:26 pm

Re: InfluxDB, Getting Started

Post by baxter » Sun Jul 25, 2021 8:12 pm

I managed to figure it out (I think) by walking through the btmon3 Python code. The trick is to create a massive map that translates the values coming from the GEM streams into the field names you want in the InfluxDB database. I had assumed that if there was no map, the field name would simply default to the incoming stream name.

Well, you know what they say about "assume".

Through trial and error (and reading the code, although my Python is elementary), I figured out the mapping syntax. I have four GEMs, each with 32 channels (aws and pws), plus 4 pulses, 8 temperatures, and voltage. That works out to 77 data points per GEM. That's a lot to deal with, so I built a spreadsheet listing every channel along with its name according to my circuit breaker listing. I already had a spreadsheet identifying all of the breakers and which GEM device/channel each one goes to, so I extended it to generate the really long map string.

Then the task was to identify what was coming over in the stream, and what btmon3 was looking for. This was straightforward using the --debug flag on btmon3.

The spreadsheet looks something like this:

Code: Select all

01021554	ch1	aws	HouseL	C01	Mistress_attic		01021554_ch1_aws	HouseL_C01_Mistress_attic_aws		01021554_ch1_aws,HouseL_C01_Mistress_attic_aws,
01021554	ch1	pws	HouseL	C01	Mistress_attic		01021554_ch1_pws	HouseL_C01_Mistress_attic_pws		01021554_ch1_pws,HouseL_C01_Mistress_attic_pws,
01021554	ch2	aws	HouseL	C02	Garage_wall_outlets	01021554_ch2_aws	HouseL_C02_Garage_wall_outlets_aws	01021554_ch2_aws,HouseL_C02_Garage_wall_outlets_aws,
01021554	ch2	pws	HouseL	C02	Garage_wall_outlets	01021554_ch2_pws	HouseL_C02_Garage_wall_outlets_pws	01021554_ch2_pws,HouseL_C02_Garage_wall_outlets_pws,
01021554	ch3	aws	HouseL	C03	Hallway_outlets		01021554_ch3_aws	HouseL_C03_Hallway_outlets_aws		01021554_ch3_aws,HouseL_C03_Hallway_outlets_aws,
01021554	ch3	pws	HouseL	C03	Hallway_outlets		01021554_ch3_pws	HouseL_C03_Hallway_outlets_pws		01021554_ch3_pws,HouseL_C03_Hallway_outlets_pws,
The last column contains all of the map pairs. It was just a matter of pasting that column into a code editor and replacing all newlines with nothing.
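If you'd rather script that last step than do it in the spreadsheet, a few lines of Python can build the same map string from a tab-delimited export (just a sketch; the file name and column positions are from my layout above, so adjust to yours):

Code: Select all

# Sketch: build the influxdb_map value from a tab-delimited export of the
# spreadsheet above. Adjust the column indexes to your own layout; here
# columns 7 and 8 hold the stream name and the friendly field name.
import csv

pairs = []
with open('channel_map.tsv', newline='') as f:   # hypothetical file name
    for row in csv.reader(f, delimiter='\t'):
        if len(row) < 8:
            continue
        stream_name = row[6].strip()   # e.g. 01021554_ch1_aws
        field_name = row[7].strip()    # e.g. HouseL_C01_Mistress_attic_aws
        pairs.append(stream_name + ',' + field_name)

# Join with commas, no whitespace anywhere, and no trailing comma.
print('influxdb_map=' + ','.join(pairs))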

Putting these together, I was able to create a massive influxdb_map field that had all fields mapped. So my configuration file now looks something like this:

Code: Select all

[source]
device_type=gem
ip_read=true
ip_port=8003
ip_mode=server
[influxdb]
influxdb_out=true
influxdb_host=localhost
influxdb_port=8086
influxdb_upload_period=60
influxdb_username=admin
influxdb_password=pword
influxdb_database=btmon
influxdb_measurement=energy
influxdb_mode=row
influxdb_map=01021554_ch1_aws,HouseL_C01_Mistress_attic_aws,
01021554_ch1_pws,HouseL_C01_Mistress_attic_pws,
01021554_ch2_aws,HouseL_C02_Garage_wall_outlets_aws,
01021554_ch2_pws,HouseL_C02_Garage_wall_outlets_pws,
01021554_ch3_aws,HouseL_C03_Hallway_outlets_aws,
01021554_ch3_pws,HouseL_C03_Hallway_outlets_pws,
...etc
01021578_t7,Barn_T07_Temp_7,
01021578_t8,Barn_T08_Temp_8,
01021578_volts,Barn_V_volts
influxdb_db_schema=counters
I broke up this example to show the mapping pairs, but it's important that there be no whitespace (space, tab, newline) in the map. That screws up the parsing of the map when it loads. Also, make sure that the last entry does not have a trailing comma.
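As far as I can tell from the code, the map is treated as one flat comma-separated list that gets paired up from/to, which is why stray whitespace breaks the lookups. A simplified illustration of the idea (not the actual btmon3 parsing code):

Code: Select all

# Simplified illustration only, not the real btmon3 parser: the map string
# is split on commas and consumed as from/to pairs.
map_str = ('01021554_ch1_aws,HouseL_C01_Mistress_attic_aws,'
           '01021554_ch1_pws,HouseL_C01_Mistress_attic_pws')
items = map_str.split(',')
mapping = dict(zip(items[0::2], items[1::2]))
print(mapping)
# {'01021554_ch1_aws': 'HouseL_C01_Mistress_attic_aws',
#  '01021554_ch1_pws': 'HouseL_C01_Mistress_attic_pws'}

# A stray space or newline ends up inside a key ('01021554_ch1_aws ' is not
# '01021554_ch1_aws'), so the incoming value never matches its mapping.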

Also, there was a bug in the btmon3 software that caused a failure-to-insert-record error if a field came along that wasn't matched. That would explain the "Exception in InfluxDBProcessor: 'ch1_a'" error in the first post of this thread.

The error is in the process_calculated function around line 4280. I added a check that the key exists in the map:

Code: Select all

        for p in packets:
            dev_serial = obfuscate_serial(p['serial'])
            for c in PACKET_FORMAT.channels(self.db_schema):
                key = mklabel(p['serial'], c)
                if self.map and key not in self.map: # BT: added to avoid error later on
                    continue
I haven't tested this with other configurations, but it seems to work with InfluxDB.
ben
Site Admin
Posts: 4262
Joined: Fri Jun 04, 2010 9:39 am

Re: InfluxDB, Getting Started

Post by ben » Mon Jul 26, 2021 9:38 am

I didn't have to create a map myself when testing out the code for the ecmreadext implementation. That was with an older version of InfluxDB, however.

What version are you running? Maybe they made some non-backwards compatible changes.
Ben
Brultech Research Inc.
E: ben(at)brultech.com
baxter
Posts: 9
Joined: Sat Apr 08, 2017 2:26 pm

Re: InfluxDB, Getting Started

Post by baxter » Mon Jul 26, 2021 10:29 am

If I don't have a field mapped, it just gets skipped.

I have InfluxDB 1.8.6. I'm using the counters schema. I haven't found any documentation on the difference between counters, ecmread, and ecmreadext, but counters seems to grab everything.

As long as all fields are mapped.
ben
Site Admin
Posts: 4262
Joined: Fri Jun 04, 2010 9:39 am

Re: InfluxDB, Getting Started

Post by ben » Mon Jul 26, 2021 3:44 pm

baxter wrote:
Mon Jul 26, 2021 10:29 am
If I don't have a field mapped, it just gets skipped.

I have InfluxDB 1.8.6. I'm using the counters schema. I haven't found any documentation on the difference between counters, ecmread, and ecmreadext, but counters seems to grab everything.

As long as all fields are mapped.
I'm pretty sure the difference is:

ecmreadext gives you amps, watts, counters, and delta kwh.
ecmread gives you amps, watts, counters.
counters just gives you the counters.
Ben
Brultech Research Inc.
E: ben(at)brultech.com
baxter
Posts: 9
Joined: Sat Apr 08, 2017 2:26 pm

Re: InfluxDB, Getting Started

Post by baxter » Tue Jul 27, 2021 9:45 am

That's helpful. Thanks, Ben!

Now I can see four more facets. Total now 820 mapped fields (205 per GEM). Does that sound about right?
ben
Site Admin
Posts: 4262
Joined: Fri Jun 04, 2010 9:39 am

Re: InfluxDB, Getting Started

Post by ben » Tue Jul 27, 2021 11:34 am

baxter wrote:
Tue Jul 27, 2021 9:45 am
That's helpful. Thanks, Ben!

Now I can see four more facets. Total now 820 mapped fields (205 per GEM). Does that sound about right?

Code: Select all

        elif fltr == FILTER_DB_SCHEMA_ECMREADEXT:
            c = ['volts']
            for x in range(1, self.NUM_CHAN + 1):
                c.append('ch%d_a' % x)
            for x in range(1, self.NUM_CHAN + 1):
                c.append('ch%d_w' % x)
            for x in range(1, self.NUM_CHAN + 1):
                c.append('ch%d_wh' % x)
            for x in range(1, self.NUM_CHAN + 1):
                c.append('ch%d_dwh' % x)
            for x in range(1, self.NUM_PULSE + 1):
                c.append('p%d' % x)
            for x in range(1, self.NUM_SENSE + 1):
                c.append('t%d' % x)
 
I think there should be only 141 with the above information.

Code: Select all

        elif fltr == FILTER_DB_SCHEMA_COUNTERS:
            c = ['volts']
            for x in range(1, self.NUM_CHAN + 1):
                c.append('ch%d_aws' % x)
                c.append('ch%d_pws' % x)
            for x in range(1, self.NUM_PULSE + 1):
                c.append('p%d' % x)
            for x in range(1, self.NUM_SENSE + 1):
                c.append('t%d' % x)
The counters schema would add the aws/pws fields (another 64 values, giving the 205 above), which may not be needed if you're using ecmreadext, since they're translated into watts/kWh.
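Quick arithmetic check, assuming 32 channels, 4 pulse inputs, and 8 temperature sensors per GEM:

Code: Select all

# Field counts per GEM with NUM_CHAN=32, NUM_PULSE=4, NUM_SENSE=8.
NUM_CHAN, NUM_PULSE, NUM_SENSE = 32, 4, 8

ecmreadext = 1 + 4 * NUM_CHAN + NUM_PULSE + NUM_SENSE  # volts + a/w/wh/dwh      -> 141
counters   = 1 + 2 * NUM_CHAN + NUM_PULSE + NUM_SENSE  # volts + aws/pws         -> 77
combined   = ecmreadext + 2 * NUM_CHAN                 # ecmreadext plus aws/pws -> 205

print(ecmreadext, counters, combined)  # 141 77 205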

If you want to combine the two, you may be able to edit below line 1937 and add the aws/pws fields.
Ben
Brultech Research Inc.
E: ben(at)brultech.com
baxter
Posts: 9
Joined: Sat Apr 08, 2017 2:26 pm

Re: InfluxDB, Getting Started

Post by baxter » Thu Jul 29, 2021 9:01 am

Thanks, Ben, that was helpful. Not sure exactly what I'll need until I can get Grafana up and running. Best to have it all and then strip back later.

Now I have another strange issue: the monitor just stops after a while. No errors thrown, no indication of why. Here's the last of the log output with the --debug option set:

Code: Select all

2021/07/28 12:37:57 waiting for data from device
2021/07/28 12:37:57 reading 1 of 1 packets
2021/07/28 12:38:01 SOCKET: read 1 of 1 bytes from socket: b'fe'
2021/07/28 12:38:01 SOCKET: read 1 of 1 bytes from socket: b'ff'
2021/07/28 12:38:01 SOCKET: read 1 of 1 bytes from socket: b'05'
2021/07/28 12:38:01 SOCKET: read 619 of 619 bytes from socket: b'04e06f4bf40e01b6f8f717016f82187f01e6093d5800ccbdbba400dd833310000bf2c8af00d819822800e294000000cd29a30d00e141600700e5f5de0f0024b4270000253bd601001b98000000c739260700e0ed320d0022e3dd0500acd5c62600ba548a080092a76f1700525cfc0100b4a188cb06c57da9c50665aff500006f10fa0600b7008517008a36010000cd518501002c93cb030018245d0900c03d4c090000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000af721ac30685ee862f0600000000000000000000000000000000000000000000000000000000000000000000000000000000200062010b0500005e02000051040000000000000000230000000000000000000500000006001e0061000800813261320500ce01a400000000000000ca00c900100f0e0d0c0b0a090807060504030201544a000a2500920520020000150100003e020000000000000000180000000000000000000900000006001100300009005b0a550a0600f1004f0000000000000057005700000700801b00801800800100800c0080280080120080004109809662809566015cbb380000000000000000000000000002000000000000000000000000000015071c0c251b'
2021/07/28 12:38:01 SOCKET: read 1 of 1 bytes from socket: b'ff'
2021/07/28 12:38:01 SOCKET: read 1 of 1 bytes from socket: b'fe'
2021/07/28 12:38:01 SOCKET: read 1 of 1 bytes from socket: b'36'
2021/07/28 12:38:01 buffering packet ts:1627490281 sn:01021578
2021/07/28 12:38:01 SOCKET: closing connection
2021/07/28 12:38:01 buffer info:
2021/07/28 12:38:01   01021578:  29 of 120 (1627490281)
2021/07/28 12:38:01   01021554:  29 of 120 (1627490277)
2021/07/28 12:38:01   01100487:  28 of 120 (1627490269)
2021/07/28 12:38:01 processing with InfluxDBProcessor
2021/07/28 12:38:01 waiting 14 seconds to process packets for 01021578
2021/07/28 12:38:01 waiting 10 seconds to process packets for 01021554
2021/07/28 12:38:01 waiting 18 seconds to process packets for 01100487
2021/07/28 12:38:01 SOCKET: waiting for connection
2021/07/28 12:38:01 waiting for data from device
2021/07/28 12:38:01 reading 1 of 1 packets
2021/07/29 09:43:58 SOCKET: closing connection
2021/07/29 09:43:58 cleanup InfluxDBProcessor
2021/07/29 09:43:58 cleanup SocketServerCollector
2021/07/29 09:43:58 SOCKET: closing socket
It's the last four lines that matter. After "reading 1 of 1 packets", it usually reads the header, then the data, then the footer. But instead, it just closes the connection, cleans up, and closes the socket.

And then nothing more gets logged.

Why would it do such a thing? And how do I make it recover?
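In the meantime, as a stopgap I may just wrap btmon3 in a small supervisor loop so it at least restarts if it quietly exits (just a sketch; swap in whatever command line you normally use):

Code: Select all

# Stopgap sketch: restart btmon3 whenever it exits, with a pause so a crash
# loop doesn't spin. Replace CMD with however you normally launch btmon3.
import subprocess
import time

CMD = ['python3', 'btmon3.py', '--debug']   # plus your usual options

while True:
    rc = subprocess.call(CMD)
    print('btmon3 exited with code %d, restarting in 30 seconds' % rc)
    time.sleep(30)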