Author Topic: Daily Log file - Is it possible to just record Min, Average and Max Values?  (Read 6523 times)

Phrog30

  • Guest
Here is a little more info on using a database.  My go-to is MySQL, but for something as simple as what you are suggesting I would use SQLite instead, https://www.sqlite.org/index.html.

I think it's the best choice because there is nothing to install; it just works.  I have included a test database that I manually entered data into.  Here are some simple queries:
1.  To get all data
Code: [Select]
select * from testtable
2.  To get min value between two dates
Code: [Select]
select id, t_stamp, min(value) from testtable
where t_stamp between '2019-03-17 1500' and '2019-03-17 1507'
3.  To get max value between two dates
Code: [Select]
select id, t_stamp, max(value) from testtable
where t_stamp between '2019-03-17 1500' and '2019-03-17 1507'
4.  To get avg between two dates
Code: [Select]
select avg(value) from testtable
where t_stamp between '2019-03-17 1500' and '2019-03-17 1507'
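5.  To get min, avg and max in one query (same table and dates as above, just combining the aggregates)
Code: [Select]
select min(value), avg(value), max(value) from testtable
where t_stamp between '2019-03-17 1500' and '2019-03-17 1507'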

To me, it really isn't much simpler than that.  One thing to note: you don't need a block of data in a PLC for this; you just need to log data on change.  My point is that logging data every second, or minute, etc. is wasteful.  Only log when the value moves outside a deadband you think is necessary; for temperature, maybe 1 degree.  The nice thing about using a database is that it's really easy to query data between points in time.  Using a log file each day is fine, but what happens if you want weekly or monthly data?  That's tedious with a log file saved each day.  With the database it's very simple: just query a start and end date.
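
For example, a daily rollup for a whole month is one query.  A sketch, assuming t_stamp is stored in an ISO format like '2019-03-01 15:00' so SQLite's date() can parse it:
Code: [Select]
select date(t_stamp) as day, min(value), avg(value), max(value)
from testtable
where t_stamp between '2019-03-01' and '2019-04-01'
group by day
order by day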

What I have done in the past is use a FIFO for sending data to the database.  This acts as a store-and-forward, so you are somewhat covered through any interruption in connectivity.
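
Roughly like this, as a sketch; the table and column names are just examples, and :last_forwarded_id stands in for whatever your forwarding job tracks:
Code: [Select]
-- local buffer table on the collector side
create table buffer (
  id integer primary key autoincrement,
  t_stamp text,
  value real,
  sent integer default 0
);
-- when connectivity returns, forward the backlog oldest-first
select id, t_stamp, value from buffer where sent = 0 order by id;
-- then mark what made it through
update buffer set sent = 1 where id <= :last_forwarded_id;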

Here are the SQLite files.  Install the browser, then open it and point to the sample project and database.
https://drive.google.com/file/d/1d1BJe-akbPeI1HORQvMjfYLUQMRbSuv2/view?usp=sharing

You can't really mess this up; just open it up and play around with it.  If you don't use it, no big deal, but you might learn something and think it's fun.

If you have any specific questions let me know.  What's nice is there is a lot more help online for databases than there is for PLCs.  So if you get stuck, open Google and search; you will probably find the answer easily.

« Last Edit: March 17, 2019, 03:50:58 PM by Phrog30 »

DavidSr

  • Full Member
  • ***
  • Posts: 170
    • View Profile
I think your idea is good, James.  I downloaded and looked at the file you created, and I would encourage Hugh to do so.


I actually agree with you if a person wants to keep the data so it is not overwritten and do whatever with it.  But I do not agree with not storing it in the PLC.  It does not tax the PLC a bit to keep it there, it is not at all wasteful, and it is the most efficient way to make it available to the HMI for display and for use by the PLC if it is needed for any type of process control.

 
Everything he said he wants can be done with a few instructions and displayed on the HMI.  If he wants to keep it indefinitely then I agree with you: store it in a DB, but still keep the data file in the PLC for all new data, for whatever time period he wants to display on the HMI.  There is no need to take it from the PLC, put it in a DB, and put it back in the HMI.

 
Also, any part of the data file can be averaged; you don't have to average the whole file, you can start and stop anywhere in the file you want.  You can also get the min/max from any part of the file, and with many PLCs it takes only one instruction each to do that.

 
We have 6 months of data for two furnaces stored in the PLC: run time, total fuel used, and total cost to operate, per furnace, plus a dryer and water heater.  The fuel-used computations are based on the BTU rating of each appliance.  Everything is calculated in the PLC.  (We will be adding CTs to electrical equipment as we go forward with our Total Energy Monitoring Plan to monitor costs and plan expenditures.)

 
We take only parts of those six-month files to display average fuel used per day over the last 30 days and the last 7 days, plus real-time usage per day, for each of the two furnaces.  For now we only keep a running total for the other appliances because they don't use much.  Cost per day, in real time, is also displayed.

 
We also take the total fuel usage of all the equipment and project when we will need a refill, updated every time an appliance cycles.  All the data is displayed on the HMI, updated every scan, in less than a second.  Now that's six months of data, and we have enough memory in the PLC that we could keep it longer, but we don't need more than that, so why put it in a DB?  We don't know what more we would do with it unless we wanted to keep it year to year.

 
If the HMI is not running, you get no data; that is the weak link in using a SQL solution to store the new data.  In an industrial setting where the PLC is controlling the equipment you are monitoring, if there is data to be had, it will be in the PLC.  If the PLC is not running, there is no production data to be acquired anyway until it is back in run mode, so nothing is lost.

 
Another advantage is that if you connect a different HMI, the data is still in the PLC; all you have to do is load the config file into the HMI.  And it is not HMI-type dependent: you can put it on whatever HMI is compatible with the PLC.  You can't do that if the data is stored in the HMI.

 
I think you have probably done PLC work long enough to know the PLC is far more efficient at directly displaying the information on the HMI without putting a PC in the middle to process the data and push the processed info to the HMI.  There is no way a PC-based solution for that is more efficient, or even close to the speed the PLC manages on its own.

 
Where I agree with you is keeping the data long term, but for everything else he wants, he does not need an external DB, as long as the PLC has enough memory to store what he wants.

 
I am grateful for your presenting this alternative, and I may use it myself if I want to keep the data to compare from year to year, but I will still store the new data in the PLC, which I back up daily to a network drive.

 
On a side note, I am amused at the Windows people who hated Novell NetWare for its command-line interface (no GUI) and yet rave about SQL.  (I like NetWare.)

 
I would say the best of both worlds here is both solutions together, not one or the other.  So thanks for that idea.

 
(They only keep me alive because I know DOS)
« Last Edit: March 18, 2019, 09:35:23 AM by DavidSr »
David

Phrog30

  • Guest
...not at all wasteful and it is the most efficient way to make it available...
I was referring to the database side.  It really isn't best practice to continually send the same data over and over.  Sending on data change is really more efficient.  If you need to do interpolation you can, but generally you wouldn't need to.
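
The idea, as a sketch; it assumes the testtable layout from my earlier post, and :now and :new_value are placeholders for whatever your logger supplies:
Code: [Select]
-- insert only when the reading moves outside a 1-degree deadband
-- (the very first row needs seeding, since the subselect is NULL on an empty table)
insert into testtable (t_stamp, value)
select :now, :new_value
where abs(:new_value - (select value from testtable
                        order by t_stamp desc limit 1)) >= 1.0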

On a side note, not all PLCs allow users to create big blocks of memory.  For example, a MicroLogix restricts a data file to 256 elements, so that could be a deal-breaker for some.  Or some brands might not even have instructions or any means to do averaging, etc.

DavidSr

  • Full Member
  • ***
  • Posts: 170
    • View Profile

...It really isn't best practice to continually send the same data over and over.  Sending on data change is really more efficient...
I agree....
David

Stumpy

  • Newbie
  • *
  • Posts: 19
    • View Profile
Phrog30, Sorry to bother you....

Thank you for the demo download.  I have had a play with the table and the supplied data, and it seemed to work great.  It looks like a really simple, easy tool for slicing and dicing loaded data!  To this end I eagerly wanted to try my own sample data from my BasicDataLogger2 daily log file (i.e. a raw data text file, but I renamed the file extension from txt to csv, allowing me to easily import it into both Excel and DB Browser for comparison).  Note the daily log file is simply 240 data samples for a few PLC items (temperatures and other bits n bobs), i.e. recording every 6 minutes over a 24-hour period.

For the Min, Avg & Max aggregate functions I am only applying the calculation to the 2nd column (field2), which is ambient air temperature.

I expected all the results to be identical, but I was surprised by my findings and can only conclude I am doing something wrong.

MS Excel results (which to me seem correct):
Min   -11.99817  04/03/2019 23:47
Avg   -7.186637
Max   -3.794922  04/03/2019 12:53 and 12:59


DB Browser results... (Version 3.9.1, Qt Version 5.7.0, SQLCipher Version 3.11.0)
1. The Average function in DB Browser for SQLite worked as expected, giving -7.186637 (GREAT RESULT)
2. The Minimum function in DB Browser result = -10.09155 (which seems wrong)
3. The Maximum function in DB Browser result = -9.979401 (also seems wrong)

Using the select statement to include date & time (field1), I get:

select field1, min(field2) from PLCDataLog_2019_03_04

"04/03/2019 00:17"   "-10.09155"                                  <---Row 3


select field1, max(field2) from PLCDataLog_2019_03_04

"04/03/2019 00:23"   "-9.979401"                                 <---Row 4

Oddly, DB Browser seems to repeatedly point to the 3rd & 4th rows of field2 to return its version of Min & Max.
It's great that it's pointing to something that exists in my table, but again, oddly, the 1st temperature is the minimum of the first four, so why did it not return the 1st-row result and then stop??

In addition, the Avg function must be using the correct values for its average to match the Excel-calculated answer.  This adds to the intrigue!

So far those are my findings regarding DB Browser Min & Max aggregation, and in conclusion I am left scratching my head!

I have attached my sample data csv file for "fun"-filled problem solving (just to get an impartial view before contemplating a bug report).

Also, I tried the plot window, which looks very useful (unless I use select!).  All the data "is" available on initial file import, but then, as expected, it vanishes as the select command serves up only those items returned by the select request.  All good so far, e.g. select max(field2) from PLCDataLog_2019_03_04

However, when select * is used, i.e. "select * from PLCDataLog_2019_03_04", I would expect field1 (my date & time) to return to the plot screen, but magically field1 remains missing from the plot window even though it is in plain view in the results table below the select command window.  It would appear that I am unable to plot anything against time without field1 being present in the plot window.

Not sure now whether to continue scratching my head at this point or close my gaping mouth... it seems a bit baffling to me....

I did try removing the "_" from the table/import name in case it was some sort of undocumented qualifier causing the stop at rows 3 & 4.  It would seem not, as it made no difference to my results.

Can you shed any light on the above teething problems?


Regards

Hugh

Phrog30

  • Guest
For the Min, Avg & Max aggregate functions I am only applying the calculation to the 2nd column (field2), which is ambient air temperature.
I'm not following; what calculation on field2?

Also, I have never imported data; how exactly did you do it?  What is the table structure?  There is a Database Structure tab; can you show the columns and their types?  I recommend creating an ID column and making it the primary key with auto-increment.

For testing I created another column and entered random values, all of them negative, and my queries work fine.
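
One hunch worth checking: if the import brought field2 in as TEXT, then min() and max() compare the values as strings (so '-9.979401' sorts above '-3.794922'), while avg() quietly converts them to numbers, which would match your symptoms.  A quick check and workaround, sketched against your table name:
Code: [Select]
-- see what type the import actually produced
select typeof(field2) from PLCDataLog_2019_03_04 limit 1;
-- if it says 'text', cast before aggregating
select field1, min(cast(field2 as real)) from PLCDataLog_2019_03_04;
select field1, max(cast(field2 as real)) from PLCDataLog_2019_03_04;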

Phrog30

  • Guest
I took your CSV file, added a new column "A", made it an index 1-240, then imported into DB Browser.  I used your queries, offset by one since I added a column, and the results were:
Code: [Select]
select field2, min(field3) from testMe
4/3/2019 23:47 -11.99817

Code: [Select]
select field2, max(field3) from testMe
4/3/2019 12:29 -3.794922

Looks like the results you were expecting, correct?

Stumpy

  • Newbie
  • *
  • Posts: 19
    • View Profile
Yes, your results are what I was expecting.

Stumpy

  • Newbie
  • *
  • Posts: 19
    • View Profile
So I guess each csv file needs modification to have its own unique index column added; perhaps my assumption that the date & time column would suffice was wrong.

Also, I noticed there are some options in DB Browser to create an index.  I can try them out.

Thank you for your prompt response!

Hugh

Phrog30

  • Guest
So I guess each csv file needs modification to have its own unique index column added; perhaps my assumption that the date & time column would suffice was wrong.
Not really.  You should have at least one column that is indexed, and when you make it a primary key it is automatically indexed.  To me, it's just easier with an ID column.

The queries should have worked bringing in the data as is.  Perhaps something else got messed up.
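
For the record, the kind of layout I mean (a sketch; the names are just placeholders):
Code: [Select]
create table PLCDataLog (
  id integer primary key autoincrement,  -- primary key, so indexed automatically
  t_stamp text,  -- date & time
  value real     -- the reading
);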

Stumpy

  • Newbie
  • *
  • Posts: 19
    • View Profile
OK, thanks for the feedback... No pain, no gain, as they say!  I will try the DB Browser update and play with it a bit....

Cheers

Hugh

Stumpy

  • Newbie
  • *
  • Posts: 19
    • View Profile
The update from 3.9 to 3.11.1 of DB Browser for SQLite seems better on the plots, but not perfect... a work in progress on their part, I assume; time will tell!


DavidSr

  • Full Member
  • ***
  • Posts: 170
    • View Profile
Glad to see you are trying James' solution, Hugh; you seem to be making good progress.  I downloaded it, but have since seen there is a 64-bit version.  I have not tried it though; I will wait to see James' comment on that.
James is exactly right about having an indexed field, and although it does not have to be an "ID" (it can be any field), you should do it that way because it will most likely be the only field in your application with unique contents.
« Last Edit: March 23, 2019, 06:24:06 AM by DavidSr »
David

Stumpy

  • Newbie
  • *
  • Posts: 19
    • View Profile
Hello David

I am sure James soon will think I am very "trying!" but where there is a will there is a way!

Just for background info...
I note what you both are saying about the index etc.  The "time and date" field1 in my sample csv file is unique, and DB Browser has an auto-index option (a tick box under the "Edit Pragmas" tab, adjacent to the Execute SQL tab), which results in a row ID from 1 to 240, automated and seemingly correct.  So I am assuming the unique index is all taken care of as the raw csv file is imported, i.e. the Import option under the File menu, selecting the csv and accepting the default options.

On the graph plot (DB Browser ver 3.9), initially everything worked, but as I used the Execute SQL tab to create some select commands (i.e. to filter my data), the graph plot literally lost the plot!  Downloading DB Browser ver 3.11.1 improved the situation, but it is not perfect yet.  I think a few more "gremlins" are in their code...

This is no show-stopper for me, just part of my learning curve.  I think I will continue to use DB Browser as an engineer's tool.  The plot window is really quick and easy to use, but keep in mind there may be a few undocumented features you will come across!

Hugh




DavidSr

  • Full Member
  • ***
  • Posts: 170
    • View Profile
Hello David

I am sure James soon will think I am very "trying!" ..............................

Hugh
James will be very helpful, and as long as you are trying to work through things he won't grow weary.  It's the ones that expect people here to do it all "FOR" them, with little or no effort on their part, that will lose support from those here, or for that matter on pretty much any support forum.  James has been here a long time and has always demonstrated his willingness to help.

I won't comment on your posts regarding how to handle the issues you raised as long as James is helping you.  He is probably more capable than I am in this particular application, and all I would do is get in the way.

You never did tell us what PLC you are using, unless I missed it.  Sharing that information is critical for knowing how to help you on the PLC side of things.
David