Forcing WordPress Back to Allowing Classic Editor

Having had my last post trashed by WordPress’s block editor, I have been avoiding using WordPress at all; however, yesterday I needed to go have a look at a post while using my laptop (which I almost never use with WordPress).

When I got into the Posts view, the posts were being displayed in the old format. Lo and behold the Add | Classic Mode option was again available. Perplexing because I am sure there was no method to bring classic mode up in the past several months. I know I tried.

Then I noticed the ‘Screen Options’ button and found that it controls which way posts are displayed.

Went back to my desktop and now it too was magically back in Classic View mode.

I have to assume WordPress has done something recently to make this option available again. I don’t know how long it will last, as it has been my understanding they are getting rid of classic mode.

Nevertheless, here is how I switched from showing posts in the Default View (where Classic Mode is NOT available), to showing posts in the Classic View (where it is available).

Select ‘Posts’ from the left column and you will see posts listed the new way with a Screen Options drop down in the upper right corner. Click on this:

This will show the two display formats. Select classic view:

Now to add a new post using the classic editor, click on the drop down arrow and select Classic Editor.

As I said, I am nearly positive this was not available a few months ago. I’m still skeptical that the Classic Editor will be yanked away from me again so I’m not any too excited about posting anything else here.

Fool me once, shame on you. Fool me twice, shame on me.




Posted in c-Misc | Leave a comment

PMS5003 Air Quality Monitor Part 4 – Installing the Hardware (AKA WordPress BLOCK Mode F****** me)

I wrote a complete explanation of how I installed the PMS5003 and WordPress decided to discard it for me even though I was saving drafts every couple of paragraphs.

A few months ago WordPress decided to drop support for their ‘classic mode’ method for creating posts. Now you have to use their slick as chicken shit block mode editor or whatever the hell they call it. While parts of the new editor are easier to use, I find myself struggling often to do the simplest of things that I already knew how to do in classic mode.

As I have tried to explain to others many many times NEWER rarely means BETTER. It just means DIFFERENT and quite possibly WORSE.

WordPress, along with way too many other products, continues to spend a huge amount of time working on form over function, and the end result is a product that is far worse than what they started with.

I am so disappointed in losing this post that it may very well be my last. I see no point in spending an hour or more writing something that has a small audience only to have it sent to the bit bucket. I keep copious notes in my own project file and I can refer back to those rather than attempt to share my projects here.


Posted in c-electronics | Tagged , , , | 2 Comments

PMS5003 Air Quality Monitor Part 3 – Data Presentation

After getting a week or so worth of data it was time to decide how I wanted to view it. I knew I wanted to see the output on a web page. The software for this project is going to be installed on the same Raspberry Pi I use to monitor my weather station. That already supports Apache so it will be easy enough to just create a new web page for Air Quality.

In the back of my head, I kind of knew how I wanted to see data – a series of graphs: one for today, one for the week, etc.

I started by extracting data from the SQLITE db in CSV and using LibreOffice’s Calc program to create Charts of what I thought I wanted to see. This gave me a good opportunity to see what queries I would need and how I would need to summarize the data.

For example, while I could produce a nice daily report using every data point in the database, that was too much data for the 30 day report. That data needed to be averaged. But when I averaged the data I lost the max values. After some experimenting I decided most of the reports needed the data points to be averaged, but then I would also graph the max for each averaged sample.
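As a rough sketch of the idea (not the actual queries used later; the table and column names just mimic this series' schema), averaging into buckets while keeping each bucket's maximum might look like:

```python
import sqlite3

# Illustrative sketch: downsample readings into hourly buckets, keeping
# both the average and the bucket maximum so peaks are not lost.
con = sqlite3.connect(':memory:')
con.execute("create table observations (date text, pm25Env integer)")
con.executemany("insert into observations values (?, ?)",
                [('2021-06-08 10:05', 2), ('2021-06-08 10:35', 10),
                 ('2021-06-08 11:05', 4), ('2021-06-08 11:35', 6)])

summary = con.execute("""
    select strftime('%Y-%m-%d %H:00', date) as bucket,
           avg(pm25Env), max(pm25Env)
    from observations
    group by bucket
    order by bucket
""").fetchall()
print(summary)
# -> [('2021-06-08 10:00', 6.0, 10), ('2021-06-08 11:00', 5.0, 6)]
```

Each row carries both the smoothed average and the peak for that bucket, which is what the graphs below end up needing.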

Writing a script to extract data in CSV format using SQLITE3 was easy enough. But how to produce a graph for a web page?

Many years ago I had to do the same thing for a massive network monitoring project. Network performance data was summarized from a MySQL database and then gnuplot used to produce JPG files of graphs. I don’t know if that is still the best way to handle generating graphs of SQL data, but I know it works so that is how I proceeded.

Some Program Changes

While experimenting with the existing Pascal program, I found a couple of changes needed to be made from what I posted previously. The Pascal program needed to flush console output because its output now goes to a log file that I monitor with the Linux tail -f command.

The database was modified to correct a fieldname, and an index was added to the date field since we will always be searching data based on date.

Further, a new script was created to allow operation from cron. This script will verify the aqmonitor is not already running and will fix up the LCK..ttyUSB0 problem if it exists.
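I can't reproduce the script itself here, but the logic it implements — do nothing if the monitor is already running, otherwise treat any leftover lock file as stale — can be sketched like this (the program name and UUCP-style lock path are assumptions based on this post, not the real script):

```python
import os
import subprocess

# Hypothetical sketch of the cron wrapper's logic. 'aqmonitor' and the
# lock path are assumptions, not copied from the author's script.
PROG = 'aqmonitor'
LOCKFILE = '/run/lock/LCK..ttyUSB0'

def already_running(prog=PROG):
    # pgrep exits 0 when at least one process matches the exact name
    return subprocess.run(['pgrep', '-x', prog],
                          stdout=subprocess.DEVNULL).returncode == 0

def clear_stale_lock(lockfile=LOCKFILE):
    # if the monitor is not running, any leftover lock file is stale
    if os.path.exists(lockfile):
        os.remove(lockfile)
        return True
    return False
```

A cron entry would call a wrapper that checks already_running(), clears the stale lock, then launches the monitor with its output redirected to the log file.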

The new source code is here:

Building a Script to Graph 24 Hours of AQ Data

After messing with Calc, my goal was to have a graph roughly like this, except I would report only PM2.5:

I first created a CSV with pm2.5 and pm10 columns (though I will only chart the pm2.5):

# create and execute query that will output last 24 hours of data
# into aq24h.csv file

cat <<EOF | sqlite3 aqmonitor.db
.headers on
.mode csv
.output aq24h.csv
select
  strftime('%Y-%m-%d %H:%M', date) as date, pm25Env,
  (pm100Env-pm25Env) as pm100EOnly
from observations
where date > datetime('now','localtime','-24 hours')
order by date;
EOF

cat <<EOF | sqlite3 aqmonitor.db sends all of the commands up until EOF is found to sqlite3. Of these commands, .headers/.mode/.output create the proper CSV output file. Then the actual query occurs.

One thing about SQLite is a bit disconcerting: when you use the date ‘now’, it is always GMT. The data in the db is all stored in local time, so I must always use the ‘localtime’ modifier.
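A minimal demonstration of the quirk (not part of the project's scripts):

```python
import sqlite3

# datetime('now') yields UTC; the 'localtime' modifier applies the host's
# timezone, which is what must be compared against locally-stored timestamps.
con = sqlite3.connect(':memory:')
utc_now = con.execute("select datetime('now')").fetchone()[0]
local_now = con.execute("select datetime('now','localtime')").fetchone()[0]
print(utc_now, local_now)  # identical only when the host itself runs on UTC
```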

The script creates the proper CSV file:

"2021-06-08 10:41",0,0
"2021-06-08 10:46",0,0
"2021-06-08 10:51",0,0
"2021-06-08 10:56",0,0
"2021-06-08 11:01",0,0
"2021-06-08 11:06",0,0

I created another query that will save the maximum PM value during this time period in the bash variable ${maxPM}, because I want to show that on the graph as well:

# This query determines the max Y value that will be plotted so we can 
# handle labels cleanly in the plot

maxPM=$(cat <<EOF | sqlite3 aqmonitor.db
select
  max(max(pm25Env), max(pm100Env-pm25Env)) as maxPM
from observations
where date > datetime('now','localtime','-24 hours');
EOF
)

I used gnuplot to generate the graph as a .PNG file. There is a lot going on in gnuplot so I will try to break it down. Overall, I feed commands to gnuplot the same way I fed them to sqlite3:

cat <<EOF | gnuplot 

Here is an explanation of the commands I sent to gnuplot. Note that I haven’t used gnuplot in over a decade. I found examples of what I needed online and changed them for my purposes. There may well be better ways and my understanding of what I’m doing here is limited.

I started by setting the gnuplot variable maxPM to bash’s ${maxPM} variable. I then created label 99 using that data. Next the CSV separator was specified and the graph’s Title defined:

    # display Max obs above and right of the graph
    set label 99 sprintf("Maximum Observation: %d", maxPM) at graph 1, graph 1 right offset 0, char 1
    set datafile separator ','							#CSV field separator
    set title "Particulate Matter for Last 24 Hours" font ",20"			#header

The X axis was formatted for using dates:

    set xdata time 								#tells gnuplot the x axis is time data
    set timefmt "%Y-%m-%d %H:%M" 						#specify our time string format
    set format x "%H:%M" 							#otherwise it will show only MM:SS

Here, the Y axis was labeled and the legend box defined:

    set ylabel "PM (µgrams / meter^{3})" 					#label for the Y axis    
    set ytics nomirror								#no Y ticks at top

    # setup legend box
    set style line 100 lt 1 lc rgb "dark-grey" lw 0.5                           # linestyle for the grid
    set grid ls 100 front                                                       # enable grid with specific linestyle

Here’s where it gets a bit tricky. There are a series of ranges defined for PM2.5. For example, Good is 0 – 12 micro-grams/cubic meter:

In my graph, I want colored bars in the background for these ranges. Here, I defined each bar as a rectangle and gave it a color:

    # create colored background of bars 
    set object 50 rect from graph 0, graph 0  to graph 1, first 12        fc rgb "green"  lw 0 
    set object 51 rect from graph 0, first 12 to graph 1, first 35.4      fc rgb "yellow" lw 0
    set object 52 rect from graph 0, first 35.4  to graph 1, first 55.4   fc rgb "orange" lw 0
    set object 53 rect from graph 0, first 55.4  to graph 1, first 150.4  fc rgb "red"    lw 0
    set object 54 rect from graph 0, first 150.4 to graph 1, first 250.4  fc rgb "purple" lw 0
    set object 55 rect from graph 0, first 250.4 to graph 1, graph 1      fc rgb "brown"  lw 0

Further, I want to put text into each colored rectangle: Good, Moderate, etc. When I did this I had a lot of trouble because the text is a fixed font size and each colored bar would grow/shrink depending on the maximum values being graphed.

To deal with the problem, I decided to only display the colored bars that have data for them. To do that, I formatted the yrange and text based on the value of the maxPM variable:

    # Labels for the colored bars are defined based on which bars are actually displayed
    if (maxPM > 12 && maxPM > 35 && maxPM > 55 && maxPM > 150 && maxPM > 250) {
      set yrange[0:500<*]
      set label 53 "Unhealthy"			    at graph 0.5, first 100  center tc rgb "black"
      set label 54 "Very Unhealthy"                 at graph 0.5, first 200  center tc rgb "black"
      if (maxPM <= 500) {
        set label 55 "Hazardous"                    at graph 0.5, first (500-250)/2+250    center tc rgb "black"
      } else {
        set label 55 "Hazardous"	            at graph 0.5, first (maxPM-250)/2+250  center tc rgb "black" #Y position calculated to be centered
      }
    }

Finally, I setup the line styles for the plot, characteristics of the output graph (a PNG file), and do the actual plot:

    set style line 101               lw 1 lt rgb "black"                        #line color for PM2.5
    set style line 102 dashtype "-." lw 1 lt rgb "blue"				#line color for PM10
    # output graphic definitions
    set terminal pngcairo size 800,600 
    set output "aq24h.png"

    plot 'aq24h.csv' using 1:2 with lines ls 101 title 'PM2.5'#,\
         #''         using 1:3 with lines ls 102 title 'PM10'

When I run this for my current 24 hours, which has a maximum of 6 micro-grams/cubic meter I see:

Here, I tweak the data to create a single PM2.5 observation of 400. You can see how all of the appropriate color bars are added and the labels are removed from the skinny color bars:

Using this script as an example, I then proceeded to create scripts to produce plots for 7 days, 30 days, and 1 year.

Creating the Web Page

Now I have 4 graphs to display, plus a file that contains just text of the current time and the last observation recorded. I need an HTML file that ties this all together into a single web page.

Here is that page. The only thing even remotely tricky about this HTML file is using javascript to include the file aqCurrent.htm which contains the current observation.

Because the examples of the page I show are not yet on a web server, you won’t see this included file. But it will show up once I have the code running on a web server.

<!DOCTYPE html>
<html>
  <head>
    <title>BigDanz Air Quality Monitor</title>
  </head>
  <body>
    <h1>BigDanz Air Quality Monitor</h1>
    <table border=0>
      <tr>
        <td align=center colspan="2">
          <div id="includedContent"></div>
          <img src="aq24h.png" style="max-width: 100%; height: auto">
          <img src="aq7d.png"  style="max-width: 100%; height: auto">
          <img src="aq30d.png" style="max-width: 100%; height: auto">
          <img src="aq1y.png"  style="max-width: 100%; height: auto">
        </td>
      </tr>
    </table>
  </body>
</html>

Here is what the web page looks like. Note that the Last Year graph is a little funky because the average is for a full day, but there are only a few days of data.

The HTML file and the scripts to extract data and generate graphs are here:

In this project, I ended up graphing only PM2.5. I tried showing both on the graph, but the graph was just too sloppy. From what I’ve read, excessive PM2.5 is more critical than PM10, because it can embed itself further in the lungs. So I’ve focused only on PM2.5.

Posted in c-electronics, c-lazarus | Tagged , , , | Leave a comment

PMS5003 Air Quality Monitor Part 2 – Raspberry Pi and SQLite

After watching my little program monitor the PMS5003 air monitor properly for a few days, it was time to move on to the next steps: migrating the program to a Raspberry Pi and writing data to an SQLite database.

Moving to Raspberry PI (RPI)

The nice thing about Lazarus / Free Pascal is there is very little to do to move a program between architectures – just recompile the source on the ARM architecture.

Typically that works correctly, but the synapse library I’m using for serial I/O had a few issues recompiling on the ARM platform. Primarily it thinks there are some baud rates available that are not. Easy enough to just comment those out.

Connecting the FTDI breakout board was pretty easy as well. All drivers were already installed. After you plug the USB cable into the Raspberry Pi, use dmesg | grep FTDI to see the port assigned:

You can see that /dev/ttyUSB0 was assigned.

As in part 1, I used putty on the RPI to verify I was getting output from the PMS5003 sensor before going any further.

The only issue with the program written in part 1 was that it was hardcoded for ports named COM. I changed the program to accept a string rather than a digit so I could use /dev/ttyUSB0 rather than just COM<n>.

With that change, I was receiving sensor data just as I had been with windows:

Allowing non-root Access to /dev/ttyUSB0

One difference with Linux is that root access is required to use /dev/ttyUSB0. It is easy enough to give your user access by adding it to the dialout group using usermod:

usermod -a -G dialout $USER

Fixing Control-C Issues

While playing around with the program, I quickly realized there was a very annoying problem. In Linux, if I use control-C to abort the program (which is the only way out), it will not properly close the serial port, making it impossible to restart the program.

When this happens you will see a message like this:

Communication error 9991: Port owned by other process
Unable to open communications port. Terminating.

At first, I was forced to reboot the RPI when I received this error, but I found that if you look at the /run/lock directory you will find a file called LCK..ttyUSB0. Delete this file and you will have access to the ttyUSB0 device again.

Although I used try/except to catch any program errors, control-C doesn’t get trapped. In Linux, the program just exits and leaves a mess unlike the Windows version.

To get around this problem, I trap for several Linux signals as the program starts:

// trap for signals that could cause abnormal termination so the com port
// can be properly closed
if FpSignal(SIGINT, @HandleSigInt) = signalhandler(SIG_ERR) then begin
    Writeln('Failed to install signal error: ', fpGetErrno);
end;
if FpSignal(SIGHUP, @HandleSigInt) = signalhandler(SIG_ERR) then begin
    Writeln('Failed to install signal error: ', fpGetErrno);
end;
if FpSignal(SIGTERM, @HandleSigInt) = signalhandler(SIG_ERR) then begin
    Writeln('Failed to install signal error: ', fpGetErrno);
end;

These all install the same trap handler for all of the traps:

procedure handleSigInt(
    aSignal                             : LongInt
    );

begin
    writeln('User requested program termination.');
    abortRequested := true;
end; // handleSigInt

Then, the infinite loop that monitors the sensor output watches for abortRequested to become true:

    while not abortRequested do begin

Installing SQLite3

I needed to install SQLite:

sudo apt install sqlite3 libsqlite3-dev

I also like to use the GUI SQL browser, so I installed that:

sudo apt install sqlitebrowser

Creating the Database

I created the database as follows:

>sqlite3 aqmonitor.db
SQLite version 3.27.2 2019-02-25 16:06:06
Enter ".help" for usage hints.
sqlite> CREATE TABLE observations (
   ...>     id       INTEGER  PRIMARY KEY AUTOINCREMENT
   ...>                       NOT NULL
   ...>                       UNIQUE,
   ...>     date     DATETIME NOT NULL,
   ...>     duration INTEGER  NOT NULL,
   ...>     pm10Std  INTEGER  NOT NULL,
   ...>     pm25Std  INTEGER  NOT NULL,
   ...>     pm100Std INTEGER  NOT NULL,
   ...>     pm10Env  INTEGER  NOT NULL,
   ...>     pm25Env  INTEGER  NOT NULL,
   ...>     pm100E   INTEGER  NOT NULL,
   ...>     part03   INTEGER  NOT NULL,
   ...>     part05   INTEGER  NOT NULL,
   ...>     part10   INTEGER  NOT NULL,
   ...>     part25   INTEGER  NOT NULL,
   ...>     part50   INTEGER  NOT NULL,
   ...>     part100  INTEGER  NOT NULL
   ...> );
sqlite> .schema
CREATE TABLE observations (
    id       INTEGER  PRIMARY KEY AUTOINCREMENT
                      NOT NULL
                      UNIQUE,
    date     DATETIME NOT NULL,
    duration INTEGER  NOT NULL,
    pm10Std  INTEGER  NOT NULL,
    pm25Std  INTEGER  NOT NULL,
    pm100Std INTEGER  NOT NULL,
    pm10Env  INTEGER  NOT NULL,
    pm25Env  INTEGER  NOT NULL,
    pm100E   INTEGER  NOT NULL,
    part03   INTEGER  NOT NULL,
    part05   INTEGER  NOT NULL,
    part10   INTEGER  NOT NULL,
    part25   INTEGER  NOT NULL,
    part50   INTEGER  NOT NULL,
    part100  INTEGER  NOT NULL
);
CREATE TABLE sqlite_sequence(name,seq);
sqlite> .quit

This creates fields for all of the data from the sensor. It also has a unique id which is the primary key. If you need to delete/modify a specific record, the Id uniquely identifies the record.

There are also fields for the date & time of the observation, and the length of time over which observations were averaged to get the record.

Modifying the Program to Create Database Records

SQLite access is easily integrated into Lazarus / Free Pascal using the db, sqldb, and sqlite3conn modules:

I added 3 functions to my original aqMonitor program to handle writing data to the database: dbOpen, dbClose, and dbAdd.

Opening and closing the db is very straightforward so I won’t cover that here. Adding is done each time we compute an average of the observations. When the timer pops, a new average is computed and we write it to the database using dbAdd:

if SecondsBetween(now, avgTimer) >= duration * 60 then begin
    packet := historyAvg(duration * 60, history);
    dbAdd(duration, packet);
    avgTimer := now;
end;

dbAdd is fairly straightforward as well. It creates an SQL insert statement, then executes it against the database.

The insert command is put into the variable s:

with packet do begin
    s := 'insert into observations (' +
            'date, duration, pm10Std, pm25Std, pm100Std, pm10Env, pm25Env, pm100E, ' +
            'part03, part05, part10, part25, part50, part100' +
            ')' +
        'values(' +
            '"' + FormatDateTime('yy-mm-dd hh:nn:ss', now) + '", ' +
            inttostr(duration) + ', ' +
            inttostr(pm10Std) + ', ' +
            inttostr(pm25Std) + ', ' +
            inttostr(pm100Std) + ', ' +
            inttostr(pm10Env) + ', ' +
            inttostr(pm25Env) + ', ' +
            inttostr(pm100Env) + ', ' +
            inttostr(particles03um) + ', ' +
            inttostr(particles05um) + ', ' +
            inttostr(particles10um) + ', ' +
            inttostr(particles25um) + ', ' +
            inttostr(particles50um) + ', ' +
            inttostr(particles100um) +
        ')';
end; // with

Once the string is created, a start transaction is executed (trans), the query is created (q), and finally executed (q.ExecSQL). Exceptions are handled, and if the Insert succeeds, everything is cleaned up in the finally block:

try try
    trans             := TSQLTransaction.Create(nil);
    trans.DataBase    := dbCB;

    q                 := TSQLQuery.Create(nil);
    q.DataBase        := dbCB;
    q.SQL.Text        := s;
    q.ExecSQL;
    trans.Commit;
except
    on e: EDatabaseError do begin
        writeln('Error: ' + e.Message);
        writeln('insert failed.');
    end;
end; // try except;
finally
    q.Free;
    trans.Free;
end; // try finally

Using sqlitebrowser to look at the table:

As I said, writing data to an SQL database is pretty easy.

Handling SQLite Locks

Well, there is one problem with SQLite – database locks. I’ve used SQLite pretty extensively and I like it a lot, but it is pretty stupid when it comes to locking, at least as far as I’m concerned. If it can’t obtain a lock, it will give up and give an error. There is no option to just wait until the other writer releases the lock.

On a program like this, I do not want it aborting except for truly exceptional reasons. It is going to be running via cron in the background and I don’t intend to have to babysit it!

In other programs, I get around locking issues by locking my own semaphore before using SQLite. My semaphore lock would wait indefinitely rather than abort the program. That causes all database accessors to be singly threaded thru the semaphore before being allowed access to the database.

But that is too complicated to implement here, and requires all accessors use the semaphore which makes using tools like sqlitebrowser a bad idea.

Since the aqmonitor program has a single insert SQL command, it is fairly easy to simply do several retries myself. So rather than the simple q.ExecSQL I showed you above, what I actually do is this:

    // attempt to insert record. If dblocked occurs (error 5), then retry until
    // succeeds or # of retries are exceeded
    retries := 0;
    while true do begin
        try // insert
            q.ExecSQL;
            break;                                                      // insert succeeded
        except
            on e: ESQLDatabaseError do begin
                if e.ErrorCode = 5 then begin
                    retries := retries + 1;
                    if retries > maxRetries then begin
                        writeln('Insert failed due to dblock after ', retries,
                            ' attempts.');
                        raise EInsertFailed.Create
                            ('Insert retries exceeded due to dblock');
                    end; // if
                    sleep(retries * 1000);
                end // if
                else
                    raise;                                              // not a lock error
            end; // on
        end; //try insert
    end; // while

I try the insert (q.ExecSQL). If an exception occurs, I check to see if it is a lock error (errorCode = 5). If it is, I add 1 to the retries count, sleep for a while, then try again. If the number of retries is exceeded, then I raise EInsertFailed to let the caller know I gave up trying to add the record.
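The same retry idea, translated to Python's sqlite3 module as an illustration (the actual program is Pascal; names and backoff interval here are my own):

```python
import sqlite3
import time

# Illustrative translation of the retry loop: retry only on "database is
# locked" errors, backing off a little longer on each attempt.
def insert_with_retry(con, sql, params=(), max_retries=5):
    retries = 0
    while True:
        try:
            con.execute(sql, params)
            con.commit()
            return retries           # number of retries it took
        except sqlite3.OperationalError as e:
            if 'locked' not in str(e) or retries >= max_retries:
                raise                # not a lock error, or gave up
            retries += 1
            time.sleep(0.1 * retries)

con = sqlite3.connect(':memory:')
con.execute("create table observations (pm25Env integer)")
print(insert_with_retry(con, "insert into observations values (?)", (7,)))
# -> 0 (no lock contention on a private in-memory database)
```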

Further, in the while loop where dbAdd is called, I count the number of successive dbAdds that fail. If dbAdd fails 10 times in a row, then something is seriously wrong and I abort the program.

This concludes the data capture part of my Air Quality monitor.

The source for the program as it stands now, with the SQL code, can be found at:

I’ll let this run for a few days to see if I can get dirtier air than I’ve had the past couple of days. Then I’ll start working on reading the database and generating some graphs to post on a website.

Posted in c-electronics, c-lazarus | Tagged , , , | 2 Comments

Reading PMS5003 Air Quality Sensors with Windows and Free Pascal

Some time in the past several months, the PMS5003 Air quality sensor came to my attention. This sensor allows you to monitor PM1.0, PM2.5, and PM10 particles in the air.

Sometimes the air quality around these parts gets quite bad due to dust storms or fires. Not only would I like to know just how bad the AQ (air quality) is at my house, I’d like to keep history. I’d also like to send myself a notification if it gets too bad so I know I need to shut up the house and turn on the HEPA filter.

I purchased this sensor at AdaFruit:

At $40, it isn’t cheap, but it is a pretty complex little device.

To start with, I wanted to play with this sensor using a Windows PC. I needed an FTDI cable to do so. The FTDI cable converts a USB cable to the TTL serial protocol used by this board. Some quick research and I found this guy had connected the PMS5003 to his PC using an AdaFruit FT232H breakout board which can be found here:

I already covered how to connect this FTDI breakout board to your PC here:

Installing Adafruit FT232H Breakout of the FT232H USB to Serial Converter

Connecting AQ Sensor to FTDI Breakout Board

Since the sensor communicates via a normal serial interface, connection is very simple. You just need to connect power, ground, and TX & RX.

On the FTDI breakout, D0 is TX and D1 is RX. So connect D0 to RX and D1 to TX of the PMS board. For my project, I don’t intend to transmit to the sensor, but I’ll still connect the lead.

To see if you got the connections correct, just use putty to connect to the FTDI cable and you will see ‘garbage’ being transmitted. The data being transmitted is in binary so you won’t see anything useful at this point.

Reading and Formatting the Sensor Output

Once the sensor was functioning, I needed to read the binary values and  convert them into something useful.

Further, having the data spit out every second wasn’t real useful. I decided I needed to collect 5 minutes of data, average it, and output the average to get a better sense of what the air quality really was.

The program I wrote to do this is in Lazarus / Free Pascal. If you’ve seen my other posts, you know this is my language of choice. If you know C/C++ you won’t have much trouble reading the source code to understand how to implement the code in C.

The Pascal source and executable can be found at:

The Pascal source code for the program is in the file aqmonitor.lpr.

If you wish to compile this program, you will also need the synapse ‘synaser’ serial library which can be found at

Program Highlights

The sensor outputs a packet of data about once a second. I read that data and put it into this data structure:

aqPacketT                               =   record
    hdr1                                :   byte;
    hdr2                                :   byte;
    frameLen                            :   word;
    pm10Std                             :   word;
    pm25Std                             :   word;
    pm100Std                            :   word;
    pm10Env                             :   word;
    pm25Env                             :   word;
    pm100Env                            :   word;
    particles03um                       :   word;
    particles05um                       :   word;
    particles10um                       :   word;
    particles25um                       :   word;
    particles50um                       :   word;
    particles100um                      :   word;
    filler                              :   word;
    checksum                            :   word;
    end; // aqPacketT

In Pascal, ‘word’ is a 16-bit unsigned integer.

There are 2 categories of ‘particulate matter’: Standard and environmental. So pm10Std is the PM1.0 standard data and pm10Env is the PM1.0 environmental data.

From others’ discussions, it appears that the standard values have something to do with calibration and it is the atmospheric environment values that should be used for actual measurements.

The variables for PM1.0, PM2.5, and PM10 will contain data for all particles <= that size. If it senses values for PM1.0, then those values will also show up in the PM2.5 and PM10 variables.
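A tiny worked example of what cumulative readings imply (illustrative values, not sensor data): to get only the particles between two size classes you subtract the smaller class's reading.

```python
# PM readings are cumulative: pm100Env counts all particles <= 10 µm,
# including everything already counted in pm25Env. So the particles
# between 2.5 µm and 10 µm are the difference.
pm25Env, pm100Env = 5, 8        # illustrative values
pm100Only = pm100Env - pm25Env
print(pm100Only)                # -> 3
```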

To create the data structure, I read the data stream until I see the 0x42 and 0x4D header bytes. Once I have those, I start reading data byte by byte and creating the data structure.

    pm10Std           := getword(comCB, calcChecksum);
    pm25Std           := getword(comCB, calcChecksum);
    pm100Std          := getword(comCB, calcChecksum);

As I’m populating the data structure, I’m also keeping track of the checksum so I can compare at the end of the packet. If they don’t match, I generate an exception. If the program gets 10 checksum errors in a row, it will assume a serious problem and abort.

    if finalChecksum <> checksum then begin
        // checksums don't match, throw exception
        raise EChecksum.Create(erMsgChecksum);
    end;

As each packet is read, it is added to a FIFO list which I have set to hold 360 samples (about 6 minutes) of data:

    historyAdd(history, packet);

Then every 5 minutes, I compute the average of all of the values for the past 5 minutes (300 seconds), and print those averages:

    if SecondsBetween(now, avgTimer) >= 300 then begin
        packet := historyAvg(300, history);
        avgTimer := now;
    end;
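The FIFO-history-plus-average idea can be sketched with a bounded deque (illustrative only; scalar samples stand in for whole packets, and the function name just echoes historyAvg):

```python
from collections import deque

# Illustrative sketch: keep the most recent 360 samples; averaging the
# newest n of them mirrors what historyAvg does with packet fields.
history = deque(maxlen=360)   # old samples fall off the front automatically

def history_avg(history, n):
    recent = list(history)[-n:]
    return sum(recent) / len(recent)

for sample in [4, 6, 8, 10]:
    history.append(sample)

print(history_avg(history, 2))   # -> 9.0 (average of the two newest)
print(history_avg(history, 4))   # -> 7.0
```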

Here is an example of the program’s output. The air has been very clean here in my office since I started the monitor last night, so I lit a match at about 11:53 and let it smoke the room a bit:

Here is a much longer run from a few days ago when the wind was kicking up dust, then a rain storm cleaned the air. This graphs just the PM2.5 values:

Next Up

The plan is to migrate this project to a dedicated Raspberry Pi. The monitor program used here will be modified to write the data captured to a SQLite database. Once I have the data in a database, another program will periodically query the data and generate some graphics for a web page. Finally, I will need to fabricate some kind of enclosure.

Additional Resources

I am planning to use a USB cable and the documented FTDI breakout board for my final project because the Raspberry Pi will be in my garage and the sensor outside. It will be a long enough run of cable between the two, I don’t want to use straight serial TTL protocol. But if you want to fashion your own cable, this guy documents the correct connector you will need:

This guy connects the PMS5003 to an arduino and reports the current AQ on an LED display:

Pretty good article on the performance of the sensor:

Adafruit post regarding difference between Standard and Environmental values:

Posted in c-electronics, c-lazarus | Tagged , , , | Leave a comment

Installing Adafruit FT232H Breakout of the FT232H USB to Serial Converter

I recently purchased an air quality monitor I want to play with. It uses a TTL serial interface. While I could connect it directly to a Raspberry Pi (and eventually plan to do so), to start I want to have it connected to my PC to do some experimenting.

I found this guy had connected the AQ monitor using an Adafruit FT232H board rather than using the typical FTDI cable. The advantage to the FT232H board is it can provide the 5V the AQ monitor will need and yet it will handle the 3.3V data signals.

This post just covers getting this breakout board working on my PC. This should have been quick and easy but of course I had problems …

Once you have the breakout board and the pins soldered, you need to get the FTDI drivers installed. According to the manual, for Windows 10, when you connected the FT232H to the PC, the drivers should simply be downloaded from Microsoft.

This did not work automagically for either of my Win10 systems.

Configuring FT232H Drivers

When I inserted the FT232H, I received this in device manager:

To install the driver I right clicked on ‘Unknown Device’, selected Update Driver and then Search Automatically. The driver was located, but no new device showed up in the Ports section. Notice below, though, that USB serial converter does appear in the USB controllers section:

At this point, I wasted way too much time going down multiple rabbit holes trying to figure out why the COM port was not being assigned.

The trick was found here:

Right click the USB Serial Converter and select Properties:

Select the advanced tab, then select Load VCP:

Finally, update the driver again and voilà, the FT232H shows up as a COM port:

Testing the FT232H Breakout Board

To test, I put the breakout board on a small breadboard and tied the TX/RX lines together (D0/D1):

Next, start up Putty and configure it for serial operation:

Once connected, what you type will be echoed on the screen as the TX is looped back thru the RX line:
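The same loopback check could also be scripted instead of using Putty. This is just a sketch: the check is written against any serial-like object so it can be exercised without hardware, and the fake port below stands in for the tied-together TX/RX pair. With pyserial installed you would pass something like serial.Serial("COM5", 9600), where the port name is hypothetical (use whatever Windows assigned):

```python
def loopback_ok(port, payload: bytes = b"hello") -> bool:
    # Write bytes out TX and confirm the same bytes come back on RX
    # (D0 tied to D1 on the breadboard).
    port.write(payload)
    return port.read(len(payload)) == payload

class FakeLoopbackPort:
    # Stand-in for the wired-together TX/RX pair, used here for testing.
    def __init__(self):
        self._buf = b""
    def write(self, data: bytes):
        self._buf += data
    def read(self, n: int) -> bytes:
        out, self._buf = self._buf[:n], self._buf[n:]
        return out
```

On real hardware, a failed check usually means the jumper between D0 and D1 is loose or the wrong COM port was opened.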

Posted in c-electronics | Tagged | 1 Comment

Accessing HP3000 with Telnet and puTTY

Telnet has never worked properly on my physical HP3000. I never needed it so I never fixed it. I decided to track down the problem this evening.

Fixing the telnet server was fairly easy – in 2001 someone altered the Telnet Security settings to Deny *.*.*.* so no one was going to get in. Which made sense when it was in production. Now, this system runs maybe 4 hours a year and has nothing of value on it, so telnet is acceptable!

Once I had the telnet service running I found that accessing it from a Linux shell worked just fine but for some reason echo was failing when I accessed it with puTTY.

The logical answer was to change Local Echo from Auto to Forced On. Except when you do that, passwords are echoed.

The correct answer is to set Local line Editing to Forced Off:

Now, each character is echoed back from the host so passwords are not shown:

:hello manager.sys


HP3000  Release: C.65.00   User Version: C.65.00   FRI, DEC 18, 2020, 11:10 PM
MPE/iX  HP31900 C.25.06  Copyright Hewlett-Packard 1987.  All rights reserved.



Posted in c-retro | Tagged , | Leave a comment

Replacing HP3000 Series 928LX SCSI Hard Drive with SCSI2SD SD Card interface

My HP3000/928LX uses a Fast SCSI hard drive. Not only are these no longer available new, they are getting hard to find and expensive used. Not that I start the HP3000 that often, but every time I do I wince, hoping the hard drive will start.

I replaced a failed hard drive about 4 years ago and realized at that time, some day I would no longer be able to boot this system if something didn’t change. I did buy a used spare SCSI drive at the time, only to learn recently it is missing its terminating resistors, so it is probably useless to me.

A few months ago I stumbled across a SCSI to SD adapter card created by Michael McMaster.

Honestly, I didn’t expect this card would work for me but reviewing the docs, I found that version 6 of the card had been used on an HP3000 917SX and 957RX. The 917SX is not that different from my 928LX. Maybe it would work for me?

Searching the internet, I could find some references in 3000NewsWire that the SCSI2SD card had been used by one person, but no examples anywhere of making it run.

I purchased one of the cards from Inertial Computing:

I also needed to order the optional 4 pin molex to Berg cable as the HP3000 only supplies power using the older molex connectors:

These arrived very quickly.

I decided the smart thing to do was mount this card under the tape drive in a 5.25″ slot so it would be possible to access the SD card and USB port without having to disassemble the HP3000. I ordered a 5.25″ to 3.5″ bay adapter kit:

After a lot of experimenting and some advice from Michael McMaster, I settled on this SD card which is WAY WAY larger than the 4GB drive I wished to replace:

It is recommended that you use an A2 class card. These are optimized for random I/O, whereas other cards are optimized for the sequential writes a camera would do.


Rather than blindly try to install this into my system as the primary hard drive, I decided to first install the SCSI2SD card alongside my SCSI hard drive and configure it as a private volume. This allowed me to experiment without any time-consuming system reloads.

I won’t go over the details of setting the card up to use as a private volume except for these notes:

  • The steps for creating a private volume are shown below, where I configure LDEV 3 with the private volume named BIGVOL.
  • When configuring the card I needed to disable the terminator, as the real hard drive provides SCSI termination.
  • After booting, use ODE / Mapper to verify the drive is being seen. Use of this utility is shown below as well.


Installing the SCSI2SD card as a private volume also gave me a chance see what type of performance I could expect. I am much more interested in providing data storage that will last another 10-15 years than performance, but performance can’t be dismal.

My performance tests involved 10M random reads and writes into an 80MB file. An 80MB file was still small enough that the system could cache it entirely, so I took those results with a major grain of salt.

I also did large store/restores to the private volume. These weren’t cached but were sequential and not the fairest test of using an SD.

My non-scientific testing shows I/Os taking about 1.1 times longer when using SD rather than a physical hard drive.

Once the system was fully converted from hard drive to SD, any delays weren’t significant enough for me to actually notice.

Imaging the SD Card

One really nice feature about using an SD card for the operating system: I can image the SD card to a file on my PC using Win32Imager (the Linux dd command would probably work as well).

With a backup SD image, I could create a new SD card quickly and have the system fully running again without all of the trouble of a system INSTALL.

Installing the SCSI2SD Card

The SCSI2SD card cannot touch the metal drive tray. There are electrical traces on the bottom of the board. Fortunately I keep several kits of M2 nylon mounting parts on hand, like this:

I scavenged enough parts to mount the board above the tray.

Here is the assembled unit:

The assembly was then inserted into the I/O cage below the tape drive:

I also decided to leave the functional hard drive in the I/O cage. If the SCSI2SD card were to fail, I would only need to move cables to have a functioning system again.

Here is the front side of the I/O cage:

Connecting the Cable

When connecting the SCSI2SD card alongside the running hard disc, I put it in the middle of the SCSI ribbon cable.

In the back of my head it seems you are not supposed to leave the end of a SCSI cable dangling (it becomes an antenna), so when I replaced the hard drive with the SCSI2SD adapter, I connected the end of the cable into the SCSI2SD card:

Configuring the SCSI2SD Card

I had some trouble with this at first because I had the wrong version of the configuration utility. You must use scsi2sd-util6, not scsi2sd-util.

You can find the scsi2sd-util6 utility here:

I have successfully used both the 32 and 64 bit Windows versions of this utility. The Linux version failed for me (I believe the error had to do with missing the correct version of clib, which surprised me as I was running it on a laptop with a very new version of Linux Mint).

Once you run the utility you will see:

At the bottom it will indicate it is searching for the card. Connect the USB cable and it should find it.

As I recall Windows wanted to install a driver, but I did not install one. I believe you only need to install drivers if you decide to update the firmware on the SCSI2SD card.

My SCSI2SD card came with firmware version 6.3.1. The changes in the latest, 6.3.2, were not applicable to me so I skipped the update and saved myself some trouble.

If you do want to install drivers, there is an .MSI install that came in the install directory as well.

To start, I suggest doing File | Load Defaults just to make sure all settings are at a good starting point.

On the General Settings tab, there are only 2 settings that are applicable to the HP3000:

  • Enable SCSI terminator
  • Enable parity

Parity should always be enabled.

Whether to enable the SCSI terminator depends on your setup. When I was using SCSI2SD with the old hard drive in place, I had the terminator disabled, as the hard drive provided termination. Once I removed the old drive, I then enabled the SCSI terminator.

Click on the device1 tab:

These are the settings that I used to replace my LDEV 1 hard drive with SCSI2SD being LDEV1.

SCSI ID matches the SCSI ID of the hard drive which is 6. This will be LDEV 1. When I was testing the card I used an unused SCSI ID of 5.

SD card start sector will be 0 if this is the first device on the SD card, which it is.

Sector Size MUST BE 512. I started with 256 which is ‘logically’ how many bytes are in an HP3000 sector, but it turns out that is wrong. Using 256 reduced the amount of space I had available on the SD card. 512 works fine.

Device Size requires a little thought. MPE/iX version 7.5 allows you to fully use hard drives > 4GB. My 928LX wasn’t supported under MPE/iX 7.5, so I’m running version 6.5 which does NOT allow LDEV 1 to be larger than 4GB. So I selected 4GB for device 1 and will have a much larger device 2.

Vendor and Product ID: the default settings have leading spaces, such as "  codesrc". These leading spaces cause problems for the HP3000, so remove them.

I changed the Product ID from SCSI2SD to SCSI2SDn where n matches each LDEV I’m setting up. That will make it easier to verify in the O/S I have the proper SCSI ID assigned to the LDEV I’m configuring.

Device2 Tab:

Auto: When you first go into this tab there will be an error about overlapping disc space. Click on the Auto box and the SD card start sector will be automatically set to follow the prior device’s partition.

Device Size: Since I have oodles of space on this SD card and LDEV 1 is limited to 4GB, I set up LDEV 2 to be 32GB. MPE/iX 6.5 can access drives up to 128GB as I recall. I don’t use all remaining space because I want a big private volume I will use for system backups.

Device 3 tab:

LDEV 3 will be my private volume and I assign most of the remaining space to it. I left some unused just in case there are some issues with Win32Imager hitting the end of media too soon.
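The Auto start-sector calculation is easy to sanity-check by hand: each device begins where the previous one ends. A quick sketch using the sizes from my configuration (4GB for device 1, 32GB for device 2, and the mandatory 512-byte sectors):

```python
SECTOR = 512       # sector size required by the HP3000
GB = 1024 ** 3

def start_sectors(sizes_gb):
    # Return the start sector of each device, plus the sector where the
    # next device would begin (mimicking the utility's Auto behavior).
    starts, start = [], 0
    for size in sizes_gb:
        starts.append(start)
        start += size * GB // SECTOR
    return starts, start

starts, next_free = start_sectors([4, 32])
# Device 1 begins at sector 0, device 2 right after the 4GB partition,
# and device 3 (the private volume) at next_free.
print(starts, next_free)
```

This is only arithmetic, but it is a handy cross-check that the Auto values the utility fills in are what you expect before writing the config to the card.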

There is some weird problem I’ve seen occasionally when copying one SD card to another. Even though they are both the same size (e.g. a 64GB SD card), one may have a few less sectors than the other.

At this point I do a File | Save to Device to write the configuration to the SCSI2SD device. I also do a File | Save to File to create an XML file of the settings.

You are now ready to configure the SCSI2SD adapter in MPE/iX.

Create the INSTALL Tape

Most of my time spent in HP3000 Operations was during the reign of MPE/III, IV, and V. 40 years later I can do a SYSDUMP/RELOAD in my sleep. By the time we got MPE/iX systems, I rarely did operations, so every time I have to do a SYSGEN/INSTALL I have to find documentation on exactly what needs to be done.

First Verify the Running Config is Clean

F1-Open Config
F3-Go to Validate
F2-Validate DTS/Link
return to exit validate screen
F8 repeatedly until program is exited.

SYSGEN version E.03.01 : catalog version E.03.01    MON, DEC  7, 2020, 10:18 AM
Copyright 1987 Hewlett-Packard Co. All Rights Reserved.

        **note** Retrieving NMMGR configuration data...

        ** First level command **

        io                log (lo)       misc (mi)        spu (sp)
        sysfile (sy)

        basegroup (ba)    keep(ke)       permyes (pe)     show (sh)
        tape (ta)

        clear (cl)(c)     exit (ex)(e)   help (he)(h)     oclose (oc)
 sysgen> io

        ** IO configurator commands **

        aclass (ac)      adev (ad)       apath (ap)      avol (av)
        dclass (dc)      ddev (dd)       dpath (dp)      dvol (dv)
        lclass (lc)      ldev (ld)       lpath (lp)      lvol (lv)
        maddress(ma)     mclass (mc)     mdev (md)       mpath (mp)
        mvol (mv)        hautil (ha)

        clear (cl)(c)    exit (ex)(e)    help (he)(h)    hold (ho)
        oclose (oc)      redo
     io> hold
     io> exit
 sysgen> keep
        keeping to group CONFIG.SYS
        Purge old configuration (yes/no)?y
        ** configuration files successfully saved **
 sysgen> exit

The above process should produce no errors. If it does, correct them before proceeding.

Configure the Disc Drive(s)

LDEV 1 for me is a SCSI drive. I have found that the existing configuration for the physical drive works just fine for the SCSI2SD card. Here is what I already have for LDEV 1:

SYSGEN version E.03.01 : catalog version E.03.01    MON, DEC 14, 2020,  9:26 AM
Copyright 1987 Hewlett-Packard Co. All Rights Reserved.

 sysgen> io

     io> ld 1
 LDEV:     1  DEVNAME:                    OUTDEV:        0   MODE:             
   ID: ST34573N                           RSIZE:       128   DEVTYPE: DISC     
 PATH: 56/52.6.0                          MPETYPE:       4   MPESUBTYPE:  2    
CLASS: DISC     SPOOL                                                          

Because LDEV 1 is already a SCSI drive, I didn’t have to modify the configuration of LDEV 1 to replace the hard drive with the SCSI2SD card.

Because non-system SCSI drives don’t have to be present for the system to boot, I suggest creating the configuration for any additional drives before building the Install tape. Then you know the I/O config is fine before doing the INSTALL.

I needed to configure in LDEV 2 and LDEV 3, so in sysgen I did the following:

io> ad 2 56/52.5.0 id=ST34573N class=disc,spool
  • The .5 in 52.5.0 is the SCSI ID of device 2 of the SCSI2SD card.
io> ad 3 56/52.4.0 id=ST34573N class=pvol
  • Class for the private volume can’t be DISC, so I use pvol for private volume.

Here is the ‘ld’ of my disc drives at this point:

  LDEV:     1  DEVNAME:                    OUTDEV:        0   MODE:             
   ID: ST34573N                           RSIZE:       128   DEVTYPE: DISC     
 PATH: 56/52.6.0                          MPETYPE:       4   MPESUBTYPE:  2    
CLASS: DISC     SPOOL                                                          
 LDEV:     2  DEVNAME:                    OUTDEV:        0   MODE:             
   ID: ST34573N                           RSIZE:       128   DEVTYPE: DISC     
 PATH: 56/52.5.0                          MPETYPE:       4   MPESUBTYPE:  2    
CLASS: DISC     SPOOL                                                          
 LDEV:     3  DEVNAME:                    OUTDEV:        0   MODE:             
   ID: ST34573N                           RSIZE:       128   DEVTYPE: DISC     
 PATH: 56/52.4.0                          MPETYPE:       4   MPESUBTYPE:  2    
CLASS: PVOL                                                                                                                                                   

Hold / Exit / Keep / Exit and reboot the system to verify.

Build the INSTALL Tape

I logged in as manager.sys:

:hello manager.sys,tmp

First, I did a dump of the entire system. I needed to create sysdump.tmp with contents:

:type sysdump.tmp


then do the sysgen:

:x loadtape 7
>ta dest=offline store=^sysdump.tmp.sys

Verify the tape was built correctly:

:x loadtape 7
:vstore *t;@.@.@;directory;progress

:x loadtape 7
:run checkslt.mpexl.telesup

I selected option 1 for checkslt, and checked it for errors.

Now I was ready to shutdown the operating system and power off the system.

Final Setup of SCSI2SD Card

If you had been testing the SCSI2SD card alongside the original hard drive, now is the time to disconnect it.

The SCSI2SD card should now be physically installed at the end of the SCSI cable as shown in the Connecting the Cable section above.

Also, if you were testing the SCSI2SD card and the terminator box is unchecked, you probably need to check it now as the old hard drive most likely was terminated.

INSTALL the System on to the SD Card

I powered the system on. When it got to the main menu, I entered the search command:

Path P0 is the tape drive, so I booted from that path:

At this point I used the commands ODE then MAPPER, then RUN to display the hardware configuration (not shown). It does take quite a while for ODE to load from the tape drive.

This results in this display which shows SCSI IDs 6, 5, and 4 correctly assigned to the SCSI2SD devices 1, 2, and 3:

I then exited ODE/MAPPER, and typed INSTALL to begin the O/S installation.

At this point, INSTALL began copying system files from tape to LDEV 1:

Once all of the files were restored, the system automatically rebooted. The reboot took a lot longer than I would have expected and there was no disc I/O, but it did finally reboot:

Once the boot completed, I typed a PATH command to verify the disc/tape paths were correct. I then booted the system:


The system booted as normal, although some files were missing and various subsystems did not start since much of the O/S was still missing. The marked errors are normal the first time you boot with an undefined disk.

I logged in as manager.sys and did a :dstat to verify all of the volumes exist.

In this example, LDEV 2 and 3 are listed as LONER and MASTER because I was experimenting with the SD card before doing this final install. On a fresh SD card you would expect the STATUS to be UNKNOWN.

As you can see, LDEV 1 was defined properly, but LDEVs 2 & 3 still need to be configured in VOLUTIL.

Next, I ran volutil and formatted drives 2 & 3. Normally SCRATCHVOL and FORMATVOL will suffice. Because I already had a private volume on LDEV 3, I also needed to use VSCLOSE to close it so it could be scratched and formatted.

volutil: scratchvol 2
volutil: formatvol 2
volutil: vsclose bigvol
volutil: scratchvol 3
volutil: formatvol 3

Then I added LDEV 2 to the system volume set as Member2:

volutil: newvol MPEXL_SYSTEM_VOLUME_SET:MEMBER2 2 100 100
volutil: :dstat all

Finally, I created volume set BIGVOL on LDEV 3.

volutil: newset BIGVOL master=MEMBER1 ldev=3 perm=100
volutil: :dstat all

I rebooted the system again and then did a :showdev disc then :showdev spool to make sure they looked good:

Here is the result of a discfree c on the empty system:

To restore the rest of the files, (logged in as manager.sys), I typed:

:startspool lp       <<DO NOT FORGET THIS!>>
:file t;dev=tape
:restore *t;@.@.@;keep;directory;olddate;show=offline

It is normal for files to fail to restore if they already exist. I examined the output spool file for any other possible error before continuing.

Once the files were restored, I rebooted the system one last time and verified it came up correctly with all subsystems and standard jobs running.

Here is a discfree done after the restore:

Converting sectors to bytes, the drives are the appropriate size:

LDEV       Sect                Bytes
   1    16,777,200     4,294,963,200                
   2   134,217,712    34,359,734,272
   3   335,544,304    85,899,341,824
 ALL   486,539,216   124,554,039,296
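The conversion is straightforward since MPE reports free space in 256-byte sectors. A quick check of the table (sector counts copied from my discfree output):

```python
SECTOR_BYTES = 256   # MPE disc space is reported in 256-byte sectors
sectors = {1: 16_777_200, 2: 134_217_712, 3: 335_544_304}

bytes_per_ldev = {ldev: s * SECTOR_BYTES for ldev, s in sectors.items()}
total_bytes = sum(bytes_per_ldev.values())

for ldev, b in bytes_per_ldev.items():
    print(ldev, b)
print("ALL", total_bytes)
```

Note the 256-byte figure here is just MPE’s reporting unit; it is separate from the 512-byte sector size that must be configured on the SCSI2SD card itself.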

The system has been running for about 5 days as of this writing w/o any glitches. I’ve moved big file sets around, done backups to the private volume, and just let the system sit running.

Performance-wise, I can’t tell any difference. It is a little strange not to hear that old SCSI drive click on and make noise while it is in use.


Posted in c-retro | Tagged , , | Leave a comment

Repairing a female Molex .156 Found on Pinball Games

I recently purchased the pinball game Funhouse, my all-time favorite. The machine was refurbished and looks brand new inside and out.

Shortly after receiving it I noticed that the coin door functions were working erratically. Most importantly, the buttons used to program the unit, which are on the back of the coin door, were misbehaving.

Researching the Problem

The coin door wiring harness comes into the game and connects to the coin door interface board:

Pulling the connector off the door I could test for continuity, but clearly just looking into the connector some of the pins inside looked different than the others. Turned out they were broken and might or might not make an electrical connection.

Using a tiny screwdriver, I was able to press against the metal tab (circled) of the pin to allow it to be extracted:

A good pin should look like this:

Several of them looked more like this; the spring part had snapped off:

Clearly, this connector needed to be rebuilt.

Obtaining the Parts

There are several pinball specialty companies out there that sell these parts. I ended up purchasing parts from Marcos Specialties and Pinball Life. It was from these sites that I learned these are Molex .156 connectors. You can also find these parts (except the key post) from Mouser, Digikey, and Jameco.

From Pinball Life I purchased:


Besides the small screwdriver and a wire cutter, I needed a crimper, plus I went with a fancy wire stripper that allows me to consistently strip insulation at the same length. I don’t want to cut any more wire than absolutely necessary from this old pinball game.

Building the New Cable End

I’m not going to spend time going over how to build the cable ends. You can find a general overview of the procedure here:

How to re-pin Molex connectors

A procedure for crimping the terminals to the wires can be found here:


Some Additional Notes

I wanted to make sure my new cable end was keyed so I could not connect it upside down. I guarantee some day I will try to. That is the function of the polarizing key. I inserted it into the same terminal hole as it was on the old connector:

With the polarizing key installed, it was then just a matter of extracting the terminal from the old connector, cutting the old terminal off, stripping, crimping the new one on, and inserting the terminal into the PROPER hole in the new connector.

Here is the completed connector. Note that there is one hole without a terminal.

The connector is then connected back onto the coin door interface board and tested. Success!

After completing the project, I decided to see if my old IWISS Dupont pin crimper would work. It does, and it will make both crimps at the same time.


Posted in c-electronics, c-retro | Tagged , , , | Leave a comment

GnuCOBOL’s Report Writer Module

I have finally reached the goal I set at the beginning of this (lousy) year! Use embedded SQL to extract data from a database and the COBOL report writer to produce a printed report. Thus, this is the last planned post for GnuCOBOL.

In 1979 when I started my career, I was a programmer for (what is now) Texas State University’s Administrative Data Processing department. Paper reports were big, and even bigger at the university, where few humans in the Admin dept had access to a terminal.

A year later I went on to a much better-paying position (the university could get by with paying less than minimum wage) where I used COBOL on a Hewlett-Packard HP3000, which was my machine of choice for the rest of my professional programming career. UNFORTUNATELY, HP decided not to implement the Report Writer module (the Report Writer was an optional module in the COBOL standard).

I can remember banging my head on the wall because it was extremely boring to have to manage the details of writing a report by hand. As time went by I eventually completely forgot how the Report Writer even works, but I never forgot I would prefer to have it!

To write the test program for this post, I had to relearn the report writer. It really isn’t too difficult to do so. But you do have to learn quite a few things simultaneously to make it work.

Note, I had originally intended to also use SORT INPUT/OUTPUT PROCEDURES to sort the data as done in the prior post. My initial version of the program did that, but there was a lot more code than I wanted for an example. Given one most likely wouldn’t use COBOL SORT when extracting data from a database, I decided to forgo the COBOL SORT.


The GnuCOBOL manual (3.1) has an entire section (section 9) on how to use the Report Writer. I also found this tutorial quite useful.

COBOL Report Writer Feature

It is written for IBM MVT COBOL, but there are few differences between that and GnuCOBOL.

The GnuCOBOL FAQ also has a section on the report writer, with an example derived from the above tutorial.

Designing The Report

As with the prior example, I want to design a DVD rental history report but now that I’m using the report writer, it will have headings, counts, and footings. A real report.

As I was working on this project, I thought back to the president of the company I worked at long ago. He designed ALL reports and we programmers implemented them. The reports were the face of our company and he wanted them to look good. Indeed they did; his reports were probably the best I’ve seen, especially these days when many reports are just an afterthought.

When he gave us a report to implement it was on an official IBM Report Layout form like this (found at

Totally off topic, but notice the Carriage Control Tape column on the far left. When using old printers like the IBM 1403, you could advance to a particular line if there was a punch in that channel of the tape. Channel 1 was always top of form, and typically we used a tape with only that channel punched. But if you needed to print a report that was largely empty space, like maybe a check or a utility bill, you could very quickly slew to the line you needed by advancing to the appropriate channel. For example:


That old 1403 printer could print amazingly fast for something so large. And standing next to it was about like standing next to a Gatling gun (or so it seems to me now).

Popping the stack back to my original train of thought: My input data is this database:

I’ll use this query to extract the data:

    to_char(rental.return_date,'yyyymmdd') as 
from customer
inner join rental    on 
    customer.customer_id = rental.customer_id
left  join inventory on 
    rental.inventory_id  = inventory.inventory_id
left  join film      on 
    inventory.film_id    = film.film_id
order by customer.last_name, customer.first_name, 

and will produce a report that looks like this:

11/12/2020                                               PAGE:  1
---------------------CUSTOMER HISTORY REPORT---------------------

------------NAME------------ CUST                      ---DATE---
-----LAST------ ---FIRST---- -ID- -----DVD TITLE------ -RETURNED-

I will want to report the total DVDs each customer has rented, the total number of DVDs rented, and the total number of customers reported.

A Simpler Report Writer Program First

It took me some time to get my head around the operation of the report writer. It’s not hard, just different. Nearly everything is specified in the DATA DIVISION, not the PROCEDURE DIVISION.

I started by writing a program that didn’t do report control breaks. Omitting the control footings makes the reporting easier to understand.

The program source can be found at

Here are notable parts of the code, with comments:

Below is the output file that will contain the report. LINE SEQUENTIAL indicates when each line is written it should be terminated with the appropriate line terminator for the operating system being used.

         ASSIGN TO               "./reportWriter01.lst",
     REPORT IS                   RF-REPORT.

These fields are used to format dates:



 01  DB-REC.
     03  DB-CUSTID               PIC 9(9).
     03  DB-LASTNAME             PIC X(45).
     03  DB-FIRSTNAME            PIC X(45).
     03  DB-FILMTITLE            PIC X(45).
     03  DB-RETURNDATE           PIC 99999999.


     03  TF-DATE-IN.
         05  TF-YY               PIC 9999.
         05  TF-MM               PIC 99.
         05  TF-DD               PIC 99.
     03  TF-DATE-OUT             PIC X(10).    
     03  TF-RUNDATE-IN.
         05  TF-RUNDATE-YY       PIC 9999.
         05  TF-RUNDATE-MM       PIC 99.
         05  TF-RUNDATE-DD       PIC 99.
     03  TF-RUNDATE-OUT          PIC X(10).    
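All the TF-DATE fields accomplish is splitting a yyyymmdd value into year, month, and day so it can be re-emitted as mm/dd/yyyy. For comparison, the same reformat in Python is a one-liner (a sketch, not part of the COBOL program):

```python
def yyyymmdd_to_us(date: str) -> str:
    # Rearrange "yyyymmdd" into "mm/dd/yyyy", e.g. for report columns.
    return f"{date[4:6]}/{date[6:8]}/{date[0:4]}"
```

In the COBOL version, moving the database value into TF-DATE-IN populates TF-YY, TF-MM, and TF-DD automatically via the redefinition of the group item; the slicing above does the same job explicitly.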

At the end of the DATA DIVISION is the REPORT SECTION which will describe (usually) everything needed to produce the report.

  • Page Limit: Number of lines per page.
  • Heading: line upon which the first header line is printed.
  • First Detail: line upon which the first detail line is printed.
  • Last detail: line upon which the last detail line of the page can be printed.

    PAGE LIMIT                  66 LINES,
    HEADING                     1,
    FIRST DETAIL                7,
    LAST DETAIL                 60.

This next section defines the header lines to be printed. The first line is on the absolute position of LINE 1, then each line after is placed on the next physical line (LINE PLUS 1) or 2 lines down (LINE PLUS 2).

Within each line are column definitions. For this report I’m specifying exact column placement. SOURCE indicates this column will contain the specified field using the specified PIC. So the report’s run date starts in column 1, comes from TF-RUNDATE-OUT and will take 20 characters.

The constant “PAGE” is placed at column 58. Note that you do not have to use PIC for values. The next example will show a more concise page definition.

    03  LINE 1.
        05  COLUMN 1            PIC X(20),
                SOURCE TF-RUNDATE-OUT.
        05  COLUMN 58           PIC X(6),
                VALUE "PAGE: ".
        05  COLUMN 64           PIC Z9,
                SOURCE PAGE-COUNTER.
    03  LINE PLUS 1.
        05  COLUMN 1            PIC X(21),
                VALUE ALL "-".
        05  COLUMN 22           PIC X(23),
                VALUE "CUSTOMER HISTORY REPORT".
        05  COLUMN 45           PIC X(21),
                VALUE ALL "-".
    03  LINE PLUS 2.
        05  COLUMN 1            PIC X(28),
                VALUE "------------NAME------------".
        05  COLUMN 30           PIC XXXX,
                VALUE "CUST".
        05  COLUMN 56           PIC X(10),
                VALUE "---DATE---".
    03  LINE PLUS 1.
        05  COLUMN 1            PIC X(15),
                VALUE "-----LAST------".
        05  COLUMN 17           PIC X(12),
                VALUE "---FIRST----".
        05  COLUMN 30           PIC XXXX,
                VALUE "-ID-".
        05  COLUMN 35           PIC X(20),
                VALUE "-----FILM TITLE-----".
        05  COLUMN 56           PIC X(10),
                VALUE "-RETURNED-".

The detail line is laid out in the same manner. Each field is SOURCEd from the database record (except the date).

    03  LINE PLUS 1.
        05  COLUMN 1            PIC X(15),
                SOURCE DB-LASTNAME.
        05  COLUMN 17           PIC X(12),
                SOURCE DB-FIRSTNAME.
        05  COLUMN 30           PIC ZZZ9,
                SOURCE DB-CUSTID.
        05  COLUMN 35           PIC X(20),
                SOURCE DB-FILMTITLE.
        05  COLUMN 56           PIC X(10),
                SOURCE TF-DATE-OUT.

In the procedure division, the RUN DATE is derived from the system date:

        TF-RUNDATE-YY               INTO TF-RUNDATE-OUT.

After the cursor is set up, we are ready to begin reading records and printing them. DON’T forget to open the report file (I forgot at first; no error is generated, but I couldn’t find any output).

The INITIATE verb initiates the report.


This is the “heart” of printing. Each line is read from the database, and we simply GENERATE the DETAIL line to print the report – it handles all of the details of printing for us.

        EXEC SQL 
            FETCH C1 INTO 

We’ve read all of the data, so terminate the report, and close it.


As you can see, ALL of the work of generating the report is setting up the REPORT SECTION. Even without the report writer you still have to define how the output will appear, so there isn’t much extra work necessary to use the report writer.

Compile and run:

$export COBCPY=~/Open-COBOL-ESQL-1.2/copy
$export COB_LDFLAGS=-Wl,--no-as-needed
$ocesql reportWriter01.cob reportWriter01.tmp
precompile start: reportWriter01.cob
              LIST OF CALLED DB Library API            
$cobc  -locesql -x reportWriter01.tmp

Excerpts from the output file:

less reportWriter01.lst 

11/11/2020                                               PAGE:  1
---------------------CUSTOMER HISTORY REPORT---------------------

------------NAME------------ CUST                      ---DATE---
-----LAST------ ---FIRST---- -ID- -----FILM TITLE----- -RETURNED-

Abney           Rafael        505 Sagebrush Clueless   05/29/2005
Abney           Rafael        505 Pocus Pulp           06/05/2005
Abney           Rafael        505 Legally Secretary    06/19/2005
Abney           Rafael        505 Nightmare Chill      06/20/2005
Abney           Rafael        505 Trading Pinocchio    06/28/2005
Abney           Rafael        505 Coneheads Smoochy    06/28/2005
Abney           Rafael        505 Wanda Chamber        07/12/2005
Abney           Rafael        505 Madness Attacks      07/14/2005
Abney           Rafael        505 Conquerer Nuts       07/14/2005


Adams           Kathleen       36 Go Purple            06/20/2005
Adams           Kathleen       36 Betrayed Rear        07/10/2005
Adams           Kathleen       36 Room Roman           07/11/2005

11/11/2020                                               PAGE:  2
---------------------CUSTOMER HISTORY REPORT---------------------

------------NAME------------ CUST                      ---DATE---
-----LAST------ ---FIRST---- -ID- -----FILM TITLE----- -RETURNED-

Adams           Kathleen       36 Boogie Amelie        07/12/2005
Adams           Kathleen       36 Swarm Gold           07/12/2005
Adams           Kathleen       36 Amadeus Holy         07/16/2005


11/11/2020                                               PAGE: 98
---------------------CUSTOMER HISTORY REPORT---------------------

------------NAME------------ CUST                      ---DATE---
-----LAST------ ---FIRST---- -ID- -----FILM TITLE----- -RETURNED-

Young           Cynthia        28 Ice Crossing         08/23/2005
Young           Cynthia        28 Saddle Antitrust     08/24/2005
Young           Cynthia        28 Lebowski Soldiers    08/27/2005
Young           Cynthia        28 Loverboy Attacks     08/27/2005
Young           Cynthia        28 Attacks Hate         08/28/2005
Young           Cynthia        28 Suspects Quills      00/00/0000

The Final Version of the Program

The above report handles the header and detail lines great. Now I want to add in control breaks to report the number of DVDs each customer has rented, and at the end the total number of DVDs rented and the total number of customers reported.

Unfortunately there were no dollar amounts in this data upon which to report. The report writer can handle totaling detail line amounts with almost no more work than the above report by just using the SUM clause.

Instead, I want to count records which will add just a slight bit more complexity.

The source to this program can be found at

Here are notable parts of the code with comments:




I’m going to need CS-1, a constant of ONE, added once for each DVD detail line printed. I’m also going to need a counter to track the total number of customers reported.

     03  CS-1                    PIC S9(4), COMP     VALUE 1.
     03  CT-CUSTS                PIC S9(9), COMP     VALUE ZERO.


I made a couple of changes for the database record read. I added DB-CUSTNAME. I need to know if either DB-LASTNAME or DB-FIRSTNAME changes, so I grouped them into DB-CUSTNAME.

I also altered how I handle DB-RETURNDATE. I want to STRING month, day, year together but OCESQL requires that the field containing the date from the database be an elementary item. To more cleanly handle this, I REDEFINE DB-RETURNDATE which allows access to the individual fields.

 01  DB-REC.
     03  DB-CUSTID               PIC 9(9).
     03  DB-CUSTNAME.
         05  DB-LASTNAME         PIC X(45).
         05  DB-FIRSTNAME        PIC X(45).
     03  DB-DVDTITLE             PIC X(45).
     03  DB-RETURNDATE           PIC 99999999.
     03  DB-RETURNDATE-R         REDEFINES DB-RETURNDATE.  *> name assumed
         05  DB-YYYY             PIC 9999.
         05  DB-MM               PIC 99.
         05  DB-DD               PIC 99.

* ---------------------------------------------------------------

In the RD, I now have the controls FINAL and DB-CUSTNAME. Every time DB-CUSTNAME changes a control break occurs. Also at the end of the report (FINAL) a control break occurs.

  RD  RF-REPORT,                  
     CONTROLS ARE                FINAL, DB-CUSTNAME,     
     PAGE LIMIT                  60 LINES,
     HEADING                     1,
     FIRST DETAIL                7,
     LAST DETAIL                 60.

I made a slight change to the header. Line 1 now contains a form-feed character, which forces each page onto a new sheet on pretty much any printer.

     03  LINE 1.                 *> *** FORMFEED
         05  COLUMN 1            VALUE X'0C'.
     03  LINE PLUS 1.
         05  COLUMN 1            PIC X(20),
             SOURCE TF-RUNDATE-OUT.
         05  COLUMN 58           VALUE "PAGE: ".
         05  COLUMN 64           PIC Z9,
             SOURCE PAGE-COUNTER.

In this report, I drop absolute column positions (except COLUMN 1) and use relative positions (PLUS n). In this next section, each field is adjacent to the next, so I use PLUS 1. Typically I want a space between columns, and in the detail line you will see everything set to PLUS 2.

Note that the PIC clause can be omitted as well. When omitted, the compiler derives the field length from the VALUE clause.

     03  LINE PLUS 1.
         05  COLUMN 1            PIC X(21),
             VALUE ALL "-".
         05  COLUMN PLUS 1       VALUE "CUSTOMER HISTORY REPORT".
         05  COLUMN PLUS 1       PIC X(21),
             VALUE ALL "-".
     03  LINE PLUS 2.
         05  COLUMN 1            VALUE "------------".
         05  COLUMN PLUS 1       VALUE "NAME------------".
         05  COLUMN PLUS 2       VALUE "CUST".
         05  COLUMN 56           VALUE "---DATE---".
     03  LINE PLUS 1.
         05  COLUMN 1            VALUE "-----LAST------".
         05  COLUMN PLUS 2       VALUE "---FIRST----".
         05  COLUMN PLUS 2       VALUE "-ID-".
         05  COLUMN PLUS 2       VALUE "-----DVD TITLE------".
         05  COLUMN PLUS 2       VALUE "-RETURNED-".

The DETAIL-LINE is very nearly like the last report’s. Each column contains the appropriate PIC clause to format the data, a SOURCE clause indicating where to obtain the data, and a relative column position.

Note the use of GROUP INDICATE. This clause causes the associated field to print only on the first detail line of each control group (and again at the top of each new page). This makes the report much easier to read and saves some ink as well.

     03  LINE PLUS 1.
         05  COLUMN 1            PIC X(15),
             SOURCE DB-LASTNAME,
             GROUP INDICATE.                 *> PRINTS ONCE PER GROUP
         05  COLUMN PLUS 2       PIC X(12),
             SOURCE DB-FIRSTNAME,
             GROUP INDICATE.
         05  COLUMN PLUS 2       PIC ZZZ9,
             SOURCE DB-CUSTID,
             GROUP INDICATE.
         05  COLUMN PLUS 2       PIC X(20),
             SOURCE DB-DVDTITLE.
         05  COLUMN PLUS 2       PIC X(10),
             SOURCE TF-DATE-OUT.

This is the footing group that will print at the end of each customer’s group. It consists simply of a label and the number of DVDs rented.

The DVD count is obtained by using SUM CS-1. This will add 1 to an internal counter for each detail line printed for the customer. Had the database contained an amount field, say DB-AMOUNT, you could use SUM DB-AMOUNT and get the total amount for all records.

     03  LINE PLUS 1.
         05  COLUMN 35           VALUE "---CUSTOMER RENTALS:".
         05  COLUMN 61           PIC Z,ZZ9,
             SUM CS-1.           *> *** ADDING 1 PER RECORD
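For comparison, had the table carried a hypothetical DB-AMOUNT column, the same footing could total dollars rather than counts:

         05  COLUMN 57           PIC $$$,$$9.99,
             SUM DB-AMOUNT.      *> hypothetical amount field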

Here is the report totals print group (FINAL FOOTING).

The total DVDs rented is obtained in the same manner as above, by SUMming CS-1.

The customer count has to be calculated manually.

     03  LINE PLUS 2.
         05  COLUMN 35           VALUE "------TOTAL RENTALS:".
         05  COLUMN 59           PIC ZZZ,ZZ9,
             SUM CS-1.
     03  LINE PLUS 2.
         05  COLUMN 35           VALUE "----TOTAL CUSTOMERS:".
         05  COLUMN 59           PIC ZZZ,ZZ9,
             SOURCE CT-CUSTS.

* ---------------------------------------------------------------

Here is how the customer count is calculated, in the DECLARATIVES.

This bit of code is executed before each CUST-TOTAL report footing (i.e., at the end of each customer’s group).

It simply adds 1 to CT-CUSTS to maintain a running count of customers encountered during the report print.

     ADD 1                       TO CT-CUSTS.
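The surrounding DECLARATIVES boilerplate looks roughly like this (the section and paragraph names here are assumed; USE BEFORE REPORTING is what ties the code to the CUST-TOTAL footing):

 PROCEDURE DIVISION.
 DECLARATIVES.
 COUNT-CUSTOMERS SECTION.
     USE BEFORE REPORTING CUST-TOTAL.
 COUNT-CUSTOMERS-PARA.
     ADD 1                       TO CT-CUSTS.
 END DECLARATIVES.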


The report is generated in the same manner (with a slight change in how I used STRING to generate the date).


         STRING DB-MM, "/", DB-DD, "/", DB-YYYY
             INTO                TF-DATE-OUT;
         EXEC SQL 
             FETCH C1 INTO 



To compile and run the final report:

$ export COBCPY=~/Open-COBOL-ESQL-1.2/copy
$ export COB_LDFLAGS=-Wl,--no-as-needed
$ ocesql reportWriter02.cob reportWriter02.tmp
precompile start: reportWriter02.cob
              LIST OF CALLED DB Library API
$ cobc -locesql -x reportWriter02.tmp

Excerpts from the output file:

11/12/2020                                               PAGE:  1
---------------------CUSTOMER HISTORY REPORT---------------------

------------NAME------------ CUST                      ---DATE---
-----LAST------ ---FIRST---- -ID- -----DVD TITLE------ -RETURNED-
Abney           Rafael        505 Sagebrush Clueless   05/29/2005
                                  Pocus Pulp           06/05/2005
                                  Legally Secretary    06/19/2005
                                  Nightmare Chill      06/20/2005
                                  Trading Pinocchio    06/28/2005
                                  Coneheads Smoochy    06/28/2005
                                  Wanda Chamber        07/12/2005
                                  Madness Attacks      07/14/2005
                                  Conquerer Nuts       07/14/2005
                                  Double Wrath         07/16/2005
                                  Goodfellas Salute    07/20/2005
                                  Hobbit Alien         08/05/2005
                                  Shock Cabin          08/06/2005
                                  Karate Moon          08/08/2005
                                  Juggler Hardly       08/10/2005
                                  Strictly Scarface    08/20/2005
                                  Blackout Private     08/23/2005

                                  Freddy Storm         08/28/2005
                                  Chocolat Harry       08/28/2005
                                  Clash Freddy         08/28/2005
                                  Conversation Downhil 00/00/0000
                                  ---CUSTOMER RENTALS:         21

Adam            Nathaniel     504 Kiss Glory           05/31/2005
                                  Gathering Calendar   06/04/2005
                                  Noon Papi            06/06/2005
                                  Guys Falcon          06/26/2005
                                  Shepherd Midsummer   06/27/2005
                                  Ending Crowds        07/12/2005
                                  Hanging Deep         07/13/2005
                                  Chasing Fight        07/15/2005
                                  Something Duck       07/15/2005
                                  Nemo Campus          07/18/2005
                                  Poseidon Forever     07/30/2005
                                  Divorce Shining      07/30/2005
                                  Jason Trap           08/01/2005
                                  Sleuth Orient        08/02/2005
                                  Tramp Others         08/03/2005
                                  Tights Dawn          08/04/2005
                                  Rocky War            08/07/2005
                                  Amadeus Holy         08/10/2005
                                  Lust Lock            08/21/2005
                                  Wardrobe Phantom     08/22/2005
                                  Menagerie Rushmore   08/24/2005
                                  Analyze Hoosiers     08/24/2005
                                  Dancing Fever        08/25/2005
                                  Boogie Amelie        08/25/2005
                                  Orient Closer        08/28/2005
                                  War Notting          08/28/2005
                                  Freddy Storm         08/30/2005
                                  Strangers Graffiti   08/31/2005
                                  ---CUSTOMER RENTALS:         28

11/12/2020                                               PAGE:  2
---------------------CUSTOMER HISTORY REPORT---------------------

------------NAME------------ CUST                      ---DATE---
-----LAST------ ---FIRST---- -ID- -----DVD TITLE------ -RETURNED-
Adams           Kathleen       36 Orange Grapes        05/28/2005
                                  Alone Trip           06/01/2005
                                  Go Purple            06/20/2005
                                  Betrayed Rear        07/10/2005
                                  Room Roman           07/11/2005
                                  Boogie Amelie        07/12/2005
                                  Swarm Gold           07/12/2005
                                  Amadeus Holy         07/16/2005
                                  Sling Luke           07/30/2005
                                  Pianist Outfield     08/01/2005
                                  Seabiscuit Punk      08/01/2005
                                  Women Dorado         08/02/2005
                                  Wash Heavenly        08/02/2005
                                  Treatment Jekyll     08/03/2005


11/12/2020                                               PAGE: 40
---------------------CUSTOMER HISTORY REPORT---------------------

------------NAME------------ CUST                      ---DATE---
-----LAST------ ---FIRST---- -ID- -----DVD TITLE------ -RETURNED-
Young           Cynthia        28 Ship Wonderland      05/31/2005
                                  Star Operation       06/17/2005
                                  Dying Maker          06/18/2005
                                  Banger Pinocchio     06/23/2005
                                  Odds Boogie          06/25/2005
                                  Virginian Pluto      06/26/2005
                                  Wolves Desire        07/09/2005
                                  Kick Savannah        07/10/2005
                                  Deceiver Betrayed    07/12/2005
                                  Dalmations Sweden    07/16/2005
                                  Murder Antitrust     07/16/2005
                                  Papi Necklace        07/18/2005
                                  Spirit Flintstones   07/18/2005
                                  Trading Pinocchio    08/01/2005
                                  Wars Pluto           08/02/2005
                                  Lawless Vision       08/03/2005
                                  Clueless Bucket      08/03/2005
                                  Birch Antitrust      08/05/2005
                                  Easy Gladiator       08/05/2005
                                  License Weekend      08/05/2005
                                  Fiction Christmas    08/08/2005
                                  Candidate Perdition  08/09/2005
                                  Translation Summer   08/19/2005
                                  Minds Truman         08/21/2005
                                  Beverly Outlaw       08/21/2005
                                  Ice Crossing         08/23/2005
                                  Saddle Antitrust     08/24/2005
                                  Lebowski Soldiers    08/27/2005
                                  Loverboy Attacks     08/27/2005
                                  Attacks Hate         08/28/2005
                                  Suspects Quills      00/00/0000
                                  ---CUSTOMER RENTALS:         32

                                  ------TOTAL RENTALS:     16,044

                                  ----TOTAL CUSTOMERS:        599

Being the paranoid programmer I am, I compared all the totals against the database, and they match!

This concludes my foray into GnuCOBOL!

Posted in c-gnuCOBOL | 1 Comment