
De Omnibus Dubitandum - Lux Veritas

Wednesday, March 20, 2019

Keeping Track Of Toxic Exposure

January 28, 2019 By Michael D. Shaw @ HealthNewsDigest.com       
 
It was William Thomson, 1st Baron Kelvin, aka Lord Kelvin, who said, “When you can measure what you are speaking about, and express it in numbers, you know something about it; when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science.”
Few would argue with his contention, and these days countless measuring devices are deployed in every conceivable environment. Ever since I was a student, it has been understood that temperature is the most measured variable. But one wonders…is that still true? How many millions of people frequently test their blood glucose; how many millions of websites measure their traffic every day?

Monitoring of exposure to air contaminants in the world of public health began around 100 years ago, was energized by the creation of agencies such as EPA and OSHA, and now occurs in virtually every country. In the US, OSHA promulgates Permissible Exposure Limits (PELs) for more than 500 substances. Additional standards are published by other agencies and non-governmental organizations.

Per 29 CFR 1910.1000(a)(2), these PELs–for the most part–are based on an 8-hour time-weighted average of the worker’s exposure to the substance. Thus, not only must the concentrations be measured, but the data must be archived and processed such that the appropriate average is calculated.
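For readers who want the arithmetic, the standard’s time-weighted average formula is simple enough to sketch in a few lines of Python. The substance, sample durations, and PEL value below are purely illustrative and not drawn from any particular OSHA table.

```python
# Minimal sketch of the 8-hour time-weighted average (TWA):
# E = (Ca*Ta + Cb*Tb + ... + Cn*Tn) / 8,
# where C is a measured concentration and T is the duration of that
# exposure in hours. Sample data and the PEL value are illustrative only.

def eight_hour_twa(samples):
    """samples: list of (concentration_ppm, duration_hours) tuples."""
    exposure = sum(c * t for c, t in samples)
    return exposure / 8.0  # the regulation divides by 8 hours

# Hypothetical shift: mostly low readings with one elevated period.
shift = [(0.5, 3.0), (2.0, 1.0), (0.4, 4.0)]   # 8 hours total
pel = 1.0                                      # illustrative PEL, ppm

twa = eight_hour_twa(shift)
status = "exceeds" if twa > pel else "is within"
print(f"8-hour TWA = {twa:.2f} ppm, which {status} the PEL of {pel} ppm")
```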

But here’s the problem. The PELs date back to the early days of OSHA, which was founded in 1971. Even then, measuring instruments did exist for some of the specified substances, and virtually all of them provided an output that–in theory–could be captured by some sort of recording device. Microcomputers, however, were still a decade away. Expensive data loggers were introduced in the early 1970s, but did not see widespread use in the occupational health market.

As a consequence, before the advent of microprocessors, most 8-hour averages were based on wet chemical analytical methods, whereby an air sample was collected over a period of time and the “single number” concentration thus determined became the average. However, no time history was available, and even though the average might be below the PEL, there may have been episodes of high exposure that were literally averaged out. For many substances, in addition to the PEL, there are also limits on short-term exposure.
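A hypothetical series of one-minute readings makes the point: the brief spike below pushes a 15-minute window well past an assumed short-term limit, yet the full-shift average still comes in under the assumed PEL. All of the concentrations and limits are made up for illustration.

```python
# Sketch of how a single full-shift average can hide short-term peaks.
# One-minute readings are assumed; the spike, PEL, and STEL values are
# hypothetical and not taken from any actual OSHA standard.

readings = [0.3] * 480            # 8 hours of 1-minute readings, ppm
readings[240:255] = [6.0] * 15    # a 15-minute excursion mid-shift

pel_8hr = 1.0     # hypothetical 8-hour PEL, ppm
stel_15min = 5.0  # hypothetical 15-minute short-term exposure limit, ppm

full_shift_avg = sum(readings) / len(readings)

# Worst 15-minute rolling average, the kind of check a short-term limit requires.
worst_15min = max(
    sum(readings[i:i + 15]) / 15 for i in range(len(readings) - 14)
)

print(f"8-hour average : {full_shift_avg:.2f} ppm (PEL {pel_8hr} ppm)")
print(f"Worst 15-minute: {worst_15min:.2f} ppm (STEL {stel_15min} ppm)")
# The shift average stays below the PEL even though the short-term limit was exceeded.
```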

Before microcomputers, if a facility had continuous gas detection instruments installed–providing instantaneous readings along with user-adjustable alarms–OSHA did not press too hard on the 8-hour average matter. Besides, many companies also had wet chemical analysis reports on file for the analytes in question.

In 1984, Interscan introduced the DOS version of its Arc-Max® package–one of the first data acquisition products intended to support PEL compliance. The first customers were in the healthcare field, mostly monitoring ethylene oxide, a sterilant.

There have been many updates to Arc-Max since then, and it is still a popular product, providing data acquisition, archiving, and reporting to a long list of end-users.

Software packages such as Arc-Max have traditionally been deployed on local or networked computers, with true Cloud data logging being the rare exception. The advantage of a Web-based program is that users can “watch their sensors on their phone,” or on any other compatible device. While such packages exist, they are usually costly and proprietary, in that they cannot easily accept data from arbitrary sensors.

Interscan is now alpha testing a Web-based package that will accommodate any sensor with a standard output, and will support most communications protocols. Beta testing will involve end-users in the healthcare and pharmaceutical industries. Sensor data is sent to a platform hosted on Amazon Web Services, with various levels of access available. Averaging and trending are calculated in the Cloud, with the report dashboard available at a Web portal.
More information on this new package will be released as it becomes available.
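Interscan has not published the interface for this package, so the sketch below is only a generic illustration of how a monitor might push readings to a cloud endpoint of this kind, using the common Python requests library. The endpoint URL, API key, and field names are all assumptions, not the product’s actual API.

```python
# Generic sketch of client-side cloud data logging: a monitor posts
# timestamped readings as JSON over HTTPS, and averaging/trending happen
# server-side. Every URL, header, and field name here is hypothetical.

import time
import requests

ENDPOINT = "https://example.execute-api.us-east-1.amazonaws.com/prod/readings"  # hypothetical
API_KEY = "replace-me"                                                           # hypothetical

def post_reading(sensor_id, concentration_ppm):
    """Send one reading to the (hypothetical) cloud ingestion endpoint."""
    payload = {
        "sensor_id": sensor_id,
        "timestamp": int(time.time()),      # Unix seconds
        "concentration_ppm": concentration_ppm,
    }
    resp = requests.post(
        ENDPOINT,
        json=payload,
        headers={"x-api-key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    # One illustrative reading; a real monitor would loop on its sample interval.
    post_reading("eto-monitor-01", 0.42)
```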
