Water Quality Monitoring

Water is a critical component of a wide variety of industrial processes. It can be used as an ingredient in a final product, as a cleaning agent to prepare equipment or materials for use or reuse, and as a reagent for analytical purposes. Because it is used in so many ways, it is important to consider water quality in terms of the presence of microorganisms. While it is tempting to presume that water is not subject to microbial contamination in the same way as more nutrient-rich or hospitable substrates are, this is not the case: in fact, microorganisms can grow and propagate in pure, and even ultrapure, water.

There are several ways in which microorganisms growing within a water purification system can impact water quality. For example:

  • They produce undesirable metabolites such as pyrogens, RNase, and other enzymes that can adversely affect the product that the water is used in.

  • They can grow through 0.22-µm filters, thereby impairing the ability of the filters to remove microorganisms from the water.

  • At sufficiently high titers, many microorganisms can form biofilms that are highly resistant to elimination, can promote the growth and proliferation of pathogens, and may periodically detach and be released into the water stream, resulting in sudden peaks of contamination.

Contamination of water affects not only analytical techniques that the water is used in, but also secondary items and processes such as culture media preparation and equipment/glassware cleaning. Because of this, it is important to monitor water quality to ensure that the water's cleanliness and microbial contamination levels are acceptable for its intended purpose.

Water quality requirements

Many water purification systems are monitored continuously for a variety of water quality indicators, and automatic alerts are triggered by the system when out-of-the-ordinary levels are detected.

The settings used to define what level of water quality is acceptable can vary depending on the sensitivity of the application and the level of quality needed. Water used in industrial, clinical, or laboratory settings is typically classified as being one of three types, as follows:

  • Type 1, or ultrapure, water is the most pure type and is suitable for use in HPLC, cell culture, mass spectrometry, and other applications that require a very high level of purity.

  • Type 2, or pure, water is typically used for making media, buffers, and other solutions that are less sensitive to water quality than those listed above.

  • Type 3, or primary grade, water can be used when cleanliness is required but high levels of precision are not, such as for filling water baths and cleaning glassware.

Somewhat confusingly, different water quality standards use similar terminology to denote slightly different things. One of the most widely used standards is the one delineated by ASTM International. The most recent version of the defining document, ASTM D1193-06(2018), was published in 2018. ASTM International defines water purity using a scale ranging from Type I to Type IV that takes into account a variety of factors such as conductivity, pH, and mineral content. It also provides a substandard that assesses the level of microbial contamination based on the amounts of heterotrophic bacteria and endotoxin (a bacterial byproduct) that the water contains.

The other most commonly used scale is the ISO standard for water for analytical laboratory use, ISO 3696:1987, which was most recently reviewed in 2018. The ISO scale defines Grade 1 to Grade 3 water qualities, with Grade 1 being the highest. Similar to the ASTM standard, the ISO standard assesses a variety of water properties; however, it does not specify appropriate or tolerable levels of microbial contamination for the different grades of water.

 

Water Purification System

 

Water treatment process and water quality monitoring

Purification systems employing a number of distinct purification technologies are used to achieve water quality levels in line with the standards described above. Ideally, the water purification system will address a range of contaminants, such as minerals, organic compounds, and particulates, as well as, of course, microorganisms.

To specifically eliminate microbial contamination of water, an optimal water treatment process will use techniques such as:

  • Reverse osmosis (RO), which forces water through a membrane with very small pores that block the transit of most bacteria and viruses.

  • Ultraviolet light, which kills bacteria and viruses directly (although this approach does not remove the killed microorganisms from the water). UV lamps emitting at 185 nm generate ozone, an oxidizer that kills microbes, from oxygen, while lamps emitting at 254 nm target the DNA of viruses, fungi, and bacteria.

  • Filtration, generally with a 0.22-µm filter, which removes bacteria from solution.

Maintaining a high flow by designing the purified water distribution as a recirculation loop is an efficient way of maintaining water quality. Preventing water stagnation limits the potential formation of biofilms.

Notably, ion exchange (typically mixed-bed, and either single-use or regenerable), which removes charged ions from solution, has no effect on microbes and can even, in some circumstances, be a source of microbial proliferation.

Unlike other contaminants like particulates or dissolved minerals, removing the majority of microorganisms is not a permanent step, as trace amounts can repopulate the water to generate detectable levels of contamination. Thus, systems should ideally be monitored continuously for the presence of microbes to ensure that the different techniques that are applied are effectively maintaining the microbial content at levels low enough that they will not have a negative impact on the final product.
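As a simple illustration of this kind of continuous check, the sketch below compares each monitoring sample against per-type limits. The threshold values and the `check_sample` helper are hypothetical; real limits must come from the applicable standard (e.g. ASTM D1193 or ISO 3696) and from the sensitivity of your own application.

```python
# Hypothetical alert limits, for illustration only.
LIMITS = {
    "type1": {"resistivity_min_mohm_cm": 18.0, "cfu_max_per_ml": 1},
    "type2": {"resistivity_min_mohm_cm": 1.0, "cfu_max_per_ml": 100},
    "type3": {"resistivity_min_mohm_cm": 0.05, "cfu_max_per_ml": 1000},
}

def check_sample(water_type, resistivity_mohm_cm, cfu_per_ml):
    """Return a list of alert messages for one monitoring sample."""
    limits = LIMITS[water_type]
    alerts = []
    if resistivity_mohm_cm < limits["resistivity_min_mohm_cm"]:
        alerts.append(
            f"{water_type}: resistivity {resistivity_mohm_cm} MOhm.cm below limit"
        )
    if cfu_per_ml > limits["cfu_max_per_ml"]:
        alerts.append(
            f"{water_type}: microbial count {cfu_per_ml} CFU/mL above limit"
        )
    return alerts

# A conforming Type 1 sample raises no alert; a degraded Type 2 sample raises two.
print(check_sample("type1", 18.2, 0))
print(check_sample("type2", 0.5, 500))
```

In practice such checks run on the purification system's own controller; the point here is only the shape of the logic: each indicator is compared against the limit set for the water type in use, and any excursion raises an alert.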

Unfortunately, the majority of existing monitoring systems are not designed to assess the level of microorganisms. Thus, the need to constantly maintain water purification systems, and to monitor microbial levels in particular, can represent a substantial burden for a busy organization. Fortunately, standalone monitoring systems are now available that are easy to use and do not require any specialized equipment or laborious manual recording. Integrating such a system into your water purification process could be the key to successfully preventing regrowth of microbial contaminants in storage tanks, distribution piping, and other equipment.

Conclusion

  • Water quality is critical for many applications, and microbial contamination is a serious threat to the quality of water used in a range of industrial processes.

  • The presence of microorganisms in a water purification system can impact the water quality in several ways, such as the production of undesirable metabolites, impairment of filter function, and formation of biofilms.

  • Many currently used water quality indicators can be measured continuously by the water purification system and trigger automatic alerts when levels exceed the specified limits. However, most commonly used indicators do not assess the level of microbial contamination in water purification systems, so an alternate approach is needed.

  • Maintaining the cleanliness and sterility of water purification systems, and monitoring microbial levels in particular, can be a burden for a busy organization.

  • Easy-to-use, stand-alone monitoring solutions that require neither specialized equipment nor manual record-keeping are now available and can record and analyze microbial counts to help prevent microbial regrowth in storage tanks and distribution piping.

Other articles:

Evolution of microbiology techniques

A brief history of microbiology

Where it all began

In the middle of a global pandemic caused by a novel virus, it can be hard to picture a world in which microorganisms are completely unknown. And yet, humankind lived in ignorance of our microscopic companions until 1683, when Antonie van Leeuwenhoek first spotted “animalcules” cavorting in drops of water. Using the most powerful magnifying lenses built to date, van Leeuwenhoek was able to see microorganisms, but it would be centuries before their role in human health, disease, food spoilage, and more would be understood.

Early events in the history of microbiology

Over the 200 years following van Leeuwenhoek’s discovery, incremental advances demonstrated more and more convincingly that tiny creatures that could not be seen with the naked eye did exist all around us, in every niche of the environment, and were responsible for a host of effects:

  • In 1765, Lazzaro Spallanzani investigated whether food spoilage occurs inherently as a property of the food itself, or rather is caused by some unknown factor present in the environment. To do this, he boiled broth and placed it in sealed or unsealed jars and found that only the broth in the unsealed jars became cloudy and spoiled, demonstrating that an environmental factor was responsible for this phenomenon.

  • In 1847 Ignaz Semmelweis recommended that doctors and surgeons wash their hands in dilute chlorine solutions to reduce the transmission of an as-yet-undefined harmful factor from autopsied corpses to living patients; this simple measure reduced the rate of mortality at his hospital four-fold within a year.

  • In 1849 John Snow single-handedly created the field of epidemiology by carefully analyzing patterns of cholera in London and concluding that, rather than arising from "miasma" (unhealthy air), this disease was in fact transmitted by an unknown factor in water, as it appeared to spread through the population from water-drawing sources.

  • In 1857, Louis Pasteur definitively debunked the theory of spontaneous generation, in which an airborne "life force" of some kind was thought to be responsible for contamination and spoilage, by performing an experiment similar to Spallanzani's but with specially designed swan-neck flasks. These flasks had a long, curved neck designed to trap environmental microbes and prevent them from entering the flask, while still allowing air to enter. Broth boiled in these flasks remained clear and unspoiled unless the swan neck was snapped off, thus proving that microorganisms, and not a life force in the air, were responsible for these effects.

Pasteur’s seminal experiments indisputably demonstrated a cause-and-effect link between microorganisms and food spoilage, and this milestone can be considered the birthday of microbiology as a science.

More recent milestones in the history of microbiology

As microbiology established itself as a discipline, a series of key inventions and discoveries led to its expansion and application:

  • In 1882, Robert Koch defined Koch’s postulates, which are the criteria used to define a causal link between a disease and a specific microorganism.

  • In 1884 Christian Gram reported a staining protocol, now referred to as Gram staining in his honor, that distinguishes broad classes of bacteria with different cell membrane composition. These two classes of bacteria are referred to as “Gram-positive” and “Gram-negative,” and Gram staining is routinely used in the clinic and in research settings today to characterize bacteria.

  • In 1892, Dmitri Ivanowski discovered the first known virus, tobacco mosaic virus. Until that point, all known infectious microorganisms were bacteria, so the discovery of a much smaller infectious factor that turned out to be only one example of a host of infectious agents was revolutionary.

  • In 1928, Alexander Fleming discovered penicillin, the first antibiotic. The serendipitous discovery of this substance, which is naturally produced by some molds, would eventually lead to an explosion in the number of naturally and artificially produced antibiotics that are now available for manipulating and controlling bacterial proliferation.

  • The last milestone of the 20th century is the sequencing of the first complete bacterial genome. In 1995, the genome of Haemophilus influenzae, a bacterium once mistakenly thought to cause influenza, was published by the Institute for Genomic Research, yielding unprecedented insight into a microbe at the molecular level.

Importantly, as the world of microbiology has continued to expand, we have learned more and more about the role of microbes beyond medical concerns; for example, the positive (fermentation) and negative (food safety) effects that microorganisms have in the food industry. These discoveries are facilitated in part by the increasing sophistication of the tools we have for identifying and describing microbes, as discussed in more detail in the following section.

The expanding taxonomy of microbes

In the early days of microbiology as a scientific discipline, different microorganisms were primarily distinguished by the physical appearance of the colonies that they formed on growth media. For example, an organism could be described by the general shape of the colony, whether round, filamentous, irregular, or rhizoid. Colonies could also be described in terms of how high they grew off the surface of the media, also known as their elevation; some common elevation patterns include raised, flat, or convex colonies. A third parameter that was often used to describe colony appearance is the shape of the colony edges, such as whether they are smooth (entire) or lobate.

The ability to observe microbe colony shape, morphology, and more was made possible by the development of pure culture techniques and the formulation of specific culture media. Later advances led to the ability to differentiate and identify microorganisms based on their metabolism, not just their appearance. Taxonomists put this new tool to work, and the number of known species grew from dozens to thousands, to millions, and now potentially to billions. These enhanced discovery tools accelerated not only the identification of species, but also the understanding of their consequences for human health and activities, their natural habitats, and their contamination routes. As modern technologies continue to develop, they can emulate, accelerate, and democratize this expertise.

Conclusion

For three centuries we have developed our ability to observe, characterize, and identify microorganisms. Contemporary molecular biology and other advanced analytical techniques have continued to shed new light on these early findings and will undoubtedly continue to generate masses of new information on the identity of microorganisms and how they interact with humankind and our environment.

Principles of pasteurization

Pasteurization is the treatment of a food or beverage product to make it safe for consumption and to improve its shelf life. Unlike sterilization, which uses high-temperature treatment to eliminate all microorganisms, resulting in a product that can be stored indefinitely at room temperature, pasteurization is carried out at lower temperatures and aims to reduce the overall microbial population to acceptable levels that can be maintained at refrigerated temperatures.

The main purpose of pasteurization is to reduce the "bioburden" of the product. The bioburden is defined as the number of contaminating organisms found in a given amount of material before undergoing a sterilizing or pasteurizing procedure. These organisms may include bacteria, yeast, and molds, all of which can contribute to food or beverage spoilage. Some microbial contaminants are also pathogenic and can cause illness when ingested, making it imperative to eliminate them from products intended for consumption.

 

Common heat-resistant microorganisms

Some of the most common microorganisms responsible for contamination within the food and beverage industry are well known, due to outbreaks and product recalls, and include, among others:

  • Salmonella, which causes the classic "food poisoning" and is linked to inadequate cleaning and sterilization;

  • Clostridium botulinum, which is often associated with canned goods that have been improperly preserved; and

  • Coliform bacteria such as E. coli, which are typically introduced through fecal contamination.

 

An awareness of the types of contaminating microbes that are likely to be present in your product is important, as microorganisms have varying levels of susceptibility to heat-mediated killing. Generally speaking, Gram-positive bacteria are more resistant to heat than Gram-negative bacteria such as the three examples cited above, so if your product has a history of contamination with Gram-positive bacteria, a more intense pasteurization process may be needed. Similarly, bacteria that form spores are more resistant to heat than those that do not, and require higher temperatures and/or longer pasteurization times to adequately eliminate.

Another factor to consider is the nature of the product itself, as product composition can affect how easily or quickly microbial contaminants are killed. For example, the pH of a solution can have a marked effect on the speed of bacterial killing, with pasteurization being more effective at high and low pHs and less efficient at mid-range pHs. The presence of oil, fat, and other substances in the product can also affect the efficiency of pasteurization processes, highlighting the importance of designing a pasteurization protocol that is specific to and appropriate for your product.

 

How to determine the optimal pasteurization settings

The principle of the pasteurization procedure resides in reducing the bioburden by a defined number of logs by applying heat for a defined amount of time. A key parameter in designing a pasteurization process is therefore the optimal duration of heat application needed to achieve the desired level of microbial killing.

 

 

Generally speaking, the treatment time (T) is calculated from the following parameters:

i) The D value: the time needed to reduce the bioburden of the most resistant organism in the solution by 1 log (or, alternatively, the time needed to reduce the maximum bioburden in the product by 1 log);

ii) The initial bioburden (B) of the product; and

iii) The target number (N) of survivors after pasteurization (N = 1 when essentially no survivors are tolerated).

T = D value x log (B/N)
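As a worked sketch of this calculation, with all numbers purely hypothetical, the treatment time for a given D value, initial bioburden B, and target survivor count N can be computed as:

```python
import math

def pasteurization_time(d_value_min, initial_bioburden, target_survivors=1,
                        safety_factor_logs=0.0):
    """T = D x log10(B / N), plus an optional safety factor (SF) in logs.

    d_value_min        -- D value: minutes at the process temperature for a
                          1-log (90 %) reduction of the reference organism
    initial_bioburden  -- B: starting count in the product
    target_survivors   -- N: acceptable number of survivors (N = 1 for
                          essentially no survivors)
    safety_factor_logs -- extra log reductions added as a safety buffer
    """
    log_reduction = math.log10(initial_bioburden / target_survivors)
    return d_value_min * (log_reduction + safety_factor_logs)

# Hypothetical process: D = 0.5 min, B = 1e6, N = 1, SF = 2 logs
# -> 0.5 x (6 + 2) = 4.0 minutes of treatment
print(pasteurization_time(0.5, 1e6, 1, safety_factor_logs=2))
```

Note how the safety factor discussed below simply adds extra log reductions on top of the strictly necessary treatment time.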

 

 

Once this calculation has been made, it is important to add a safety factor (SF), which is an additional period of treatment beyond what was calculated to be strictly necessary, as an extra buffer against residual contamination. As a general rule of thumb, an SF of 6 logs is considered sterilization.

When selecting a target SF, keep in mind that an intermediate value is probably best, as aiming for a high SF can result in undesirable degradation of the product; in the case of food and beverage products, applying too much heat or for too long can negatively affect their sensory properties, such as color, taste, and smell.

Aiming for an overly high SF can also reduce the efficiency of the process, as it requires more energy and time than a briefer pasteurization step. On the other hand, it is important not to underestimate the optimal SF, as applying inadequate heat or for too short a time can result in residual contamination of the product.

The best way to go about defining the appropriate SF for your product and ensuring that the pasteurization process is achieving its intended goal is through process monitoring.

 

Monitoring pasteurization efficacy

Monitoring process input and output is crucial for ensuring that the selected pasteurization conditions are performing adequately and that no unexpected contamination events take place throughout the process or over time. Regularly testing the production chain at different stages can provide a snapshot of the overall level of microbes in the product, as well as sound an early warning in case of unanticipated issues. 

The first step is to monitor and record the bioburden of the original, unpasteurized, product: that is, to determine how many microbes are going into the process. To accomplish this, we recommend routinely testing products before they enter the process and maintaining a complete and detailed log of the prepasteurization data history.

After the pasteurization process is complete, the product should be monitored systematically to ensure that the target bioburden reduction has been achieved and that the selected parameters are still appropriate and effective. As with the prepasteurization levels, it is important to maintain a clear record of these values as well, to have a comprehensive picture of the process performance over time. If the product will be subjected to other steps post-pasteurization, such as packaging, it is advisable to also monitor the packaging environment and equipment to check that the product does not become recontaminated after the pasteurization step.
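One way to verify the target reduction, sketched below with hypothetical numbers, is to compute the achieved log reduction directly from the recorded pre- and post-pasteurization counts:

```python
import math

def achieved_log_reduction(pre_cfu, post_cfu):
    """Log reduction achieved between pre- and post-pasteurization counts.

    Counts below the detection limit should be entered as the detection
    limit itself, which makes the result a conservative lower bound.
    """
    return math.log10(pre_cfu / post_cfu)

# Hypothetical monitoring record: 1e6 CFU before, 1e2 CFU after -> 4 logs
print(achieved_log_reduction(1e6, 1e2))
```

Comparing this value against the designed log reduction (including the safety factor) over successive batches gives a simple trend indicator of process performance.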

Regular monitoring of the production environment, processes, and equipment is key to improving process efficiency and product conformity, as well as ensuring that your industrial processes are safe and produce products of consistent quality.

 

Conclusion

  • The aim of pasteurization is to reduce the bioburden of a product through heat-mediated microbial killing. Designing an optimized pasteurization process will help not only maximize product appeal and avoid spoilage, but will also reduce resource wastage.

  • When optimizing a pasteurization process, keep in mind that an overly high SF may unnecessarily deteriorate the product and/or degrade process efficiency, while an overly low SF may result in residual contamination of the product.

  • It is important to regularly check the prepasteurization bioburden to verify that the pasteurization step will have the intended effect.

  • It is crucial to monitor the cleanliness of the process environment and equipment postpasteurization to ensure the product does not become recontaminated between the pasteurization and filling steps.

Recovering from an operations shutdown is a unique experience. Can we limit the pain and find some gain?

 

Operations may be slow now, but they will ramp up again sooner or later.

Very unusual circumstances caused the shutdown, and the recovery will probably be unusual as well.

At Pinqkerton, our closest experience (from a previous life) to what we might be living soon is a process transfer from one location to another. Based on that experience, we believe that this recovery will require a good dose of planning, but that it is also an opportunity to learn and progress.

In the case of a process transfer, most of the start-up (i.e. the recovery) at the receiving end depends on the shut-down and transfer of knowledge.

The differences with today are that (i) this shutdown was potentially organized in a bit of a rush, and (ii) any process understanding that was not developed before the shutdown will not be available for the recovery.

We hope the experiences, information, and advice collected from various sources (mainly the food industry) and shared hereafter will provide you with at least one actionable idea to help you recover faster and better.

So we start with an evaluation of the process state after shutdown and quarantine: what we know about it, and how to build the knowledge that is missing.

 

What do companies that go through seasonal shut-downs pay attention to?

 

A shut-down has a number of consequences:

  • Water stops circulating: its quality drops.
    That can be a concern since water is used, among other things, for cleaning.
  • Wet areas: humidity, condensation, and residual water lead to microbial proliferation and the development of biofilms.
    In the food industry, 72 h is often used as an action limit to trigger a re-washing (with a maximum of 24 h as a target).
  • Dry areas: crusts form, and standard washes are not enough to remove them.
    Remaining crust will act as an anchor for new biofilm or other contamination.
  • Pests: the facility has known unusual calm for an unusually long time, and unusual visitors can be expected.
  • Dryness damages seals and other parts.
  • Humidity favors contamination of the air system, vents, etc.
  • Uncontrolled temperature leads to premature material aging, spoilage, etc.

Good shut-down practices in seasonal farming suggest measures that are generic and important for the restart. They include:
  • Empty/purge all lines and containers
  • Apply the most thorough cleaning procedures, e.g. an alkali wash followed by an acid rinse. If possible, allow to dry and close, or leave in a protective solution, e.g. peracetic acid
  • Close all networks (compressed air, water, CO2, etc.)
  • Disassemble all equipment that can be disassembled, prior to inspection and washing: unions, filter housings, heat exchangers, etc.
  • Scrub all accessible surfaces, including inside vats and silos, in safe conditions
  • Isolate and close packages of started materials (if possible, put under vacuum)
  • Close the drains to avoid the rise of contaminating microorganisms
  • Install grids to prevent the rise or entry of rodents
  • Take stock and prevent looting/burglaries
If these steps were not taken during shut-down, they could or should be taken during recovery.

 

What do we know that is important for recovery?

 

We know at least two things that are important: the process, and the product's susceptibility to contamination, the combination of which is reflected in historical production yields and/or spoiled batches.

Hygiene by Design dictates that the exterior and interior of all equipment and pipework must be self-draining or drainable. In the case of external surfaces, any liquid must be directed away from the main product area. The design rules include choice of material of construction and other sometimes industry-specific considerations.

Some design flaws can be detected by observation; for example, areas of water stagnation point to potential vulnerabilities to contamination.
Now is the opportunity to look, discover, and improve.

Then, we also have our production history as a source of information. The documentation and/or personal experience tell us which intermediate or finished product, and which particular tank, supplier, or process step, has been most problematic.
If those production pain points present a higher risk of contamination in normal conditions, they will probably hurt even more during recovery.

We can think of actions such as reinforced inspection, avoidance (recovering by manufacturing the "easier" products first, before the more complicated ones), or upgrading the process.

This last option, in our experience, should be considered with care because of two conflicting propositions:

  • First, if we have a known weak point in a chain and we know the times ahead will be challenging, then dealing with the weak point might well increase our chances of success. It is good preparation.
  • On the other hand, adding change to a crisis situation, when resources are spread thin and thinking clearly is not easy, could bring the entire organisation to a stop if anything unexpected happens. It is taking unnecessary risk.
    The decision might be taken considering the risk/benefit ratio, the return on investment, and our capacity to do it right the first time.

Evaluation table (importance: x1 = important, x3 = critical; appreciation: 0 = not satisfactory, 2 = very satisfactory; score = importance x appreciation):

  Criteria                                    Importance   Appreciation   Score
  Hygiene by Design applied to the process?        3             1          3
  Level of validation                              2             1          2
  History of problems                              2             2          4
  Cleaning during shut-down                        1             0          0
  Total (sum of scores / maximum score)                                9/16 = 56 %

Everything in this table is up for customisation, including the granularity of the criteria and evaluations. It can facilitate an assessment that can be explained, shared, and fed into a plan for recovery. A more comprehensive version can also be the starting point for a longer-term operational roadmap, with annual objectives to improve the score.
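As a minimal sketch of how such a customised evaluation can be scored (the `recovery_score` helper is ours, not part of any standard), the overall percentage is simply the weighted sum of appreciations divided by the best achievable weighted sum:

```python
def recovery_score(criteria):
    """Score a recovery-readiness evaluation.

    criteria -- list of (importance, appreciation) pairs, where importance
                runs from 1 (important) to 3 (critical) and appreciation
                from 0 (not satisfactory) to 2 (very satisfactory).
    Returns the weighted sum divided by the best achievable weighted sum.
    """
    total = sum(imp * app for imp, app in criteria)
    maximum = sum(imp * 2 for imp, _ in criteria)
    return total / maximum

# The four criteria from the table above:
# Hygiene by Design (3, 1), validation (2, 1), history (2, 2), cleaning (1, 0)
print(recovery_score([(3, 1), (2, 1), (2, 2), (1, 0)]))  # 9 / 16 = 0.5625
```

Adding or reweighting criteria only changes the input list, which makes the same calculation reusable as the roadmap evolves.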

 

How do we know we have recovered?

 

For those processes and procedures that are validated, the key performance indicators are known, acceptable limits are specified, etc.
Re-start could (relatively speaking) look like a walk in the park.

The question is harder to answer when we have an insufficient or partial data history. If we don't have measures or specifications to indicate what normal, satisfactory, or improved look like, we can remember Peter Drucker: "If you can't measure it, you can't improve it."

The missing historical data can be built on this occasion.
Before the actual restart, we can test critical control points, points of use, water at various purification and storage steps, etc., which will give reference points that are helpful for evaluating the efficiency of the cleaning and other recovery activities.
Chances are that this data will prove useful again and again for years to come.

A brief plug for our own products: nomad testers can really come in handy in these circumstances.

 

A few points of attention


  1. Recent public debriefs from companies that have managed to function through this confinement period, one in food and the other in PCP, both highlighted the importance of communication.

    Their point is that, for a group to function in unusual conditions, everyone needs more than ever to be and feel listened to, considered, and led rather than managed.
    In these experiences, the energy and creativity that were liberated really made the difference.

 

  2. Those large companies also had a wealth of experience and documentation, i.e. a large toolbox from which they could draw. Here are some tools we thought might be helpful:

    • Beware of washing over well-established biofilms or dry crusts: start washing with a cold phase to remove organic and especially protein residues, which coagulate on the walls when hot, are very difficult to remove, and then form anchor points for subsequent deposits and biofilms.
    • Watch out for air systems, air intakes, and the introduction of air into the process.
    • Monitor fines (dust of various origins, which are vectors of cross-contamination between zones) in the workshops. Where are these fines generated? Where do they circulate?
      Places where fines deposit risk harboring contaminations that might adapt and become resistant to the sanitation process, particularly in process dead-points.
    • Are there any plans to reprocess rework material? How was it stored? Where will it be reintroduced into the process?
    • Atmospheres: has the temperature been lowered? Was it only when everything was dry? Has there potentially been any condensation?

       

      3. A scrupulous inspection is necessary before restarting.

      In many cases, visual inspections are the main or only way to check that a particular criterion is met, even in normal activities.

      Training new personnel and standardizing the acceptable limits is not always simple for these visual controls. For that purpose, building a library of different defects, labelled "acceptable" or "not acceptable", can be very useful.

      There are rare occasions to enrich these libraries and a restart is probably one of them.

      We find that taking pictures with a smartphone is a smart way of doing that. It does not cost much and can prove very valuable.

       

      4. Gather data

      An unusually high level of testing might be desirable to satisfy yourself and potential auditors that the process is back under control.  

      In the case of sensitive materials and/or processes, swabs will be necessary, if only to measure TVC (Total Viable Count) and a few hygiene-process-specific indicators, at sensitive points identified during an inspection before recovery.

      The same goes for rinse water used to test equipment internal surfaces.

      Identifying the microorganisms found would allow the use of cleaning agents well targeted for their specific efficacy, at the effective dose, as well as some plant mapping.

      Mapping is helpful for determining the origin and movements of contaminants in a site. If we know what flora is typically in the process water, what typically comes from a raw material, and what comes from the skin, then when we find a contamination in a product, its origin is easier to investigate.

      Applied routinely, this approach can be a luxury for some activities, because (i) the testing costs are too high compared to the product value, and (ii) the results of swabs or CIP rinse-water tests (before, and possibly after, the second washing) would not be known until after manufacturing has resumed. The delayed results can force a difficult decision: what if a troubling result that is not part of the release criteria is found in retrospect, after the restart?

      Perhaps for that reason, some manufacturers consider that the first batch, however scrutinized, should be discarded no matter what.

      But such an approach might be acceptable in a recovery "study" mode during the upcoming start-up, to enhance our process understanding, which is a source of lasting future improvements and savings.

       

      Where are the opportunities in all that?

       

      As a supply chain management expert pointed out this week, we are living through an opportunity to improve in a number of ways:

      • Financially: because cash is priority number one right now, setting short-term profitability aside and looking at our activity focused only on current and future cash flows may reveal improvements: contracting the supply chain, resizing batches and equipment, reconsidering the value of a particular sales channel or customer type
      • Sales: unfortunately, some companies will probably not survive this death valley, which opens new markets for those who do
      • Organisation: observe the time between taking important decisions and their realisation.
        What is slowing the process: communications, planning, adapting operations, lack of automation?
      • Strategic: developing agility and resilience.
        What is our process for anticipation and decision-making? How does it work when our vision of the future is not clear?
        Post-crisis consumer habits, behaviour at work, and economic models might be different: place bets on a few scenarios, prepare a response, and pull the trigger when one of the bets becomes reality

       

      Conclusion

       

      Several executives from reputable companies converge on the opinion that recovery is an opportunity to become more agile and resilient.

      This is consistent with our experience in process transfers.


      It starts with observing our processes, organisation, and cash flows to identify vulnerabilities. It continues with anticipating and adapting while measuring progress.

      It may sound like the kind of conceptual thinking exclusive to large companies, but are these opportunities not real for all companies, whatever their size and business?

      As with exceptional holidays, these exceptional days call for observing, taking pictures, and comparing, so we can bring home new habits and ideas to feed on for a long time.

      Thank you for reading this. 
      If you would like more publications like this one, please "Like" or comment. 
      We'd love to hear from you, share and help!
      As a token of our appreciation, enjoy free shipping on our products, with the code ILIKEBLOG4

      contact@pinqkerton.com

       

      …and remember,

      Q: Why did the germ cross the microscope?

      A: To get to the other slide!

      4 checks before using a microbial autonomous field kit to your satisfaction

      In our last article, we wrote about the usefulness of field kits and their potential applications, inspired by ISO 17381 « Selection and application of ready-to-use test kit methods in water analysis ».

      There are 4 main chapters in this standard we should consider:

      1. Define what the kits will be used for: that is the scope of applications
      2. Check the kit works, which essentially means making sure a component of the sample does not interfere with the detection technique
      3. Document what we do
      4. Train those who will do it

      Since such kits are generally designed to be simple to use and are derived from lab techniques, the potential interferences (second chapter) are usually known from the literature, and not much work is required.

      1- Define what we want the kit to do in practical terms


      What is the range of detection?
      For example, a supplier recommends an analytical limit for the equipment we just bought: « we cannot guarantee the equipment will function properly if the microorganism concentration is above XX ». We want a range of detection centered on that XX limit, ± 1 log.
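As a quick illustration, the ± 1 log window around a supplier limit can be computed with a tiny helper. This is a sketch only; the limit value, units, and function name are our own assumptions, not from the Standard:

```python
def detection_window(limit, logs=1):
    """Bounds of a detection range centered on a supplier limit,
    +/- `logs` decades (hypothetical helper, units as supplied)."""
    return limit / 10**logs, limit * 10**logs

# If the supplier limit XX were 1000 CFU/mL (assumed figure):
low, high = detection_window(1000)  # 100.0 to 10000.0 CFU/mL
```

A kit whose validated range covers this window would then tick the first box.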

      Is the product you want to test physically and chemically compatible with the kit technique?
      For example, if the product contains suspended matter or is opaque, a kit based on optical detection may not be the simplest route to go down.

      Will it be easy to live with once used in routine?

      Suggested topics to consider are:

        • The rapidity of set-up, of handling, of results
        • There is mobile, and then there is mobile. Should it fit in a pocket or in a briefcase?
        • Cost of initial (equipment) acquisition, cost per test, cost of ownership (training, maintenance, …).
        • How much confidence in the results do we need to make the right decisions?
          For example, a semi-quantitative kit may have the advantage of cost or rapidity, but generate more false positives and negatives than a quantitative technique. That can be OK… or not
        • Frequency of use for routine testing or for troubleshooting. Does (frequency x cost per test) fit with our budget?
        • Is a correlation with a reference or lab results critical, important or nice to have?
          The question can be particularly important with microbiology tests, since different techniques can give very different results.
        • Availability and ease of acquisition: if we opt for using a field kit, we don’t want to risk holding up our operations over a matter of supply.
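The budget bullet above, (frequency x cost per test), lends itself to a back-of-envelope check. All the figures below are purely illustrative assumptions:

```python
# Does (frequency x cost per test) plus ownership fit the budget?
tests_per_month = 20        # planned routine testing frequency (assumed)
cost_per_test = 12.0        # consumables per test (assumed)
monthly_ownership = 150.0   # training, maintenance, amortized equipment (assumed)
monthly_budget = 500.0      # what we are willing to spend (assumed)

monthly_cost = tests_per_month * cost_per_test + monthly_ownership
print(f"monthly cost: {monthly_cost:.2f}, fits budget: {monthly_cost <= monthly_budget}")
```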

      You decide what you want!

      « If one does not know to which port one is sailing, no wind is favorable. » Seneca

      2- Will it work?


      The biggest watch-out is for « interferences »

      With microbiology kits based on a culture method, preservatives or biocides will interfere with the microorganisms' growth: that is, after all, exactly why they are there.

      Chlorine is added to tap water to prevent microorganisms from multiplying in the distribution piping, preservatives are added to cosmetics, biocides to water baths used to wash solid products…

      Since field kits are often derived from lab techniques, how to deal with these situations is often public knowledge. A lab-developed solution is probably applicable to the field kit: for example, neutralization of chlorine with thiosulfate, or sample dilution in other cases.

      A big watch-out, but not necessarily big work!

      3- Document

      The list of things we should document can be broken down:

          • The field of use: what do we want to measure, what range, what nature of products, how are interferences avoided, range of temperatures, pH, storage precautions, shelf life…
          • Usage: instructions on how to use the product, description of reagent and equipment, additional reagents & equipment.
          • Sampling: description of sample quantity, volume, handling.
          • Protocol: health and safety, step by step handling (pictogram), reaction time (& intervals), ascertainment of results, cleaning, and maintenance.
          • Results: methods for assessing the results, conversion tables, factors.
          • Disposal (waste).
          • Characteristic of the method, such as calibration, certificate of quality, controls (standards)
          • References to the procedure and additional information (possible applications).

      We can save time by adding this requirement to part 1, as a selection criterion for the kit. The last bullet is: « How much of the documentation can the kit supplier provide? »

      4- Train


      Our organization is responsible for ensuring the method is applied correctly, which implies the users are trained.
      Field tests are typically designed for use in unusual conditions: they are simple and robust.

      ISO points out that personnel must have undergone basic training (by the supplier or the company) and understand:

          • Test performance
          • Matrix influence
          • Limitations
          • Sampling
          • Dangers and how to avoid them
          • Disposal
          • Quality Assurance

      After which, the protocols should be kept handy and in the local language.


      The documentation developed in part 3 provides the fundamentals for the training. All that’s left to do is to have samples and time to practice until the results are satisfactory.

      ISO 17381 opens by highlighting that the use of autonomous field kits can be very convenient.
      In most cases, that will be because the kit supplier is prepared for our requests.


      Remember : Microbiology is like cooking (just don’t lick the spoon)

      6 uses of microbial autonomous field kits that can make your life simpler

      We will discuss here the use of microbial test kits on-site or off-site in a regulated environment, to justify company practices or to gain confidence that an official test will pass.

      This article, inspired by the reading of ISO 17381 (Water quality - Selection and application of ready-to-use test kit methods in water analysis), addresses the context and the applications covered in the Standard.

      In water, food, cosmetics monitoring, and more, appropriate standardized and sometimes mandatory procedures exist for practically every microbial parameter to be investigated. They constitute the reference methods and are largely based on culture methods.

      These test methods require a laboratory, equipment, and technical expertise that are not always available in all facilities. In such cases, companies often choose to subcontract the minimum number of mandatory tests to external labs, which have these resources and skills.

      The limitations of outsourcing all your microbiology tests


      Outsourcing all of your microbiology tests can have limitations such as:

        • Having to adjust personnel and equipment schedules to the sampling schedule.
          Don’t we prefer it the other way round?
        • The analysis reports comply with regulatory or best-practice standards, but are not readily comprehensible to the production manager.
          It can be a bit like receiving the results of your blood analysis and trying to work out whether you are sick or not.
        • Finally, an individually accurate report has value, but the real prize is having enough results to constitute a baseline, against which to set alerts when the trend is going in a worrying direction.
          A bit like a speed camera: perfectly reliable, but too rare to help control our driving speed at all times.
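The baseline idea in the last bullet can be sketched in a few lines. The counts and the 3-sigma alert rule below are illustrative assumptions, not a prescription:

```python
import statistics

# Historical TVC results (assumed figures, CFU/mL) forming the baseline
baseline = [120, 90, 150, 110, 130, 100, 140, 95]

mean = statistics.mean(baseline)
sd = statistics.stdev(baseline)
alert_level = mean + 3 * sd   # one common rule of thumb; choose your own

new_result = 260  # today's count (assumed)
if new_result > alert_level:
    print(f"ALERT: {new_result} exceeds alert level {alert_level:.0f}")
else:
    print(f"OK: {new_result} is within the baseline (alert at {alert_level:.0f})")
```

The point is not the arithmetic, but that enough in-house results make such a trend check possible at all.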

      So calling upon other, more user-friendly test methods can help day-to-day operations and complement internal or external lab tests.

      Using Ready-to-use methods as in ISO 17381


      ISO 17381 « Selection and application of ready-to-use test kit methods in water analysis » provides guidance on how to properly use such kits.

      The so-called "ready-to-use methods" are of increasing interest because, compared to standard methods, they allow fast and often inexpensive results for analytical problems. Under certain conditions these methods can be applied in routine control of water quality, provided they give reliable results.

      The methods are not intended as a substitute for other standards, which remain the reference methods for use in a laboratory.

      The choice of the most suitable method depends upon the type of analysis required and the necessary quality of the results.

      Ready-To-Use methods are frequently based on standard methods that have been miniaturized to allow their direct application.

      This suggests that in many cases the data, references, guidance, and expertise required to start using an AFK (autonomous field kit) are readily available from multiple sources, and that with limited effort the AFK results will probably be consistent with those from reference methods, since they are based on similar principles.

      We believe the recommendations put forth in this standard can be extended to applications beyond water quality testing and have inspired this article.

      The different uses of Autonomous Field Kits that can ease your microbiological tests


          • 1. Screening: preselection of samples for further analysis by a laboratory, thanks to the kits' lower overall cost and ease of deployment.
            Example: testing batches of intermediate product. When a change in counts or flora is observed, send the sample for identification of the suspects.

          • 2. Screening: selection of the most suitable analytical method, thanks to the kits' lower overall cost and ease of deployment.
            Example: comparing CIP protocols and CIP rinse-water collection methods prior to testing.

          • 3. Rapid detection after a potential incident, thanks to the rapid availability of the test and consequently of the results.
            Example: evaluating the impact on a work environment after an accidental spill of waste product or the introduction of a potentially contaminated product / equipment into a clean area.

          • 4. Limiting the amount of damage after an incident, thanks to the rapid availability of the tests and consequently of the results.
            Example: after an incident, monitoring the efficiency of the curative actions, e.g. cleaning and disinfection. As in the previous application, the time-to-result is shortened because the test kit is readily usable and the sample transport time is eliminated.

          • 5. Control measurements for monitoring/preparing compliance with the permissible concentration range for given microbial parameters.
            Example : commissioning a piece of new equipment or preparing the validation of a process. Kits can constitute a convenient way for engineers to adjust equipment settings.

          • 6. Monitoring and controlling processes, facilities, production plants, water treatment and disinfection systems
            Example :  routine environmental or hygiene monitoring, equipment and process maintenance programs, quality plans, …

      So the list of potential applications in this Standard covers the majority of a site's self-monitoring testing activities, with the exception of the mandatory compliance tests.

      Considering the autonomy and savings (time and money) these kits can offer, they can constitute an interesting adjunct to lab test methods… provided the kits do the job you expect of them.

      A glimpse of the next article: Implementing autonomous testing methods following ISO 17381


      The second part of the ISO 17381 deals with how one should go about verifying a kit is suitable for a given application, which we will digest in an article to come.

      What are the highlights?

      1. Prove the test suitability for your applications.
      2. Meet requirements concerning, for example, safety, handling, and documentation, because these tests are often used by non-specialists in microbiology
      3. Meet requirements concerning user training and supervision

      Remember : « microbiology is the only science in which multiplication is the same thing as division »