13 April 2026
Reading time: 14 minutes
Technology and Applied Research
PCR Diagnostics: Technological Challenges and Solutions
Signal drift, reagent stability, quality control, cloud computing and AI: the challenges that still limit qPCR, and the solutions through which Helyx Industries S.p.A. is making molecular diagnostics more robust, scalable and closer to the point of care.
Abstract
Real-time PCR remains the gold standard in molecular diagnostics in terms of sensitivity, speed and specificity, but it is not a technology that has been ‘sorted out’ once and for all. Signal drift, inter-run variability, quality management in decentralised networks, reagent stability and throughput limitations continue to hinder its operational adoption.[1][2] This article analyses the main technological challenges of contemporary qPCR and the solutions emerging across three areas: hardware-software integration, more stable reagents and cloud data layers with supporting algorithms.[3][4][5] Within the new structure of Helyx Industries S.p.A., the scope of this analysis falls primarily within the Hyris Division, dedicated to distributed qPCR and the Hyris System™ platform (bCUBE™, bAPP™, reagents); the clinical/IVD dimension belongs to Vytro, whilst Mytho, as an NGS division, remains outside the scope.
- Snapshot
- Introduction
- 1. Key technical challenges in real-time PCR
- 2. How is data quality safeguarded when PCR becomes distributed?
- 3. Why reagent stability remains a key issue
- 4. Cloud and AI: where the real benefits and risks lie
- 5. Integrated platform or heterogeneous stack
- Interview // Lorenzo Colombo - CTO
- Conclusions
Snapshot
qPCR (real-time PCR)
A nucleic acid amplification technique that monitors the fluorescent signal produced during the reaction in real time. It is one of the gold standards of modern molecular diagnostics in terms of speed and sensitivity.[1]
Signal drift
An abnormal increase or variation in fluorescence that does not reflect a true exponential amplification of the target. This may be due to inhibitors, primer-probe mismatches, reagent degradation or issues with the normalisation of the reference signal.[1]
Throughput
The ability of a system to process multiple samples within the same time frame. In qPCR, this depends on the number of reactions per run, the duration of the protocol, the degree of multiplexing and the potential for parallelisation.[2][6]
Environmentally stable reagents
Master mixes, primers, probes or controls formulated to maintain acceptable performance even outside the cold chain, often through freeze-drying or controlled drying.[3][4]
Real-time quality control
A set of internal controls, software rules and instrumental monitoring procedures that enable the immediate identification of questionable tests, analytical failures or anomalous patterns before the result is released.[5][10]
Integrated platform
An ecosystem in which instruments, reagents and software are designed to work together. It reduces compatibility issues, simplifies the standardisation of workflows and makes it easier to manage data quality across multiple sites.[5][7]
Introduction
Real-time PCR has profoundly transformed molecular diagnostics, but its technological maturity does not mean it is without limitations. In laboratories, and even more so in decentralised settings, the critical issues are not solely biological: they concern signal behaviour, reagent stability, reproducibility across instruments, and the ability to manage data consistently when the test is carried out outside the central laboratory.[1][3][5] In other words, the question is no longer simply ‘is qPCR sensitive?’, but ‘how reliable does it remain when it needs to operate in a repeatable, rapid and scalable manner outside a core lab?’ This point has become particularly evident with the growth of point-of-care and near-patient models. Portable platforms have reduced the distance between testing and clinical decision-making, but they have also raised the bar for standardisation: if the result is produced in different locations, by different operators and under different logistical conditions, the system must be capable of safeguarding the data far more effectively than traditional benchtop PCR did.[5][6]
This is where the value of an integrated approach lies. Within the Helyx ecosystem, the response to these challenges takes shape primarily in the Hyris System™, an architecture that combines a portable device, software and reagents into a single operational framework.[7][8][9]
In this context, the aim is not simply to miniaturise PCR, but to make it more robust in areas where it has historically faltered: signal variability, reliance on the cold chain, heterogeneity of workflows, and inconsistent interpretation of results.
1. Key technical challenges in real-time PCR
The first critical issue is signal behaviour. Gunson and colleagues had already described, in the routine practice of diagnostic virology, how real-time PCR could produce delayed fluorescence peaks or patterns that were difficult to interpret, particularly under sub-optimal conditions, with difficult samples, or due to problems relating to primers, probes and normalisation.[1] The issue is not merely aesthetic: when the curve is ambiguous, determining the Ct becomes more complicated and the risk of borderline results, repeat testing or inconsistent interpretations between operators increases.
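To see why an ambiguous curve complicates the call, consider a minimal threshold-crossing sketch in Python. This is purely illustrative: the baseline correction, drift term and threshold value are invented for the example and do not reflect any specific instrument’s algorithm.

```python
import numpy as np

def estimate_ct(fluorescence, threshold, baseline_cycles=10):
    """Estimate Ct as the interpolated cycle at which baseline-corrected
    fluorescence first crosses the threshold; returns None if it never does."""
    f = np.array(fluorescence, dtype=float)
    f -= f[:baseline_cycles].mean()              # subtract the early-cycle baseline
    above = np.nonzero(f >= threshold)[0]
    if above.size == 0 or above[0] == 0:
        return None                              # no usable crossing
    i = above[0]
    # linear interpolation between the last sub-threshold and first supra-threshold cycle
    return i + (threshold - f[i - 1]) / (f[i] - f[i - 1])

cycles = np.arange(1, 41)
drifting = 0.02 * cycles + 1.0 / (1.0 + np.exp(-(cycles - 28.0)))  # drift + true sigmoid
print(estimate_ct(drifting, threshold=0.2))      # ~15.5: the drift, not the amplification
```

In this toy example the drift alone crosses the threshold around cycle 15, well before the true amplification at cycle 28: precisely the kind of borderline call that triggers repeats or inter-operator disagreement.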
The second key issue concerns multiplexing. In theory, combining multiple targets into a single reaction increases efficiency and throughput; in practice, however, it leads to primer competition, limitations on the number of fluorescent channels, and greater complexity in curve analysis. The review by Kreitmann et al. shows that many traditional qPCR solutions are in fact limited to three or four targets per reaction, unless sophisticated instrumentation or data-driven strategies capable of extracting more information from amplification and melting curves are employed.[2] This is where the software comes into play: it is not enough to ‘see more colours’; one must be able to interpret the signal more effectively.
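One concrete example of ‘interpreting the signal more effectively’ is melting-curve analysis: the negative derivative of fluorescence with respect to temperature resolves amplicons that share a single dye channel. The sketch below is illustrative only, with invented melting temperatures and smoothing parameters.

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def melt_peaks(temps, fluorescence, min_prominence=0.01):
    """Locate melting peaks as maxima of -dF/dT on a smoothed melt curve.
    Each peak temperature can discriminate amplicons sharing one dye channel."""
    f = savgol_filter(fluorescence, window_length=11, polyorder=3)
    neg_dfdt = -np.gradient(f, temps)
    peaks, _ = find_peaks(neg_dfdt, prominence=min_prominence)
    return temps[peaks]

# Two amplicons detected in the same channel, separable by melting temperature:
t = np.linspace(65, 95, 301)
melt = 1 / (1 + np.exp((t - 78) / 0.8)) + 1 / (1 + np.exp((t - 86) / 0.8))
print(melt_peaks(t, melt))   # ~[78. 86.] °C: two targets, one colour
```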
The third critical issue is throughput. Even when the chemistry is sound, the bottleneck can be architectural in nature: the number of wells, run times, the sequence of operations, and the need to repeat inconclusive or invalid tests.[1][6] In centralised models, this limitation is managed through large batches and automation; in distributed models, however, it becomes a matter of intelligent parallelisation and coordination between different nodes.[5][6]
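How architectural the bottleneck is becomes clear with back-of-envelope arithmetic. In the hypothetical comparison below, with all numbers invented for illustration, four small distributed nodes match a single large benchtop run simply by parallelising, once capacity reserved for repeats is accounted for.

```python
def daily_throughput(wells_per_run, run_minutes, shift_hours=8.0,
                     retest_fraction=0.05, instruments=1):
    """Back-of-envelope qPCR throughput: results released per day, after
    reserving capacity for repeats of inconclusive or invalid tests."""
    runs_per_day = (instruments * shift_hours * 60) // run_minutes
    raw_results = runs_per_day * wells_per_run
    return raw_results / (1 + retest_fraction)   # repeats consume a share of capacity

# One 96-well benchtop cycler vs. four 16-well distributed nodes (illustrative numbers):
print(daily_throughput(96, run_minutes=90))                  # ~457 results/day
print(daily_throughput(16, run_minutes=60, instruments=4))   # ~488 results/day
```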
One final aspect, less obvious but crucial, is the variability of the context. Zidovec Lepej and Poljak, in their review of portable molecular tools in microbiology, point out that performance depends not only on the chemistry, but also on environmental requirements, operating conditions and the extent to which the system has been validated outside the reference laboratory.[6] In other words: the challenge of modern qPCR is not only to be accurate, but to remain accurate when the context becomes more variable.
2. How is data quality safeguarded when PCR becomes distributed?
When a diagnostic network becomes decentralised, data quality can no longer be left solely to the discretion of individual operators. It must be ‘built into’ the system: in internal controls, software rules, centralised curve management and the traceability of procedures.[5][10]
The most practical lesson comes from the Belgian SARS-CoV-2 testing programme described by Van Vooren et al. In the midst of a rapid expansion of national testing capacity, the authors had to integrate a variety of instruments, reagents and protocols, whilst linking them to a single clinical-IT infrastructure for automated analysis, continuous QC monitoring and harmonised reporting across sites.[5] This is a very useful case study because it demonstrates that decentralisation does not necessarily require absolute hardware uniformity, but undoubtedly requires logical uniformity in data control, analysis and governance.
The same principle is also evident in more recent point-of-care software platforms. In the study by Su et al., a mobile detection-service system was designed to monitor distributed devices, synchronise data with servers, classify abnormal curves using machine learning, and display the results on remote terminals.[10] The value lies not only in access to results, but in the fact that the software becomes an active part of the analytical process: it observes the curve, assesses it, contextualises it and, if necessary, flags anomalies before the data is treated as definitive. The logic of the Hyris System™ fits within this framework. In the work on bCUBE™ and the Hyris kits, the platform is described as an integrated set comprising a portable instrument, the bAPP™ application and a dedicated workflow, designed to produce reliable results even outside a fully equipped laboratory.[7][8][9]
This does not eliminate the need for validation or governance, but it does reduce some of the friction typically associated with heterogeneous stacks: fewer manual steps, less need to integrate components developed separately, and greater continuity between signal generation and its interpretation.
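What ‘building quality into the system’ means in software terms can be sketched as a rule set applied identically to every run, whichever site produced it. The control names and Ct windows below are hypothetical, not those of any cited platform.

```python
from dataclasses import dataclass

@dataclass
class RunQC:
    positive_control_ct: float | None   # None = no amplification observed
    negative_control_ct: float | None
    internal_control_ct: float | None

def qc_flags(run: RunQC, pc_window=(20.0, 30.0), ic_max=35.0):
    """Apply the same acceptance rules to every run; an empty list means 'release'."""
    flags = []
    if run.positive_control_ct is None or not (pc_window[0] <= run.positive_control_ct <= pc_window[1]):
        flags.append("positive control outside expected Ct window")
    if run.negative_control_ct is not None:
        flags.append("amplification in negative control: possible contamination")
    if run.internal_control_ct is None or run.internal_control_ct > ic_max:
        flags.append("internal control late or absent: possible inhibition")
    return flags

print(qc_flags(RunQC(positive_control_ct=26.1, negative_control_ct=None,
                     internal_control_ct=38.2)))   # flags possible inhibition
```

The point is not the specific rules but where they live: encoded once, versioned centrally and applied before any result is released, rather than re-implemented at each site.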
3. Why reagent stability remains a key issue
The robustness of a PCR reaction does not depend solely on the thermal cycler. It also depends to a decisive extent on the stability of the reagents. Enzymes, primers, probes and controls are sensitive to freeze-thaw cycles, time spent outside the refrigerator and transport conditions. Lopez et al. have shown that the stability of certain qPCR components may be good under controlled conditions, but deteriorates progressively when repeated freeze-thaw cycles and sub-optimal handling of mixes and stocks are combined.[3] This is particularly important when testing moves from a centralised laboratory to peripheral settings, where the cold chain may be more fragile.
This is where stabilised and freeze-dried reagents really change the game. The study by Qu et al., although not the most recent, remains a clear benchmark: vacuum-dried reagents for real-time PCR were found to remain stable for at least 49 days at 37 °C, whilst maintaining accuracy and reproducibility.[4] This finding is significant not only for ‘extreme’ applications, but for the entire concept of field deployment: less reliance on freezers and dry ice means lower logistical risk, less waste and greater predictability in real-world use.
For Helyx, this issue lies at the intersection of platform and application. Within the Hyris scope, the stability of the reagents enhances the portability of the solution; when the same logic translates into kits and workflows for laboratories and hospitals, the industrial relevance moves closer to the Vytro scope. This is an important point because it demonstrates how, within the group’s new structure, chemistry is not an accessory to the device but an integral part of the reliability strategy.
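A rough way to give the stability argument a quantitative shape is to model activity loss as first-order decay over storage time plus a fixed fractional hit per freeze-thaw cycle. This is a toy model with invented parameters, not data from the cited studies.

```python
import math

def remaining_activity(days, half_life_days, freeze_thaw_cycles=0, loss_per_cycle=0.03):
    """Toy model: exponential loss over storage time, plus a fixed fractional
    loss per freeze-thaw cycle (all parameter values are illustrative)."""
    time_decay = math.exp(-math.log(2) * days / half_life_days)
    handling = (1 - loss_per_cycle) ** freeze_thaw_cycles
    return time_decay * handling

# A frozen mix handled through 10 freeze-thaw cycles vs. a dried mix with none:
print(remaining_activity(49, half_life_days=365, freeze_thaw_cycles=10))  # ~0.67
print(remaining_activity(49, half_life_days=600))                         # ~0.95
```

Even in this crude form, the handling term dominates: removing freeze-thaw events does more for end-of-life activity than modest gains in intrinsic half-life, which is the practical case for lyophilised formats.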
4. Cloud and AI: where the real benefits and risks lie
AI applied to PCR is often discussed in general terms. In reality, the practical benefit is not ‘performing PCR using artificial intelligence’, but rather using data-driven models to better interpret the results of PCR: amplification curves, melting curves, borderline signals, anomalous patterns, or multiplex combinations that are difficult to classify using rigid rules.[2][10] Kreitmann et al. demonstrate how machine learning and digital analysis can extract useful information from amplification and melting curves to enhance classification capabilities in multiplex qPCR, thereby partially overcoming the limitations of fluorescent channels alone.[2] Su et al. take a complementary approach: they use ML to classify anomalous curves and calculate the Ct of the positive curve within a distributed software platform.[10]
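A minimal version of this curve-level reasoning: fit a four-parameter logistic to the raw curve and classify from the fitted shape, with amplitude, steepness and residuals deciding together, rather than from a bare threshold. This is an illustrative sketch, not the method of either paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, base, amp, x0, k):
    return base + amp / (1 + np.exp(-k * (x - x0)))

def classify_curve(cycles, fluorescence, min_amp=0.3, min_k=0.3):
    """Classify a qPCR curve from a fitted 4-parameter logistic."""
    try:
        p, _ = curve_fit(sigmoid, cycles, fluorescence,
                         p0=[fluorescence.min(), np.ptp(fluorescence),
                             cycles.mean(), 0.5], maxfev=5000)
    except RuntimeError:
        return "anomalous: no sigmoidal fit"
    base, amp, x0, k = p
    rmse = np.sqrt(np.mean((sigmoid(cycles, *p) - fluorescence) ** 2))
    if rmse > 0.05 * max(amp, 1e-9):
        return "anomalous: poor fit, review curve"
    if amp >= min_amp and k >= min_k and cycles.min() < x0 < cycles.max():
        return f"positive, inflection at cycle {x0:.1f} (Ct-like estimate)"
    return "negative"

c = np.arange(1.0, 41.0)
print(classify_curve(c, 0.05 + 1 / (1 + np.exp(-0.8 * (c - 27)))))  # positive, ~27
```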
In both cases, the message is the same: qPCR data is not merely a threshold, but a signal structure that can be better interpreted using more advanced computational tools.
The operational advantage of the cloud, however, lies primarily in standardisation. If multiple instruments generate data at different sites, a centralised platform can apply the same validation rules, the same reporting criteria and the same consistency checks. The Belgian case demonstrates that this architecture makes it possible to monitor the QC of a complex network in real time and harmonise results generated in very different environments.[5] The risk, however, remains: the more data passes through complex software layers, the greater the need for clear audit trails, transparent rules of interpretation and well-defined boundaries between automation and validation.
In other words, the cloud does not automatically solve quality issues: it makes them more manageable, but only if the operating model is well designed.
5. Integrated platform or heterogeneous stack
In theory, a heterogeneous stack offers freedom: you can choose the best reagent, the best software, and the best instrument. In practice, however, this freedom often translates into complexity. Every interface between different components is a potential source of friction: protocols to harmonise, data formats to convert, parameters to realign, and responsibilities to assign when something goes wrong.[5]
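The data-format point alone is worth making concrete. In the hypothetical sketch below, with both export formats invented, two sites report the same kind of result in different shapes, and a thin adapter layer must map them onto one canonical record before central QC can compare anything.

```python
import json

def harmonise(record: dict, site_format: str) -> dict:
    """Map site-specific export formats onto a single canonical schema."""
    if site_format == "site_a":    # e.g. {"sample": ..., "tgt": ..., "ct": ...}
        return {"sample_id": record["sample"], "target": record["tgt"],
                "ct": record["ct"], "site": "A"}
    if site_format == "site_b":    # e.g. {"id": ..., "results": {target: ct}}
        target, ct = next(iter(record["results"].items()))  # single target, for brevity
        return {"sample_id": record["id"], "target": target, "ct": ct, "site": "B"}
    raise ValueError(f"unknown export format: {site_format}")

print(json.dumps(harmonise({"sample": "S1", "tgt": "N1", "ct": 24.8}, "site_a")))
print(json.dumps(harmonise({"id": "S2", "results": {"N1": 31.2}}, "site_b")))
```

Every such adapter is code that must be written, validated and maintained; in an integrated platform that cost is paid once, at design time.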
This is why integrated platforms have gained ground. In the case of Hyris System™, the peer-reviewed literature published on bCUBE™ and the Hyris kits emphasises precisely this architectural consistency: hardware, workflow and software have been designed to work as a system, not as a collection of separate components.[7][8][9] This does not mean that integration solves every problem, but it does mean that many problems are addressed at the outset, during the design phase, rather than being left to the end user.
In distributed environments, this advantage is amplified. When testing is carried out at a remote location, by non-specialist staff or under very tight deadlines, the system must minimise unnecessary degrees of freedom. An integrated architecture does not eliminate the need for quality; it makes it more achievable.

Lorenzo Colombo
Interview
Q: If you had to choose one technical issue in qPCR that is still underestimated, which would you point to?
Lorenzo Colombo: The first thing I’d say is this: PCR is a mature technology, but that doesn’t make it a ‘simple’ one. When you take it out of the ideal laboratory setting, you realise that the details still matter immensely. Curve quality, sample behaviour, consistency between runs, reagent management: these are all factors that determine whether a system holds up or not.
Q: Signal drift seems like a very technical issue. Why should it also be of interest to those looking at the system as a whole?
Lorenzo Colombo: Because it’s not just a matter of interpreting the curve. It’s a symptom. It tells you that molecular diagnostics is a system, not a single component. If the signal behaves erratically, the problem may lie in the chemistry, the optics, the sample, the interpretation algorithm, or the interaction between these elements. For me, drift is almost a methodological reminder: it reminds you that you cannot design instruments, reagents and software as if they were three separate worlds.
Q: How much does it really change the game when reagents become more stable at room temperature?
Lorenzo Colombo: It changes much more than you might think. It’s not just a matter of shipping without dry ice. It means making the system less fragile in the real world: fewer critical steps, less waste, less reliance on infrastructure that, in many contexts, cannot be taken for granted. And when you remove fragility from logistics, you also remove fragility from adoption.
Q: When we talk about the cloud, the discussion often remains abstract. What, in practical terms, is the benefit?
Lorenzo Colombo: The real benefit is consistency. If you have multiple tools, multiple sites and multiple operators, the cloud isn’t just about ‘viewing results remotely’. It ensures that those results are generated, interpreted and stored within a common framework. That is where comparability is built. Without this level of integration, you risk having lots of tests but no real system.
Q: Why do you place such emphasis on an integrated approach rather than a more open, modular stack?
Lorenzo Colombo: Because, in theory, maximum freedom is appealing, but in practice, absolute freedom often shifts complexity onto the customer. And in diagnostics, complexity is not neutral: it translates into time, risk and variability. An integrated system does not have to be rigid; it must be consistent. That is the key difference.
Q: What do you think is the next credible evolutionary leap for qPCR?
Lorenzo Colombo: I don’t believe in a single breakthrough. I see a convergence of improvements: more robust reagents, smarter software, better-controlled data quality, and more reliable miniaturisation. The aim is not to make PCR more spectacular. It is to make it more predictable, more scalable and easier to take to places where it is currently still too complex to manage.
Conclusions
qPCR remains a key technology, but its evolution is not solely about increasing sensitivity. It hinges on the ability to better control the factors that make it truly scalable: signal quality, reagent stability, data standardisation, portability, and integration between hardware, software and chemistry.[1][2][3][5]
The available evidence points to a clear trajectory. On the one hand, research is increasingly focusing on curve interpretation and multiplexing management.[2][10] On the other hand, portable platforms and distributed workflows are demonstrating that it is possible to shorten the time between testing and decision-making without compromising robustness, provided that the system is designed as an architecture rather than simply as a device.[5][6][7][8][9]
In the case of Helyx Industries S.p.A., this trajectory is reflected above all in the work of the Hyris Division on distributed qPCR and the Hyris System™. The strategic lesson, however, is broader: in next-generation molecular diagnostics, competitive advantage does not stem from a single outstanding component, but from the ability to integrate platform, data and process.
Sources and Bibliography
[1] Gunson RN, Collins TC, Carman WF. Practical experience of high throughput real time PCR in the routine diagnostic virology setting. J Clin Virol. 2006;35(4):355-367. DOI: 10.1016/j.jcv.2005.12.006. Link: https://pubmed.ncbi.nlm.nih.gov/16460999/
[2] Kreitmann L, Miglietta L, Xu K, et al. Next-generation molecular diagnostics: Leveraging digital technologies to enhance multiplexing in real-time PCR. Trends Anal Chem. 2023;160:116963. DOI: 10.1016/j.trac.2023.116963. Link: https://pubmed.ncbi.nlm.nih.gov/36968318/
[3] Lopez MLD, Jennings WC, Henson MWR, et al. Effects of storage conditions on the stability of qPCR reagents: implications for environmental DNA detection. BMC Res Notes. 2024;17:199. DOI: 10.1186/s13104-024-06850-4. Link: https://pubmed.ncbi.nlm.nih.gov/39026307/
[4] Qu S, Shi Q, Zhou Y, et al. Ambient stable quantitative PCR reagents for the detection of Yersinia pestis. PLoS Negl Trop Dis. 2010;4(3):e629. DOI: 10.1371/journal.pntd.0000629. Link: https://pubmed.ncbi.nlm.nih.gov/20231881/
[5] Van Vooren S, Tirez K, Ceyssens PJ, et al. Reliable and Scalable SARS-CoV-2 qPCR Testing at a High Sample Throughput: Lessons Learned from the Belgian Initiative. Life (Basel). 2022;12(2):159. DOI: 10.3390/life12020159. Link: https://pubmed.ncbi.nlm.nih.gov/35207446/
[6] Zidovec Lepej S, Poljak M. Portable molecular diagnostic instruments in microbiology: current status. Clin Microbiol Infect. 2020;26(4):411-420. DOI: 10.1016/j.cmi.2019.09.017. Link: https://pubmed.ncbi.nlm.nih.gov/31574340/
[7] Martinelli F, Perrone A, Della Noce I, et al. Application of a portable instrument for rapid and reliable detection of SARS-CoV-2 infection in any environment. Immunol Rev. 2020;295 Suppl 1:4-10. DOI: 10.1111/imr.12857. Link: https://pubmed.ncbi.nlm.nih.gov/32329102/
[8] Miscio L, Olivieri A, Labonia F, et al. Evaluation of the diagnostic accuracy of a new point-of-care rapid test for SARS-CoV-2 virus detection. J Transl Med. 2020;18(1):488. DOI: 10.1186/s12967-020-02651-y. Link: https://pubmed.ncbi.nlm.nih.gov/33349261/
[9] Padoan A, Cosma C, Aita A, et al. Hyris bCUBE SARS-CoV-2 rapid molecular saliva testing: a POCT innovation on its way. Clin Chem Lab Med. 2022;60(5):766-770. DOI: 10.1515/cclm-2022-0008. Link: https://pubmed.ncbi.nlm.nih.gov/35041302/
[10] Su X, Fang Y, Liu H, et al. A Detection-Service-Mobile Three-Terminal Software Platform for Point-of-Care Infectious Disease Detection System. Biosensors (Basel). 2022;12(9):684. DOI: 10.3390/bios12090684. Link: https://pubmed.ncbi.nlm.nih.gov/36140069/